Autonomous Vehicle Guidance for Citrus Grove Navigation

Permanent Link: http://ufdc.ufl.edu/UFE0022364/00001

Material Information

Title: Autonomous Vehicle Guidance for Citrus Grove Navigation
Physical Description: 1 online resource (314 p.)
Language: english
Creator: Subramanian, Vijay
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2008

Subjects

Subjects / Keywords: autonomous, control, guidance, image, laser, robot, unmanned, vehicle, vision
Agricultural and Biological Engineering -- Dissertations, Academic -- UF
Genre: Agricultural and Biological Engineering thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: Automation and robotics are becoming an important part of agricultural operations in the 21st century, and the use of robotics in citrus groves is currently being explored. An important part of robotics in citrus groves is the use of autonomous vehicles. This dissertation presents the development of guidance systems for an autonomous vehicle operating in citrus groves. The guidance systems were developed on two different vehicles: a commercial tractor and a golf-cart-type vehicle, the eGator. Initial development was based on the tractor; later development was based on the eGator. The development process consisted of choosing sensors for the guidance system; mounting the sensors and developing the hardware and software interfaces between the sensors and the computer; developing algorithms to process the sensor information and determine the vehicle's path; developing a steering control system to steer the vehicle along the desired path; developing the software that interfaces the sensors and implements the algorithms; and conducting experiments to validate the operation of the autonomous vehicle in citrus groves. The primary sensors used to determine the vehicle's path were a video camera and laser radar (ladar). The important methods discussed in this dissertation are the determination of the tractor's vehicle dynamics; a PID-based steering control system for the tractor; machine vision and laser radar based path segmentation algorithms for navigating the vehicle through grove alleyways; sensor fusion based path determination, fusing information from machine vision and laser radar; the Pure Pursuit method of steering the eGator; algorithms for turning the vehicle at the headlands of the citrus grove using a sweeping ladar; GPS-based navigation of the open fields between grove blocks; and laser radar based obstacle detection. The ability of the autonomous vehicle to navigate these paths was experimentally verified on custom-designed test tracks and in citrus groves. The development of the guidance system showed machine vision and ladar based systems to be promising and accurate methods for navigating citrus groves. The autonomous vehicle is able to navigate citrus grove alleyways along the middle of the path, turn around at the headlands and navigate subsequent alleyways, navigate the open fields between grove blocks, and detect and stop in the presence of obstacles.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Vijay Subramanian.
Thesis: Thesis (Ph.D.)--University of Florida, 2008.
Local: Adviser: Burks, Thomas F.
Local: Co-adviser: Lee, Won Suk.
Electronic Access: RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2010-08-31

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2008
System ID: UFE0022364:00001
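
The abstract mentions the Pure Pursuit method used to steer the eGator. As a minimal sketch of the Pure Pursuit steering geometry, assuming a kinematic bicycle model and a waypoint-list path (the function and parameter names here are illustrative, not the dissertation's implementation):

    import math

    def pure_pursuit_steering(path, pose, lookahead, wheelbase):
        # Illustrative sketch only; not the dissertation's implementation.
        # path: list of (x, y) waypoints in the world frame (assumed non-empty);
        # pose: (x, y, heading) in meters and radians; lookahead, wheelbase in meters.
        x, y, heading = pose
        # Pick the first waypoint at least one lookahead distance away,
        # falling back to the final waypoint near the end of the path.
        goal = next(((px, py) for px, py in path
                     if math.hypot(px - x, py - y) >= lookahead), path[-1])
        dx, dy = goal[0] - x, goal[1] - y
        # Lateral offset of the goal point in the vehicle frame.
        lateral = -math.sin(heading) * dx + math.cos(heading) * dy
        dist = math.hypot(dx, dy)
        # Pure Pursuit curvature 2*lateral/dist^2, converted to a
        # front-wheel steering angle for the bicycle model.
        curvature = 2.0 * lateral / (dist * dist) if dist > 0 else 0.0
        return math.atan(wheelbase * curvature)

Each control cycle would call something like this with the latest pose estimate (for example, from the fused vision/ladar path) and send the returned angle to the steering actuator.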

65f453bbe3c0694ce79ec02d760bb1e5
4d6b06fc205effb1b77e34a1ba9b0bac6340848a
21050 F20101221_AAAPNW subramanian_v_Page_142.pro
47303094745662fee504cd8ab5f90250
bc3f7cfba7cb64b71f8fc65abc8e87cbe22eb4f8
98160 F20101221_AAAQKP subramanian_v_Page_238.jpg
e017048cefd32a3e70b0ccdf3647c55d
3be2af362c539b496535521c1c32451bd7426d39
64648 F20101221_AAARHI subramanian_v_Page_311.pro
c7e2f8ece1ae57f784760d019e0db3df
72de0a01565bb0a9c49f7c6df039bc7dc8af4b01
87592 F20101221_AAAPAL subramanian_v_Page_230.jpg
b79a1a5e7321b08abbaf6f7dbf400397
90db9a03045559402f0890f4de56d1a3b3e4c3fe
25278 F20101221_AAARUU subramanian_v_Page_088.QC.jpg
119df06c8200815470721e2f7749165c
3d50c331264c5ed1d2111b1a8f0526f705eed802
1505 F20101221_AAAPNX subramanian_v_Page_040.txt
527eb70d79d2e7318beb0297dabced6e
4162905778914e30bc1ae29a7b81a67b58abdef1
97822 F20101221_AAAQKQ subramanian_v_Page_240.jpg
1a68e667d5b7666bb78262bc3e1c8d8b
39f7b8c21325cf7c0e7c44b8f306509bdf93c5e3
57732 F20101221_AAARHJ subramanian_v_Page_313.pro
97d3d441232b4d2148680c7b8a5dd717
983f215562a14c8fca45d2427241339e3e67386d
25862 F20101221_AAAPAM subramanian_v_Page_177.QC.jpg
481c865732436f9e729695f46b4aa255
8fabaa7167c916c1a4d3adb971bc3849e9919300
6945 F20101221_AAARUV subramanian_v_Page_088thm.jpg
7c559db030ead054242b74ccb675736c
334870785f81379e4e2d42d8f36fc10218d10f94
1557 F20101221_AAAPNY subramanian_v_Page_252.txt
3b839d3569dc8c2516cecbcb51812208
bab9d97d579e06b43d93a2a5d90501d167b2f3a3
97843 F20101221_AAAQKR subramanian_v_Page_241.jpg
a099f675fc5a3d3b9368012ee7355a19
dd65e96cf2adcd73cbad92194a2f391016db5f55
405 F20101221_AAARHK subramanian_v_Page_001.txt
e92881895cce9d5b76db43239f0735a2
efc3f7a1be6c85e127979bae634ac6a903a3375e
357857 F20101221_AAAPAN subramanian_v_Page_059.jp2
4112d3ca8556cb47ed1d103b6a480cd0
833badc7ea9028ac3c691ca715ace27bf54aeafc
24561 F20101221_AAARUW subramanian_v_Page_089.QC.jpg
51c713538874e7a156029c1da0433852
c91d72274595eaa2601e8774a3f52450f0a92e3b
8232 F20101221_AAAPNZ subramanian_v_Page_014thm.jpg
c6ff5f97357ce4015649da302fb61d02
67115b0cf90332e504e5a19f93006d7eae76e8cb
102302 F20101221_AAAQKS subramanian_v_Page_242.jpg
2dae8cfdda668a29eb94a9e57ca88bc2
cd47b9ae679515e1093cd15ac36a8cb472db16c4
85 F20101221_AAARHL subramanian_v_Page_002.txt
93da1688ceee3a4ffaa58d7124b4d4f7
47ab3ffe870a11ea84c9ba0a5a13239338d3c0cf
4841 F20101221_AAAPAO subramanian_v_Page_079thm.jpg
97e8b312badcece35970f8bffacc3cdf
6f6ad6233219d12b071c02d891b62f58c850104c
25003 F20101221_AAARUX subramanian_v_Page_090.QC.jpg
5c8fedfb18825a75baf4c7724bb9110d
22fe8564877a326d2c2f3413d2530f3154d99aff
91251 F20101221_AAAQKT subramanian_v_Page_243.jpg
5e87cc34db79de6618969f972f0af327
9038bd652a337f6d8ccb5b07fa305220a7140af3
71 F20101221_AAARHM subramanian_v_Page_003.txt
8ab2497a4dccc80615e9f5bccf928c9a
6a61d01928d05518de82ec16c823cd7956c087c0
48148 F20101221_AAAPAP subramanian_v_Page_301.jpg
d5eed845bd8032aba3072c7e6a2d1d9b
ff4690c4d32fa5058d4062ed951460fe504bbb4b
6810 F20101221_AAARUY subramanian_v_Page_090thm.jpg
8a3884b39a7c45c521103c2a064133ef
54f4d46f903de93ea14af88ff808ddd5ce1184bb
100005 F20101221_AAAQKU subramanian_v_Page_244.jpg
cca55291aa323be1a37ea0e78c39ce83
ec8a6950fc58ff16d5f3ac047d4403b4c14bdacf
1401 F20101221_AAARHN subramanian_v_Page_004.txt
6448cd24b9b08fd3ebd8394ea490bd1d
4e39421b18b6d8681ea95f9c7dc3a06caffd78c4
37447 F20101221_AAAPAQ subramanian_v_Page_038.pro
24ac720f4890148c3270498b9b02aca3
2f85024c02dbc4b232c4c594a7ed0b2a9ca8e5c7
5609 F20101221_AAARUZ subramanian_v_Page_091thm.jpg
cc8645ed1c093a156df8b31e2a3696eb
350eb9b96f2a2f1fb417a5104c8206ffe37f6dcb
110103 F20101221_AAAQKV subramanian_v_Page_245.jpg
01351d007a1f25ff3f28c820d20a12a3
4318181412205c43e242bec041099f58fde049b0
3534 F20101221_AAARHO subramanian_v_Page_005.txt
5adbbe9612d9aacf1d2f05dfd6d20358
19f431ba442729a5cab5f497fddc65d9456c743e
20994 F20101221_AAAPAR subramanian_v_Page_225.QC.jpg
1477bbb03d02099ae63a54a4b2954cf6
67e57821de21012849166b522020b893720a1dd5
59963 F20101221_AAAQKW subramanian_v_Page_249.jpg
618f3eed180732e52b474a8a2da085c9
9a5747466149964b51d788e00d2581f3f84a5132
699 F20101221_AAARHP subramanian_v_Page_009.txt
3678cce1869cf8f6a2e0e3b3364f8c0c
22cfd4948fd5e02133ad47958e979b80df815856
27277 F20101221_AAAPAS subramanian_v_Page_050.QC.jpg
a9a21846b628f1ca71ead42821946af4
73398fc356d724172dc8e8a909c7d17e0cbf81cc
99756 F20101221_AAAQKX subramanian_v_Page_250.jpg
2a20144c2985a913f52aa985cc0baa9d
b331fa7c4615668f8a5061bb450621d0acd52974
1880 F20101221_AAARHQ subramanian_v_Page_010.txt
90a062fec76d893f2a82573906ad6345
afdc0aebdabc470bd8ffacb606926ef719a64e92
813843 F20101221_AAAPAT subramanian_v_Page_058.jp2
d5f9749b49eccf01eee170aef3b8ae13
65f764df330511c2fdb7ca1e74cde1a6c2ecb39a
79541 F20101221_AAAQKY subramanian_v_Page_252.jpg
0442728856a9b17d061eaf4b7bbac0a0
1873eefcb312246c2dd4ee7adc12cc7175d365e9
2407 F20101221_AAARHR subramanian_v_Page_011.txt
d90d999c2155daf1cdb533b499239b3f
628e93739ee4ca8a7a8f32648a72df7aa6107ec6
F20101221_AAAPAU subramanian_v_Page_015.jp2
68f580ce216165660d9e20e1b50f8752
8cbacc4223710a790bc7ff82ae59559c58416c3b
100387 F20101221_AAAQKZ subramanian_v_Page_253.jpg
324a96eef7f6a9f29aaa71a26b7055bf
36db3cfd264135229625c81924d77d6dec09cc7b
2573 F20101221_AAARHS subramanian_v_Page_013.txt
42c279041e48ea86abb33b2d06324428
cdfc4d219c585543719883c5728f2c6862fa7886
2634 F20101221_AAARHT subramanian_v_Page_014.txt
f43aefe97e03e0a8124ceac4e3740654
45ebfda9bdc117ca999af46c900061e4cec288f9
12792 F20101221_AAAPAV subramanian_v_Page_273.QC.jpg
51c383d94997928609f5a03a8b1314c9
313d77d55dacb63b0efe7f496f91920a355bd881
1051977 F20101221_AAAPTA subramanian_v_Page_071.jp2
a8e681b0fda3b9afd1d1fe3c9f98fde4
d900c1d57377bfc218bbfd4b7552be179002c13b
2723 F20101221_AAARHU subramanian_v_Page_015.txt
15db2cee6315611f8dfd94d201a9c75e
718122e6e018a48654270e1000b8aa0f284976b1
841 F20101221_AAAPAW subramanian_v_Page_142.txt
d2edf6bf26afeeb5fffef08dcd072648
262e7d3bf921216671b66c9238d1c8e9491ebcd1
5515 F20101221_AAAPTB subramanian_v_Page_220thm.jpg
3364cccd8ddda71f125f1752c5a8e48b
fbd9f9d00bb9d5fdfda400110072b457d332ddd0
2605 F20101221_AAARHV subramanian_v_Page_016.txt
f3b0792fa76d5bde1cbc70e59d96ff61
87c9f4f5976c3af8f6bdb6d8b08318934a0c4a8f
1751 F20101221_AAAPAX subramanian_v_Page_117.txt
f78f6c06b018f64b9153c6f6dc775023
841ae8dddfd360c85d0e9292604f26dfc06e31ab
F20101221_AAAPTC subramanian_v_Page_292.tif
65da3551c1df3ad82439b3af59b0a84e
c495711fb6cb169698f48fcd0d508c1b9b0b122a
1222 F20101221_AAARHW subramanian_v_Page_019.txt
ee4463ef51340df25b3d9f4c14c1fc3c
86228ef662a769cf27dd2b63310434c332014701
782223 F20101221_AAAPAY subramanian_v_Page_154.jp2
8ff1735007bbe2172f981c1c21a0d525
b29eed0302ed47418818329e6a87a866745438fe
103813 F20101221_AAAPTD subramanian_v_Page_228.jpg
c8834dff3f02f2cb4c5e5424cd196c58
0a4bdcd508a09b27db3d077a4aea50a17d587ef0
1905 F20101221_AAARHX subramanian_v_Page_022.txt
0577b6db2e973e7ef65c186ed21f512d
2c3a0e266deb2d5502a81aa04f088a56f0882311
3701 F20101221_AAAPAZ subramanian_v_Page_007.txt
8e1ea792a5406e22ef35320e66fca44d
fc40649fdf3391f2b2f8806381b5c6bbc837c42b
9096 F20101221_AAAPTE subramanian_v_Page_219thm.jpg
0038882b5a69a7fc1001ccb4b6ecc73d
a7e17ef5ad77e6da0154670fece04805c97e466d
2088 F20101221_AAARHY subramanian_v_Page_023.txt
f057886e6f159020bb1f872236b56021
40aa365bcae8ba4b092d41451ecce10418868cc0
7414 F20101221_AAAPTF subramanian_v_Page_125thm.jpg
c97b2d9edc6380ba3751d39017712fb4
d4b3a7fc4140226fb1ffa54de58f757fcc08d8a9
1862 F20101221_AAARHZ subramanian_v_Page_031.txt
2c5d97daf83d45d862cc174a162cd895
e1ba0a085797f33dfe3322aecc76e70c705f636f
1051942 F20101221_AAAPTG subramanian_v_Page_250.jp2
414ec7cf3198af3fb416945454bfaa4a
1da19429c188438983206dbff5dad7b866033b7c
F20101221_AAAPTH subramanian_v_Page_172.jp2
e9b824a699983c44be5812c7ada48ebc
ecb4967e1ee0af2e6bbef661f8c38b999b2a7f97
1051944 F20101221_AAAQQA subramanian_v_Page_185.jp2
d7ff3744333917420b44f7bee040bab1
af2f847d475846563c7b25317af5ebb4a296efec
F20101221_AAAQQB subramanian_v_Page_186.jp2
de244a399447a004f1b068fb601a423f
f70e053f853cc8ddd6d57874f5b614f8a4f24ee6
23887 F20101221_AAAPTI subramanian_v_Page_231.QC.jpg
995a42e8504eb4a6109ecb3ed2e0c51a
6c290bd30bccaf272c72935dd0635f074e84799c
765321 F20101221_AAAQQC subramanian_v_Page_187.jp2
075a69b6506f7412cd8e5c287a687a25
fcd3b3913d18601ac062415458d91228193eeff9
23432 F20101221_AAAPTJ subramanian_v_Page_304.pro
d456c71a2e66bbf8c7703b341ca7617f
0a6ce8082f2fbc2293dc55232b6aa5e322ef6dd6
975459 F20101221_AAAQQD subramanian_v_Page_188.jp2
f08ba5e3956dc6e2199ceda9306d2d27
fe825d1c16dab91f6d9ceb1147a50a542e65b245
7802 F20101221_AAAPTK subramanian_v_Page_199thm.jpg
d6f50b5446f506a8874ab9bd843925ce
17a7be5fb1e05f139b342db19da400980376294f
8530 F20101221_AAAOWR subramanian_v_Page_122thm.jpg
898f9d1a8f34b57a3760acf55db34dd2
502206baa0a18e0c6c0da9e14450166f0214fed3
33326 F20101221_AAAPTL subramanian_v_Page_209.pro
3039a8aaed9e33be55f57c673f7cd747
0cad0f87226e07203705747e6d5a0797fb66a7c1
F20101221_AAAQQE subramanian_v_Page_189.jp2
cc6f9b9b26ed4b47ede436ce753a256c
83c2ec72f1d2312ba674a09ac5e2a488ddff670f
28142 F20101221_AAAPGA subramanian_v_Page_216.QC.jpg
e1d9034a564ad0966dce2bfbc9a48aa1
98a22d0c635d56e294f3151daa36486ef064ec5a
26255 F20101221_AAAOWS subramanian_v_Page_127.pro
abd110a04700c7742be887c28058209a
91f3ad30f6e466acbebfcfb565b094d6b24ec2a5
34653 F20101221_AAAPTM subramanian_v_Page_184.QC.jpg
566b11a2368a8755491ad8e0f17ce1b6
15f2505989a1348a19d38686fe8be0b6fbc309bb
1050926 F20101221_AAAQQF subramanian_v_Page_190.jp2
c7507b2314915b8193b65509845b44ca
41be83e52ba478592507c13e4b5f334f94339553
7922 F20101221_AAAPGB subramanian_v_Page_016thm.jpg
61eb744e2a62b5d96093a9e3c8216001
6f08af68053d0d45d4d37d49c3b84e3f048bb425
52333 F20101221_AAAOWT subramanian_v_Page_282.jpg
8b0c97401c325fadb1d07c4a7a96d87b
88ce64393f23c498edd3ebcfbd9834c131b3fe50
94656 F20101221_AAAPTN subramanian_v_Page_221.jpg
50453657cf09084a83ef9507111ced3c
8b464fb325232c82c2171ae647233ce5b81c3ce6
F20101221_AAAQQG subramanian_v_Page_192.jp2
3d0deb4a94f06ec70e7a0178dfec4747
7301835e5f868b25affe9dd3983932935b9478d6
6372 F20101221_AAAPGC subramanian_v_Page_178thm.jpg
daca807731be2e458fc2eaa9bce92960
e5a5c418dbc7a70d33ee23224b76d37db74cd467
F20101221_AAAOWU subramanian_v_Page_009.tif
92e82b18401ee715a8280d5a1d4fe90a
587ad3ef27b0187081bbe77137e0a857e9cf99e8
1597 F20101221_AAARNA subramanian_v_Page_235.txt
e0d7d1d8293492d1a4945d177778fe16
f0a7d7d0268396856ee1a326d3f1bde6480dfc04
802257 F20101221_AAAPTO subramanian_v_Page_102.jp2
db453476be41b373adf92ce5aae7606c
0a69fc9db691dc8ae51fedfc69cae5763ed7f9a5
1051950 F20101221_AAAQQH subramanian_v_Page_193.jp2
203475f16e3ffb0a217b5913980b2209
5670bf61f6a29b69cc0d551feb128d23c8df8cd6
21331 F20101221_AAAPGD subramanian_v_Page_183.QC.jpg
fcf14063a8f790108f425d10098a0461
e6e0364cfc576b7437e77940826fa3bff8b1e5bd
93947 F20101221_AAAOWV subramanian_v_Page_010.jpg
6efd2ce6424ce441e007fe4660923c66
b504e6ac33f48b175de1746c71fa24fb9c03fc4b
1645 F20101221_AAARNB subramanian_v_Page_239.txt
3281d18d15b0d10cd0bd0f4ef3db72e2
b91b863ae4cbd173905934d65ff0a6c531a5f291
1334 F20101221_AAAPTP subramanian_v_Page_073.txt
e0c722d252713def569cedfd8b645e25
7d17dfedbc4d647a9ac54d0a54db9082ba61bf98
1051960 F20101221_AAAQQI subramanian_v_Page_195.jp2
076e15e9e317a3117cd484e49631cb74
889682725dbe6f37db33396cca71d87aeeb2590f
97880 F20101221_AAAPGE subramanian_v_Page_145.jpg
599c42a27372cc65e621ea0040f03e04
11edd9d59ae8470e6c094b1b80e915aa30c91f65
21765 F20101221_AAAOWW subramanian_v_Page_040.QC.jpg
e75c9b0734d8f4280ea929eee4de1562
95b2ffd6db5d92c2f58293ff7b0719f5e7438f86
1738 F20101221_AAARNC subramanian_v_Page_240.txt
5bf8a9b0c3c8c63fce21492d556cf7f4
12ae7f791873d9ad4e743fdd43a8b0af07b62fce
24778 F20101221_AAAPTQ subramanian_v_Page_279.pro
de0557a98bc141a962c871189beef3c7
b7556a939db070d960c61f6f53e2a3fcf8c0e547
F20101221_AAAQQJ subramanian_v_Page_196.jp2
c89de3f564f9a9855f0a7c97204d2e3c
e342df96c15310fe0c19eeca6a23e675eb0fb654
543274 F20101221_AAAPGF subramanian_v_Page_217.jp2
6870c1cfa1d59bfa9fbbce05521d32b5
50ad614fc9d62df233753fd33877341338e0aa83
F20101221_AAAOWX subramanian_v_Page_015.tif
5d5086ef4a78f0b761a5ace0671f3735
819ddefd9e48081850a3479c4d905c2b70b42ead
F20101221_AAARND subramanian_v_Page_241.txt
10eba13f9d4aef2d8b487a6b271eea82
b3f6d3beabe4ede7b9765267e2a68168c40e364c
F20101221_AAAPTR subramanian_v_Page_058.tif
35c079cf1fc94599b33ab2f35c0cb098
e57a231ff454afed33d80b8203f74e1eca04ce4b
F20101221_AAAQQK subramanian_v_Page_197.jp2
ec2e7bf690b085d4b5d848641b2b4344
3aff0fa96fa7de801d18720b65167ec7186a5025
31809 F20101221_AAAPGG subramanian_v_Page_190.QC.jpg
30ed574b0e64b9553e8e3c28faf65fa8
b64055c2ffabe49b29a91f0a2f95d28ca8c82926
1013711 F20101221_AAAOWY subramanian_v_Page_087.jp2
c3aff07b5faaee66912f9402f912e460
77678e4bf29e30e02cb8291d4dc5c15ea8c86404
F20101221_AAARNE subramanian_v_Page_244.txt
324ae8841c2a9fbd907d98faeef5eb53
aad21c89db8cc73ba4184b8c0fdef05bdea2afbb
23245 F20101221_AAAPTS subramanian_v_Page_140.QC.jpg
1a76f9b96bb1d5682aa9e47a2cbe46d2
54fe8a50d3f91bc2df4887d9f7fb9170a1fcc39c
745524 F20101221_AAAQQL subramanian_v_Page_198.jp2
03ef035d5d15d006e18c5d7398628590
479f39d99a5d79e62fa2b03e62b7ec8617677db5
94368 F20101221_AAAPGH subramanian_v_Page_197.jpg
a3b7eb132c410fc87e591f4056ec6e5f
3f6a526d5ce78fd067527869264b11e2a52a8cf6
F20101221_AAAOWZ subramanian_v_Page_075.tif
498787eae0d33fc55d74857769919ddd
a1e8a52fab978c17ad7984f53618a99074d1ee07
866497 F20101221_AAAQDA subramanian_v_Page_050.jp2
64024f108588d43debb1b45f2ee90543
1168306274dba0a78069d030d9722cf88b357d7b
1802 F20101221_AAARNF subramanian_v_Page_246.txt
c92316175ada3ad20b7c089b7bcb45ac
11e3b7b364fa3a333ebc3a8d8aad2b2b2703b2b9
70691 F20101221_AAAPTT subramanian_v_Page_286.jpg
b0ef909b5da008a3d12c72cf335aba59
667480d2b54d7c95ee72df096eddd7a4615b46b2
990039 F20101221_AAAQQM subramanian_v_Page_199.jp2
d2ca2c0c1b658e8870de26f5c389a2c1
b12cfa93dab6723e67ae2660c7c5c7ce81dd56a0
2015 F20101221_AAAPGI subramanian_v_Page_123.txt
63c134355c48103aee6e7b1211bd4b29
c27ebfff38b4fe2056dcc3d7c68be05546400be8
F20101221_AAAQDB subramanian_v_Page_148.jp2
bf689ff14ff4cbeeefea0a7e290a108e
8d215be841734553774cedf3ddf9a6a5ef0d0700
2005 F20101221_AAARNG subramanian_v_Page_247.txt
54844a84c2d7004b2a9cf502ab992460
09d74e1fdeaecef12637c4e6b74fe15e167d6570
36876 F20101221_AAAPTU subramanian_v_Page_163.QC.jpg
cb73ffb45ea0c619927b421bf97a9ae4
b583ce780506ab82718db6ca4c4f264d0cee15e9
977579 F20101221_AAAQQN subramanian_v_Page_201.jp2
aa4bfa3f865970605e09645881a1876e
cd703e99da8be9a1dbdc616fa7e0e06ceee95864
2114 F20101221_AAAPGJ subramanian_v_Page_167.txt
06186c81123f1d517d980b387be8b5c9
a6d27b9d8d2fcb0a419bd90ea580914ff7a6eb96
F20101221_AAAQDC subramanian_v_Page_092.tif
3b7b5cd02b6148904122eb8eff5ab983
c2988b565f6b3dd5319630268e5769b09c54daf8
1978 F20101221_AAARNH subramanian_v_Page_248.txt
1d1515f7133464433ed2a493ab575acd
42dd1e3fe81b7ed0c313b4780dcce65689bc2acd
F20101221_AAAPTV subramanian_v_Page_139.tif
2dde13a6fbb825f89c91934cee631fb9
bbce53c0d583d862fd873f462b2e003134d60bb8
975080 F20101221_AAAQQO subramanian_v_Page_202.jp2
c3742878dc3ed2bea3ba400d75a91425
23071fa2315cd05c3e8ed69a099156d58273175b
F20101221_AAAPGK subramanian_v_Page_053.tif
e468a35ce76e9f8e4423f659eb61beca
91f59951a6b0075a632d8a5521656ced7d798be2
100359 F20101221_AAAQDD subramanian_v_Page_134.jpg
98c12bd1e125c0663198c13f9b167c52
06927e9bfec7335eb53d5a13cdc452c8725d1711
1302 F20101221_AAARNI subramanian_v_Page_249.txt
d0892649ee1638d737c0af383a3cae5c
e3c4178f12213b1cc0f027e5a31af25336168d79
1137 F20101221_AAAPTW subramanian_v_Page_134.txt
4fcdc37911bd1d0e15b90188db4c1b69
36d9071d19d518dd7391097433481f7c6920664c
1051910 F20101221_AAAQQP subramanian_v_Page_203.jp2
080977aa5b3008e80e2f6acf07f1abf3
3405df775fb940bdf94b040651600e11ad8df119
F20101221_AAAPGL subramanian_v_Page_244.tif
4994b053961bb52d004b3a84aff88931
f1bc324d841a02d69966fcb5f551d7e7a85ddd34
27639 F20101221_AAAQDE subramanian_v_Page_090.pro
8302ca59226c6e56bd2b6498792290cc
41c5f916cd03489d78d17eea4dc1b867963c96da
F20101221_AAARNJ subramanian_v_Page_250.txt
d549e58f8b20465c00ee8ad2e9c2d077
3a0cabca5782bbc54030f4faba4dbfc22d5e6871
F20101221_AAAPTX subramanian_v_Page_003.tif
63d3f19c328e01298d0961c299aac4d1
f8f3c642a1d60a0137c9ea14d1989b8451f8404d
F20101221_AAAQQQ subramanian_v_Page_204.jp2
78d2ee49e7b2009a00737cc406cf804f
ee6eaa896a4e0e51fef43c4b6bbc07012019a1ff
627707 F20101221_AAAPGM subramanian_v_Page_017.jp2
f89b20c6447a4cba2dff7448a692eae3
1ccd96bc6e920f7cafab89c88371f72d61c03ecc
578202 F20101221_AAAQDF subramanian_v_Page_277.jp2
802f74c3866ea94bbeb00caf2169d7e6
498ac1b9116eeee70b0678405d110a88b942d3c6
1216 F20101221_AAARNK subramanian_v_Page_251.txt
0ae0d3fb11234b017667783658f0eb65
d578f891b42892a02ac50288475f1222a95cb110
8958 F20101221_AAAPTY subramanian_v_Page_047thm.jpg
c0d0f448bcef56313f8367a8ec09e682
b894ebe0d194ae3a5191b4278c07a4f5546fe607
968328 F20101221_AAAQQR subramanian_v_Page_205.jp2
d7b56fb330d3835f1b9c6b6c18f8d338
eedcd4fc073e883fd44d8752936caec538ff6dcf
1051915 F20101221_AAAPGN subramanian_v_Page_212.jp2
5c2b9682588ddfeb6c7f32c1d9d02adc
56167e7c6fb9e7ad52c9a0c6adde93da7916d098
34121 F20101221_AAAQDG subramanian_v_Page_266.QC.jpg
dc932f7af7e2c232def022338bdd7348
0544484af49341f35279db068dc999bb77b593c8
2027 F20101221_AAARNL subramanian_v_Page_253.txt
81fcb2641e514e252bf026981068a63c
748d3918254c45950cbac94e96edadb36c894459
3687 F20101221_AAAPTZ subramanian_v_Page_273thm.jpg
f31e53f7b00502774d940ae47df04115
2559e54fb9a0a742739977bba13635c860752268
F20101221_AAARAA subramanian_v_Page_302.tif
d7731b26a8e4cc31dee4e4b67f8bbb83
9a4804ce0ba7e75f1a0a65b2dbb4319e2f3c3234
926517 F20101221_AAAQQS subramanian_v_Page_208.jp2
ec3e05e18fb5e94b1561794172ab4956
a6457ccf1d1ecfcc34e466688fa8535abfd6806e
113894 F20101221_AAAPGO subramanian_v_Page_128.jpg
ff106ee006d927f1b947f5ccb2299f4d
583230e3597ce3699f2919b12a56412f5552cccc
1881 F20101221_AAAQDH subramanian_v_Page_213.txt
02f2fedc673ced484b405c1308066aab
4aa51ff7ac648be31c7841bf6425967a258a3351
1449 F20101221_AAARNM subramanian_v_Page_254.txt
440cbdf9e09de29a0047da672fb38bf0
268036a29543dd3440f56c65a6312f4e160daccf
F20101221_AAARAB subramanian_v_Page_304.tif
58ca4d3f5e03b6724a090a6b934435ff
986ff6e6115b68788f8e32c2ac361bc645cc44c4
F20101221_AAAQQT subramanian_v_Page_213.jp2
85fd93c8c03dce1e68ccea20e76b556c
da72bcfb5e9cef8579e951aa9f9fd5721b21b26c
74082 F20101221_AAAPGP subramanian_v_Page_008.pro
bf167a31f1d84711bfe1e16da19601b8
e519a75fd04df0409f876062666694eb1cc216b2
99815 F20101221_AAAQDI subramanian_v_Page_051.jpg
323d554477632e16accdba9a9e37359e
677abeb933ed34f60218d7296428a26f9584677e
1580 F20101221_AAARNN subramanian_v_Page_255.txt
c62fb9ccc7f3bfc2e809a03ce66b3899
a304a55cc3b36691f662f42e6cbeb697dfb340ae
F20101221_AAARAC subramanian_v_Page_308.tif
7de0711a66399e49ee8be235b3b7c852
d3b836e852f46d6906be0538213863432e342959
717202 F20101221_AAAQQU subramanian_v_Page_214.jp2
5fde0f15fbfe38b432080ca6d1907e0e
8aa7b8aaf321feb6611c2b053e2260ebd82d5200
15262 F20101221_AAAPGQ subramanian_v_Page_307.QC.jpg
421b879fba37be1b00a6e4fd63119945
12b49a0d8458a223e37b7e1355293a072f897e35
1700 F20101221_AAAQDJ subramanian_v_Page_119.txt
3a279a3d175a75f142b6a24423c1bc13
10f776c3e2d1908cb59dcc1b472ddbcf8040e88e
2484 F20101221_AAARNO subramanian_v_Page_258.txt
1f612665fe7194eede586fa73b98bf39
f4c12ac45a0cbc544e474d6fdab62e4170244b14
F20101221_AAARAD subramanian_v_Page_309.tif
9aed4cc67af09ac0451a591ef5c36727
8f2f4aeef75bc9215e60ae6fe23eee6365be7194
1001379 F20101221_AAAQQV subramanian_v_Page_215.jp2
981767b3b1ace616dff5027bc542e633
52a9a62eadc6210f4f8d15620c9d693e8cf4feaa
F20101221_AAAPGR subramanian_v_Page_298.tif
c97dafc4daf3ec2a416d17520b624663
ffb967c5ebd917485ada1d462b1d8783844fb7df
206 F20101221_AAAQDK subramanian_v_Page_180.txt
33fafda861b34d92e94e838876fe5f6b
508111f11b23dd99567425e51de70d6466672ba7
2384 F20101221_AAARNP subramanian_v_Page_259.txt
a2a32e2c0c8439032fa18e1dcaee61c7
4ec8259cd4676bd2ef6f01a13c98652cf5de456d
F20101221_AAARAE subramanian_v_Page_310.tif
c3af1a1c9a31e70f1e8a71d8b0c29c9f
a2c6c07cb308096a54855e74fc1184c675ae62d4
765179 F20101221_AAAQQW subramanian_v_Page_216.jp2
31ff31bd9ffee000330babe2919d3e17
6abf93e38f2aebff4a413c394330522e3fe8cf06
F20101221_AAAPGS subramanian_v_Page_158.tif
50e8c00c432118e6ed173cbd1b8ccea8
640606fd96ec44cad1c8fb553383931b4a610f63
32987 F20101221_AAAQDL subramanian_v_Page_152.QC.jpg
cbc71e8a7fcb1ca315ac89932b0ae46a
7dd675710db5a987d5f2ba921042e16fd6a7ef20
1753 F20101221_AAARNQ subramanian_v_Page_260.txt
737e64beeaf0ad594464b969e3684c72
bd635f5e3f986d71f2ae7f430728f938e79d4632
F20101221_AAARAF subramanian_v_Page_311.tif
f89ba57fd304225080489a73cc35a302
c49aad8d90d301523446ca2e3a162ab6312cb958
1051955 F20101221_AAAQQX subramanian_v_Page_218.jp2
bb1d4c093aa519b105b1c33b95d2f90b
1899871c69648a5e626d92ec10aeb6d7cee40b70
26289 F20101221_AAAPGT subramanian_v_Page_063.QC.jpg
e5b3fe8aedeb0e7863cfe6128438f720
809dbdae588cf50f8178ee7a0c5b473fdf9e120b
9102 F20101221_AAAQDM subramanian_v_Page_022thm.jpg
6b05aa84052d422334c06d8810d8d193
e8ac98af2bbabdae54b6b4183b47001a596dd331
2239 F20101221_AAARNR subramanian_v_Page_263.txt
0b5af56a442ec52d4f09b279a6afb39d
114011c16bf2f0f6d362be0fc379106f3e56ad48
F20101221_AAARAG subramanian_v_Page_314.tif
036c4dd3bfd162f1530d3d834c05f0c7
f220bc88ca02982a9d8ba412ba6f40ed20f37d17
1051957 F20101221_AAAQQY subramanian_v_Page_219.jp2
41004c273e5da1c773bbe57fb12fcfd1
c6e38df5eb43f72565543a0cc3c25f3a569e5c15
2180 F20101221_AAAPGU subramanian_v_Page_245.txt
ae9a99a27b4b60d7f57750a56aab7e47
caadeabe71ee78e30c52437eb86d8f4a2f7563b2
36817 F20101221_AAAQDN subramanian_v_Page_115.QC.jpg
d67cf8c1648641b6e4f3f151cf06719e
b14fa377d545899ab1af6008aedd5416d3e686b5
2090 F20101221_AAARNS subramanian_v_Page_264.txt
f334626424c71b2eee9f41de94596821
b1fd29e459df150d7877e8698b76b4fe5559c04e
549 F20101221_AAARAH subramanian_v_Page_003.pro
58e4785eb3e5bded694c035c5bfae2b4
e035e62204d3379deda493a763ee79b91cafc23e
694301 F20101221_AAAQQZ subramanian_v_Page_223.jp2
341797dfbf183aa1d856ef2651eccdf5
6da106457f7e63d2ccb8f9de258c2ce4b1d7b457
8042 F20101221_AAAPGV subramanian_v_Page_202thm.jpg
41742aebe1fd4a2151247ae889bf04e4
e48e1a885a01a05782938e0fa8147281a2f5e7f1
1248 F20101221_AAAQDO subramanian_v_Page_211.txt
5b2805a02dac4a25e0ff798359ea559f
b1c6a70fa7553e361ec66a24661d5f1f61474ff1
2051 F20101221_AAARNT subramanian_v_Page_267.txt
b3587732dcabc7f118a757d38eca4a5c
00f87b9759eab77cec858e9aba44b1c5ac7ec906
79053 F20101221_AAARAI subramanian_v_Page_005.pro
ac4ca06e3fe796dbf4ca1c57cf0997f4
71bf8ac33b672ad1083b91ae19743d91a5073792
30373 F20101221_AAAPGW subramanian_v_Page_198.pro
0c67913cd56a2d2b7adc7ed3753d01f6
337b349f2aba434b88727ef8994062c5ec869c12
97787 F20101221_AAAQDP subramanian_v_Page_235.jpg
db705c28fc9480e4535861535292164d
c410f85ac60210a4ad8e1a65b93c04b94fbc9033
F20101221_AAARNU subramanian_v_Page_269.txt
5efcea6c9e10c521cf84ec979acc032b
6a6023849012bb2cd465fdf5fc4e90591f3361f4
96436 F20101221_AAARAJ subramanian_v_Page_006.pro
54545b95daf874d75cd2647f32dd6e55
336ee1c34b746e56420b33079b478003a964fe51
6562 F20101221_AAAPGX subramanian_v_Page_254thm.jpg
53e5844bb1299c34e86de9d559a9bdd9
e7f0a7886b48efa7d4d45164becf91cad285d6e9
100989 F20101221_AAAPZA subramanian_v_Page_248.jpg
003b767bd63fb30cae4d85754931fd0e
dd19fe35765480c9bf403ad23a3d40fbad17d780
23540 F20101221_AAAQDQ subramanian_v_Page_301.pro
bdcc955877d6e96cd4d68c23fb8f702b
d538a1cb166e1c4e7f1ca27d548084d77a8d3be3
F20101221_AAARNV subramanian_v_Page_270.txt
830dd8238bae59955e40e9d1e947eeef
9465d701b7437aa0a46e4dedb847280ae1ba0e8d
86891 F20101221_AAARAK subramanian_v_Page_007.pro
86a70e3626b3d8ce5e9b1eb509e064b7
46e3dd3ddc45adc6f20f2788fa9d931d21cc0ed9
30338 F20101221_AAAPGY subramanian_v_Page_039.QC.jpg
9eb121cc77befd16a051c5731114167c
226eb140f61982c0927447f706ecd614601ad1bd
952 F20101221_AAAPZB subramanian_v_Page_290.txt
304adf8859fdfe820fdf9d1205610c9b
38dda4f209a2ddc209388d2dfddb805771f5ecfb
29629 F20101221_AAAQDR subramanian_v_Page_133.pro
7bd9d055a58e932ccf7615be7545d410
dbf3e18156f7298ad9a7b04bef585bb14cd87675
715 F20101221_AAARNW subramanian_v_Page_272.txt
aa1677b7f7e817ed28a2e5ad262bc8c4
7d4ef23efe5d83396bbca74bb1376d7415343c8b
16959 F20101221_AAARAL subramanian_v_Page_009.pro
6a6aea54706acbb34e67efbcaa381fd6
fd3a28fc4f24b707d4a74f0a7d79680ca74666b4
6756 F20101221_AAAPGZ subramanian_v_Page_080thm.jpg
7e03a9e2304fee20907c2b86bf1973fa
0ce11f393893ce400a12603d8057b4b7167f011f
F20101221_AAAPZC subramanian_v_Page_013.tif
032e2ed9390ae03ba6b7d68cfd2a4a9d
75840902f1a1cb2516409bf95da18a756c8ad8b7
F20101221_AAAQDS subramanian_v_Page_081.tif
a95ee6f02c254e8b08737d322113634f
db282575df6ee51f78c02205809bbbac16957aa3
679 F20101221_AAARNX subramanian_v_Page_274.txt
5242e866a8636efe70c5bf2bea89ebe7
7f94d8328037c216ab1f036c809e4bf898b5902a
59014 F20101221_AAARAM subramanian_v_Page_011.pro
e446d0abb238332910e184a83c7852c9
91b8ac121b42240e5cd97d521649d59dcad707b1
14337 F20101221_AAAPZD subramanian_v_Page_078.pro
923f41bd4229d925846501183c93691f
1b517597121b45d1790946359390b5191807d817
50824 F20101221_AAAQDT subramanian_v_Page_266.pro
aaf45d12c21b4e4db92ba50c3524f6dc
7c6ae78f4abe115be0cb4c04b266e56695086e92
1408 F20101221_AAARNY subramanian_v_Page_276.txt
d47a28717c769b08552935a95da1c22c
6720da1105f0ccf53470b876cd66afb2cd7d84f6
65050 F20101221_AAARAN subramanian_v_Page_014.pro
2fe4000ae06f62ad303542ecb2140e55
a7e32e29f17114448f7c0192d675de511655186c
7883 F20101221_AAAPZE subramanian_v_Page_234thm.jpg
d77b0e5929b1e87390734ec38121c9d9
990a579e6f775aaa6a75f19b32df400ce01fd6fe
46650 F20101221_AAAQDU subramanian_v_Page_052.pro
41beb04bd88ab0016c76c8f3b2e9c931
5af8b787b4c5c9f26c41a7f2316b60238af376e5
1209 F20101221_AAARNZ subramanian_v_Page_277.txt
67ceff6682e95a26db2eef0a94925872
1c5f50134a7442deac1f9ac5500a3452aa95d053
67442 F20101221_AAARAO subramanian_v_Page_015.pro
da930f582563da208a9e375411a3f441
a6a4e4e2354f25d51773c1768ffd38949b039593
F20101221_AAAPZF subramanian_v_Page_207.tif
ff10d305b5caf57af83c3cc9141a6ab8
0a5f7b1e4adf95559e34ad3716ba98769c4c9ce9
1096 F20101221_AAAQDV subramanian_v_Page_224.txt
29d262ab7ed5755a94b755aa8e0e52ad
5c1cff0e534280cc241c258b17e9fcfc61e8e05c
63687 F20101221_AAARAP subramanian_v_Page_016.pro
6e507087893f4f9b9adfa83379a74b59
e081bc968006e2a5b37cfe07cf767b192abb7128
F20101221_AAAPZG subramanian_v_Page_017.tif
6d787d3ae4c4c0cc997939f11a60dba3
542d5a32c0a6f88016767e7dd7f1c9e19c98929d
85651 F20101221_AAAQDW subramanian_v_Page_125.jpg
d61a3137a901a22d63202c1f72a7c3a1
597be18847bfcd4a5d1f6e03cc0c453ed4028980
F20101221_AAAQWA subramanian_v_Page_140.tif
80d84f2f2bbd96f2349c6c57706ea0d2
0196ea9d39d9d9447cb630a07eab2327d5d0384d
46160 F20101221_AAARAQ subramanian_v_Page_018.pro
9df6d612e0ab612e114d3debaa4cd6b3
1843dd068c4e13732f2dd70ecfac5c29fe7aa9ca
7595 F20101221_AAAPZH subramanian_v_Page_143thm.jpg
6581bf6a1c751d4e1ca0982ff7cf7b92
6a5661c02d0a6c381fdeb441c2fb1cc4e1be9eb6
F20101221_AAAQWB subramanian_v_Page_143.tif
f6d97df859b648d6df483105f64a1410
9e94ef5590df563409ffdfeb3cd997c8f47397bb
30644 F20101221_AAARAR subramanian_v_Page_019.pro
b4eaf2cb2c6096efb020cb7c875a90a2
7278c2fd47ec6545c6f77c412d6524f334c1e670
63425 F20101221_AAAPZI subramanian_v_Page_012.pro
86740dc4e941b10d6dffad4ae1e2eab7
adbbb6f2476c9f89d9ef7bc18b409b3d6aaffcbc
35236 F20101221_AAAQDX subramanian_v_Page_114.QC.jpg
55ddf2046e23b01b2724574ba1a9df78
2b0ba65d02731b2a7eec7168d023df99db8ab303
F20101221_AAAQWC subramanian_v_Page_146.tif
9f5b96c7252735f26bb85c3980791609
73a0b2bc050d8fc423c39cd1871d11f80fcef43b
50082 F20101221_AAARAS subramanian_v_Page_020.pro
d2907410b619f9473630cc382eb15780
26333ca969d6f2227d68e1896e58bbf03c82dd2c
8669 F20101221_AAAPZJ subramanian_v_Page_164thm.jpg
6cb116c4479a5227661a915b7b2baf69
00a375aa67659056cec0f24aa44a4100444d96e5
35533 F20101221_AAAQDY subramanian_v_Page_245.QC.jpg
5fdceac2a4bd7b22783e5e6a7b00e7dc
2710b24f0d345c647419d27bd61ac2ec27462e85
F20101221_AAAQWD subramanian_v_Page_148.tif
fbe76ad6c4f9cc638c0885afd08b6776
c48fbf9341b4f58d74bca0d537319da46acda626
19175 F20101221_AAAPZK subramanian_v_Page_076.QC.jpg
ecdbeab32fb145b0dd6e9480fcfe117e
67521613b151e52726c7e4890da1ef9a0295a760
19763 F20101221_AAAQDZ subramanian_v_Page_057.QC.jpg
db0edd0f844544bf0a2a5971a33d3844
df7c1a646775daf74480ff85865242967cf57870
F20101221_AAAQWE subramanian_v_Page_150.tif
7bbb07d088d2cad25c7fb05994021a81
9375853b7ba6bc951a5cf8488f69a4d9dc0450be
47077 F20101221_AAARAT subramanian_v_Page_022.pro
0eff9aaff645f3c97621b4dd54cdb93b
7a61a00e5b8aea1b4e7c9761c00f6f3b060e0229
7188 F20101221_AAAPMA subramanian_v_Page_108thm.jpg
7be969206c3a168235c883be06f1d509
20eced853ecc526bc224865677bb12baf832c4b3
87130 F20101221_AAAPZL subramanian_v_Page_111.jpg
d08c1f12eee077999bf19646da76e51f
adc5df14289317f938bb739592411630a6f06177
F20101221_AAAQWF subramanian_v_Page_152.tif
ba334dea9b3df934df03b237bab27d77
91b839d127029177c8d2fa9ac5275b93aaaf2e45
44497 F20101221_AAARAU subramanian_v_Page_024.pro
f271b431476167e54bef6703ea3bfcb7
abad924e566c388b15d08411a5b0068e90f7c613
1015675 F20101221_AAAPZM subramanian_v_Page_062.jp2
92f25b96ab36fd28760fcfa4dadbf296
4e50a141885cf80994a95f7436a487a25e6a3a75
F20101221_AAAQWG subramanian_v_Page_153.tif
6da9aa71ed482f96e1afa5824a8faa08
c85daada15f552ced6a8a1647bb336e5b7578686
28001 F20101221_AAARAV subramanian_v_Page_025.pro
1fbb2986f3b7a7742dec2dee838fdcb2
284aa1eb9f5f294a98500697a65b19e667d5e97f
5100 F20101221_AAAPMB subramanian_v_Page_275.QC.jpg
663837d0be239e636b9e23e46db20bae
2afccb38d329027c845fbc5cf2c780d516ba1933
108923 F20101221_AAAPZN subramanian_v_Page_264.jpg
12419f293acd562567cdeb9151dc1586
e63ba88ef26e5141e0026a3781a1683a3c2c91dc
5833 F20101221_AAARTA subramanian_v_Page_040thm.jpg
db9fc3edcd251c16cf32941d5da7b4b9
55366bad0bbf2396373c07b317a83469c0d15b31
F20101221_AAAQWH subramanian_v_Page_154.tif
c81479b3c4ab7bcb9750fafa3ac18c34
7b8e6327f4f0ed45c83b70acd948698f38b444fe
22193 F20101221_AAARAW subramanian_v_Page_029.pro
195a5e44c2c5e807d369c5670d7ac910
554377a41581ebe02804cc8ca034f969e939be82
F20101221_AAAPMC subramanian_v_Page_277.tif
63157c5951541672e90b1e1038ed4e82
53e6d2986a2e4d2808ed73ed2b46b403932afa2a
8261 F20101221_AAARTB subramanian_v_Page_041thm.jpg
aa44fffae7a042ba4cca8a6e2cb8c0e2
b59317040cbf9ab928c479f1dbeb6808c1e20110
F20101221_AAAQWI subramanian_v_Page_155.tif
6c789aa5d5d0d2ea1de526fefe12d30a
7105ac4b66dfafb89fd228badb63308b2272c48a
46823 F20101221_AAARAX subramanian_v_Page_031.pro
43e2ca5d3901cf8d4d0e8950a1739346
bc4f40dcb996c2b80cae00a8bcdc8276230e8ff4
2188 F20101221_AAAPMD subramanian_v_Page_033.txt
86beeabc1e9143e49105b79a9f4f7386
8da22f3bf4e43af735766cf3eecda41201279d39
765051 F20101221_AAAPZO subramanian_v_Page_068.jp2
8f9d9b3d6e27fa7d178905ac46f5f37e
1e9667409072120185fbe70905363f01d2510b74
33377 F20101221_AAARTC subramanian_v_Page_042.QC.jpg
592bbbdf0ce7d87eacf88cd8cc7b7312
2ebf9efdf0ad54555e73c7657ae061d8ed36ea4a
F20101221_AAAQWJ subramanian_v_Page_156.tif
f6509a7400dc602101aca42b105caac0
bfc4eca10394528bbe42df7aef23b9f03df1962a
55828 F20101221_AAARAY subramanian_v_Page_033.pro
58a7dad7b75e456ce2f1aad8d6567a91
873201359955bd9b61b8ce2252f45b3d6466e2bf
616 F20101221_AAAPME subramanian_v_Page_092.txt
0217e13e3499757178a112a753795bcf
5e2b0f316d66547ad6614dfc78e8861dedb91022
8466 F20101221_AAAPZP subramanian_v_Page_226thm.jpg
a60c1854109aac46e8d3ec5daa5850d5
5041b25db3caab7431b09ccf8505e18278ce9f8d
8510 F20101221_AAARTD subramanian_v_Page_042thm.jpg
35cff3513a2777e1a8da0150e4b9b11c
4e1111f8e2896f53a5c9e20505fb59f6dda7924c
50103 F20101221_AAARAZ subramanian_v_Page_034.pro
f45a3d62e14258a87f7941ca4c4dd673
8d293475f7b3ac525bd903bd7ccdbf9d3f1ec8f9
2016 F20101221_AAAPMF subramanian_v_Page_242.txt
a400ecbd26a111112446a8f162a038ac
6d8e30b10012016d723b033fc81f75a799f637cb
1412 F20101221_AAAPZQ subramanian_v_Page_133.txt
a2e4e6633165e43e4f6685fd335700ee
0bd7cd8fc2a6a1ffd127f0c0759134914da3190c
8151 F20101221_AAARTE subramanian_v_Page_043thm.jpg
b3487ca4a63820a9aeab04c05b20442f
16e2382e3ab879690c6b2951970958076f6c80f4
F20101221_AAAQWK subramanian_v_Page_157.tif
1d2e9861392be91acb89b8cc93b7187e
bce827be438f5ea02b37404564f724fc1164d687
19548 F20101221_AAAPMG subramanian_v_Page_308.pro
564c9bd47964b2fedd06f919abda1b23
6026497b8917b5bced4c52005c4d50421fb4e821
43316 F20101221_AAAPZR subramanian_v_Page_240.pro
7afa3202c6a89e7aefbb0b5895f0e99d
ccfe039d7f39b2cd30d7afa270c53e575072eb20
8620 F20101221_AAARTF subramanian_v_Page_045thm.jpg
ee0c2079c63dc3c9e719472eeb78b168
ad8b2741688eeb16ef44aba8fc0b88c195693f76
F20101221_AAAQWL subramanian_v_Page_159.tif
42b829dbfb866f0f97a21d14a9b7e4ac
20155ecce9894f59d97253d735e0890dcc91150f
1572 F20101221_AAAPMH subramanian_v_Page_027.txt
90d3363ff52940713796be6add553f2a
ab50059f2bfae2e4aed4c92565c184871f02989a
105961 F20101221_AAAQJA subramanian_v_Page_169.jpg
d2a39e790e35b83f26498020266828a0
f2957481cb1f9573b9631c993dce23abb880e2d6
F20101221_AAAPZS subramanian_v_Page_101.jp2
f85c033cf24b97f96406f8590e39c27a
9b836cc1dafb1d9b36f2901cc81dc431773fd3b1
F20101221_AAAQWM subramanian_v_Page_161.tif
73b95c5b5165bd114165df8b95e56216
99a08e9ce60eb00ffd4bd2ac737526011cb42366
28687 F20101221_AAAPMI subramanian_v_Page_182.QC.jpg
069a09949b317b96d36533346449183c
afa07ca5299c5031e83a1fb1f4b4a9a3548e5cd0
114261 F20101221_AAAQJB subramanian_v_Page_170.jpg
296753d96b6a430957a0494c60b5ed25
3b4fc4ddfa4e6000310736acfcd5c4848b2049ba
8514 F20101221_AAAPZT subramanian_v_Page_228thm.jpg
fcc0da75667216cfceb7718696252c39
73b3b2dcc543d97482b1c3b7e8cd9ebf20f376b4
35420 F20101221_AAARTG subramanian_v_Page_046.QC.jpg
ecca6acb10935bae4bd16bd82d636b85
84eb416481186c73796f03300dfdad91fd936cf4
F20101221_AAAQWN subramanian_v_Page_163.tif
76ad78423ae3766c39f2419edae144e5
65af4d751ff585a5984b47fca09d9502b27e2a45
89544 F20101221_AAAPMJ subramanian_v_Page_202.jpg
265a6627b87959e986d8ba1eb0ec1f93
4eef4c82e5fb6c5953ec4bb076f914c885657644
105373 F20101221_AAAQJC subramanian_v_Page_173.jpg
0a2fbd305fc5285db37d2a2e229d3618
5e66a8e196c2743a2d1c02932763ee13943a04b6
3529 F20101221_AAAPZU subramanian_v_Page_292thm.jpg
068afe457fa45ff7534bb94ebdbfb3c5
b1eabdf0012eadea4457fd8a0ca1cc778f4e7995
36584 F20101221_AAARTH subramanian_v_Page_047.QC.jpg
e7e49de78a61c56344a818a10394374b
544d65132a8229790c1d6dfd5c348d57a84827c4
F20101221_AAAQWO subramanian_v_Page_164.tif
6592512a2f8227c6795b137e54797bf2
be7b800f74ed8f7774afd11a2cab3524b46ca9b2
8648 F20101221_AAAPMK subramanian_v_Page_046thm.jpg
574bc1434047d8e33aad22ea266ac68f
f4054de4dfacb518848ba0b8cfae24d1eb5c4cc8
73541 F20101221_AAAQJD subramanian_v_Page_175.jpg
fb43326e44263f12cc3291e47e0004ee
9aceb32e00f3f9c28ba8df4627bd8adc957a206d
2386 F20101221_AAAPZV subramanian_v_Page_162.txt
2b4ba84f365c309bdf7de2bba151d1a1
35836c2aa500c8abb7aadf2a6a621b2c5ac4bd29
6374 F20101221_AAARTI subramanian_v_Page_048thm.jpg
3dd35c5839dae7234aa6040e9f98c29e
4557c19ab5803dcad4f1891955df3631d30635ac
F20101221_AAAQWP subramanian_v_Page_165.tif
6826ee8927e053b61334e4438357b41f
5b828f12a183950670527527066484394e18d5fd
F20101221_AAAPML subramanian_v_Page_263.jp2
2b4635e85e9148f6ee24ddfd175cf421
2991afb4ef567657779909e05ffa727140ca1108
77063 F20101221_AAAQJE subramanian_v_Page_177.jpg
9480e61bea632fee16247a05fb2f4ea6
f886a79dd6992ae23a359d57f97c625dddffa9ef
F20101221_AAAPZW subramanian_v_Page_288.tif
eb41ad920bcd31c22ad3ce58f653d79f
935dce14c1d012a41bd35ff2ffcd47687c8be2d8
38149 F20101221_AAARTJ subramanian_v_Page_049.QC.jpg
61d69e776c56c9c531f4c2b17b577f81
c4b7039449d3df2215372860a5dde421c0a7f8b7
F20101221_AAAQWQ subramanian_v_Page_166.tif
b6a7d4582ddcb9d5bb818eef152dab0d
371e6c1c4c3d45b083741a033d0139affd828455
F20101221_AAAPMM subramanian_v_Page_177.tif
8287680624f296bb494476a5a22b47bc
3b56d7fe9f28bd5a4b17bd39918362887f34b876
60562 F20101221_AAAQJF subramanian_v_Page_178.jpg
e0d6372e682d4dd7be3063c887f0fa42
5e9c1fcddb4d3665ab988d7660a922864c0c599f
105228 F20101221_AAAPZX subramanian_v_Page_129.jpg
1ed3c45dde7ed976946d28cb585ac4c8
c11c265829bf87183941b85fd468e8015ec4f966
7372 F20101221_AAARTK subramanian_v_Page_050thm.jpg
3f7d86782d9c2502ff6ccaacad65baf6
0e71d81bdc6ab02c1142ed2a80978a9f8679ffa2
F20101221_AAAQWR subramanian_v_Page_167.tif
f5b66fd60c2d8714bac0ccfae2b1a207
89ab643ac92e98b6ea5cbbec87c2de914c63defb
1051883 F20101221_AAAPMN subramanian_v_Page_191.jp2
01d38dcf6a7f9f1ad2446d61589ed806
3747d255efcf3a06eeab8b968b35f0528bbd8071
114697 F20101221_AAAQJG subramanian_v_Page_179.jpg
b70a4170a6985f325ca46cce7f518043
2ba36eba07f6ee02829e979fe222cc154fa1a3ae
1024926 F20101221_AAAPZY subramanian_v_Page_221.jp2
ade16bb88c088dec532235c4edae4950
37fb9e211a75864739d293f09fdcd822c0c05d37
33880 F20101221_AAARTL subramanian_v_Page_051.QC.jpg
5beec1407c966089d740b4f5386fd294
8c9a8b455cf222626f225dfd3a9ec48866d555ab
F20101221_AAAQWS subramanian_v_Page_171.tif
c1683a04eeebe995db81777d859d6497
8ba12acdca470fc5d337dc4e288ffcfac99ceb79
F20101221_AAAPMO subramanian_v_Page_262.tif
d1bcc962bfddc5c3af88ba6411e9f9ae
c820c0beb1dce5a018c65cc7086bb9a6a9502317
88685 F20101221_AAAQJH subramanian_v_Page_181.jpg
6dc8b43f966bebc4170c05d90db06ded
b93389e63d153ad677d10d47a6739cc4f56a9417
115665 F20101221_AAAPZZ subramanian_v_Page_107.jpg
747457278603376fdc523f64404c3c6d
eb6bb62d16e7f4ff7df8da84e32ad4fb8665d415
39280 F20101221_AAARGA subramanian_v_Page_255.pro
08964e5678aeaf5d36ccd4a83d444e16
1c0bc837fca459e90f5293e5192792d6cd50900b
26313 F20101221_AAARTM subramanian_v_Page_053.QC.jpg
e25f63130ecf1f2a881cf9c1f8c20a33
13442952a0a6ae7b5c51274fddd23990ae683c8c
F20101221_AAAQWT subramanian_v_Page_173.tif
4433998496e1fb2b7bdf4ed224d755c0
b0027042c04ec363538f64a95011b28f957649b6
10467 F20101221_AAAPMP subramanian_v_Page_104.pro
24150f9f53a68a45208e815efba93d89
296f45d143d5afe2a591421e2dcc56d00db56d71
93382 F20101221_AAAQJI subramanian_v_Page_182.jpg
98d68a945e76972db7dfc4f9aaa8c70f
e56022870c0cf2a78edcf39f446dc1a16236ca88
11473 F20101221_AAARGB subramanian_v_Page_256.pro
c90cb076cee19da714a1edaeb196baab
fc1fbce90fca813829fb2b388f114e1c3f62270e
35291 F20101221_AAARTN subramanian_v_Page_054.QC.jpg
786d3c04751e5378766a7e1a168e0bde
9dd3b33ad17282dabd818796ccbd414c39a83d3b
F20101221_AAAQWU subramanian_v_Page_174.tif
7e4812973608c6322de56fd872ddfc3b
7f8392fa9ba80dc6572525aaddd520a41192eb4f
561316 F20101221_AAAPMQ subramanian_v_Page_220.jp2
26b2af61b5ad7181e98d4483d617a412
9de1711d0d63e62f96aeb1ab4c277bc2ad72743f
59210 F20101221_AAAQJJ subramanian_v_Page_183.jpg
c936593ec554c5b564df4d1a3c37a1a6
fd50aff45d9236c61ac4c9a898dd45ef86297a02
61898 F20101221_AAARGC subramanian_v_Page_258.pro
e03a05573d1ec3872d5c922e53b3de8c
5d027cf5f9e77dd4c9ff040188b3aa21782e2da7
7929 F20101221_AAARTO subramanian_v_Page_056thm.jpg
779ed87ba183c31888b7b964be8bdd09
9305276dcc07e7f60183942b248c68dc4a3095c7
F20101221_AAAQWV subramanian_v_Page_175.tif
d1967b2115121b10b1ba2722f785f904
a543106b6514545786e5c9605bb05ba0149ffaeb
F20101221_AAAPMR subramanian_v_Page_037.jp2
a595952b81ec79858379f5f53cc92495
73c6af62bd5670877ad297e95d7ec2c909cbaa4b
94992 F20101221_AAAQJK subramanian_v_Page_186.jpg
52096b064e521059ff52dd543422d9ee
9c525c88b34400418ad57492c294cd44816edce1
59923 F20101221_AAARGD subramanian_v_Page_259.pro
822f94fa983f582e91487b02fa125be7
e1e640f4c3dcae38d6359bd2d83c981ef3e41b29
20461 F20101221_AAARTP subramanian_v_Page_058.QC.jpg
600165ce82b141caf72da9a72cb3fadc
517361a9303961f7bf6f6f227f9dd1c100a081c6
F20101221_AAAQWW subramanian_v_Page_176.tif
23448007d1480d82506a5423dda416be
417dd0d8e332e6352574fb99a70a9c35b72aa4da
31930 F20101221_AAAPMS subramanian_v_Page_055.QC.jpg
8c26113297967564845b6b9197423eda
12e726484ff7988221e49ed31ddad11b0b7817b4
87175 F20101221_AAAQJL subramanian_v_Page_188.jpg
0c74b42287ecb6ebb63f4c9676511bbd
d98f3390e46eb70a17c9d9ac10f96a97197aa2c1
42979 F20101221_AAARGE subramanian_v_Page_260.pro
51c11e40a352d0891358685ab9e090cb
8f44e0f6543c99514f0d727a59434b433fc34798
14196 F20101221_AAARTQ subramanian_v_Page_059.QC.jpg
6fdb4b46c3f72d8c84644fb0f7fbb1cf
c61f4dbbcc1ffe893e9b4a2c9a1262142c3372a5
F20101221_AAAQWX subramanian_v_Page_178.tif
d61a8d341a5198363952206a6bb1a786
52378ee5b5e2edb01227ea1491cfc405349a5c60
F20101221_AAAPMT subramanian_v_Page_142.tif
c540d4f054edc84ce3516ebd2a0fe1f1
5b1bef8be97b1bee0593b97b5ef041dad88efae6
102995 F20101221_AAAQJM subramanian_v_Page_189.jpg
b4921b779280838c20853d66543923dd
99872aedc22fcaab435e20451a84a4a4e9797d06
59242 F20101221_AAARGF subramanian_v_Page_261.pro
b25583d18062008214500338fc84236f
d4646e9d97e24fa3a882a718889a6c80432a89d5
4064 F20101221_AAARTR subramanian_v_Page_059thm.jpg
cc766938465a7640283b9e6690a0eebd
6fb1cb5ced36421c03ca9372338ba7a7f6e7f485
F20101221_AAAQWY subramanian_v_Page_182.tif
a9adfc963a731b5f7487ff0b52366213
9bca42c6571a1309e38de635145bdbc174a153d7
37105 F20101221_AAAPMU subramanian_v_Page_179.QC.jpg
5b4d51a0010e886243603597cc854da9
81bae554a186d424a398f93e9c61d137b9a5f148
102761 F20101221_AAAQJN subramanian_v_Page_191.jpg
5f351b937aed26ef6f07755b16b8e256
af41cdad8934fdc4fe72ef9deacd71adc071a5dd
26972 F20101221_AAARGG subramanian_v_Page_262.pro
2f089fa928a099e01619d0a5e8a90ab9
3d7718c1fb2dc87ef80be32aa5bfcdea972678c6
19740 F20101221_AAASDA subramanian_v_Page_305.QC.jpg
6283161311ffe0200db584724795bf90
b54d7f33aa275327f77fb3b679b6e697ff7991f0
7709 F20101221_AAARTS subramanian_v_Page_061thm.jpg
9c3ee6f2ee943377e87119d2ef6f0e9e
6c812be94d91fe9f3c7c780321a1fb04727fa32b
F20101221_AAAQWZ subramanian_v_Page_183.tif
1e1c7b90f7a2aaba14bbdcc4f1c14f4e
6359a1c14cac686b673eb9a6fbe8288873ce3237
1030521 F20101221_AAAPMV subramanian_v_Page_235.jp2
6c6dbd617aed050b82c3ae3121b55862
0b7a1c039626691340db0fc098f7de2b3f3c044c
113269 F20101221_AAAQJO subramanian_v_Page_192.jpg
649633dca810f9dbb45f1766d4d429d9
f29de11612216c6141b02d4d908607a0eb70fe76
55228 F20101221_AAARGH subramanian_v_Page_263.pro
5029e94e427ae9b2fa9fb8c9e4c8d898
3684f5d530763ce150cf9308d536f015f4291b63
5197 F20101221_AAASDB subramanian_v_Page_305thm.jpg
f3dc7bb224b95e85b31d15999b99a94a
0f4b1d958ac05ed3665708f8440419a6c0a6d342
7930 F20101221_AAARTT subramanian_v_Page_062thm.jpg
d91e5e6302550dbdd2747318e35be2c4
9aca110dba804cd7eb6176b29212e217597d6325
614557 F20101221_AAAPMW subramanian_v_Page_249.jp2
2d9f08755bef1ea0a1bc350123a5e989
dcf9a471ebf7e514a6bcd36613c2f14ae4914f27
109649 F20101221_AAAQJP subramanian_v_Page_193.jpg
425d8825cafa29520da5e9ec7d1da93d
c86deeed4fa2704f290369db36f5719fec205c02
49265 F20101221_AAARGI subramanian_v_Page_267.pro
0babc95392a422cc0b64d39ae913fbe8
08df0605aa70cae95a6e1c9933300b40c07aea2e
15290 F20101221_AAASDC subramanian_v_Page_306.QC.jpg
c454d453e0926c2dd742e636a81335c1
c4515fb8e7eb6a1055aa9c443f832213a2da3e3d
6855 F20101221_AAARTU subramanian_v_Page_063thm.jpg
1fc29974657f89f0e4c10e1f3ce6a09c
260781b3cafa6ba5eee7ab646235f6d4bea86441
F20101221_AAAPMX subramanian_v_Page_085.tif
e8024c7df878f69ccf006964b3838026
2c13588efc3f140b90bd8a11c7caecf3f90f3eef
110139 F20101221_AAAQJQ subramanian_v_Page_194.jpg
80e1e2bd78d3eae7b241cf831af3d661
a3fe36b1f493445454c16d17a5c4522991fb587d
24465 F20101221_AAARGJ subramanian_v_Page_270.pro
02553d3cb3308ae01b13ad2319dffb35
818fb6d0422321b53e933fe58c0d3037ba489927
4245 F20101221_AAASDD subramanian_v_Page_307thm.jpg
9c017803dfdc5493897dacfb16ace603
311da5e8b905de5b7f02039227218b67008b8775
20577 F20101221_AAARTV subramanian_v_Page_065.QC.jpg
937cfffebce52183e85819beababec80
df8f242d728ef2bfae90c8db79f550b42d28c36a
82476 F20101221_AAAPMY subramanian_v_Page_069.jpg
192e770e48c9e617d477ae7ca1488e7c
01af6b25930af45676026f87378b191515fd2315
115190 F20101221_AAAQJR subramanian_v_Page_195.jpg
b04409463d85d4c678bfda624c4c5236
f2b312f00630a7219068896c14819d15853200c9
26707 F20101221_AAARGK subramanian_v_Page_271.pro
c47fb612e010e59898751f6c0780b4cd
a38685fc8bf5f7365a4332182a240582c8b15b03
3553 F20101221_AAASDE subramanian_v_Page_308thm.jpg
35f25e27c286875ce06f239021fba5c0
e4791152ab2e253109cfa633e6b1c03ea861ab57
34673 F20101221_AAARTW subramanian_v_Page_066.QC.jpg
2413f1a2672f727184c8cc61d6b2b3e6
29f64a2b5eb3089e7324fab561868bc55d08889a
95429 F20101221_AAAPMZ subramanian_v_Page_052.jpg
455ccfd2ef4f1bf5934bba87b53a10e2
cec0921291fead1e7751259f9c60db828040628d
96334 F20101221_AAAQJS subramanian_v_Page_196.jpg
ab7635ef046dad1dd726029faf5458a2
a13e75b6ae1b23fabda5ff3b580863f03b7019ba
16763 F20101221_AAARGL subramanian_v_Page_272.pro
fe988d353fed7cc3128840011bcf26b6
3af8678e694b9836b6505db9958292329de6c8fe
34939 F20101221_AAASDF subramanian_v_Page_309.QC.jpg
7f23c79276c5e0702e710f2e1e86074d
0f8fc76fe4154a78eef8c20de4b0f514ac8b5cee
9068 F20101221_AAARTX subramanian_v_Page_066thm.jpg
d02119b94afbdf2880278caf90d37c7c
a793e1670da4aed4a73018c1644f17720b8afeb6
70119 F20101221_AAAQJT subramanian_v_Page_198.jpg
39bb57c3c25013adcb68eb925234e873
5e60a8b2bd8c205326b6655170cc2468f41940df
17593 F20101221_AAARGM subramanian_v_Page_273.pro
e1af1d13b6ee0c6520214e6b8866c9c2
79125cec3c230fd16001c9335751d0ae56e5014e
36581 F20101221_AAASDG subramanian_v_Page_311.QC.jpg
bdbf06d3b91b23adf1d2397bb4dd7cd4
b4f3366992c2ad35bfd319b47327e1d332b214b4
31144 F20101221_AAARTY subramanian_v_Page_067.QC.jpg
4caaf951caea545ebf1e35dedb5e08d5
35456368ef4f9f83309e4fac42f801327d575264
78124 F20101221_AAAQJU subramanian_v_Page_200.jpg
6d849f99d3f245fb021d9bec7285d2c9
df780a02688134c4006fb9a60d6863dc39a354a5
14696 F20101221_AAARGN subramanian_v_Page_274.pro
5de451fbbe6306178d0248cfe31216f1
467bff095e7e35bc152e04d73082ed6c01ccd685
34977 F20101221_AAASDH subramanian_v_Page_312.QC.jpg
21e204de709068dddb8604bf57536147
d79855275cddb680c0e168ca30468d2d417973f1
24071 F20101221_AAARTZ subramanian_v_Page_070.QC.jpg
c8858d2447c2382b3e6eb979e9533dbe
1299700f3db9d46d45a66db199436f9e38d5619f
100912 F20101221_AAAQJV subramanian_v_Page_204.jpg
81c8b2f07582e0601f9d893376d17c23
50ca5370423a775d044283a5afadedfec38838fd
23643 F20101221_AAARGO subramanian_v_Page_276.pro
b1ac8355fedd32065a8b4413de7d3f5f
2e948ae04728bd0ff4337b8eece5c972c7eb561f
33039 F20101221_AAASDI subramanian_v_Page_313.QC.jpg
cc78623f1b13994b57f12431bebab124
c68e3cefa059095863ac9408ed5c325b164d0401
87147 F20101221_AAAQJW subramanian_v_Page_206.jpg
43248f3c56da1bbe1a40baa1d03cf91c
294388fe4ab0f028de07e075ad9653a32355dc98
25645 F20101221_AAARGP subramanian_v_Page_280.pro
bd0503e5eb3c41150ba8b78cf7de2703
c7333470fd7bbce8361ae14de93dcb8ae18c92a3
8010 F20101221_AAASDJ subramanian_v_Page_313thm.jpg
c87f28f6e83a376ec7c92a5e22442ac9
622e9453e9ad6cfcb8a0720304f0e66db0bb2dd6
86999 F20101221_AAAQJX subramanian_v_Page_208.jpg
3774741f1dedbea0ca5495a9b040c010
646a54ed0b53e9b36949b422d472ed9fb7edc4cb
23448 F20101221_AAARGQ subramanian_v_Page_282.pro
b05cd1fc76e1ecc4e4f64924243e269e
4e5c626893ddd8d3b6ba7d024aa0a3832e61cf62
83349 F20101221_AAAQJY subramanian_v_Page_209.jpg
1e0b79bb97ad52762d6f29d45e3410ba
c1e40286eb67b9a863e7037b04ddfeb51feea1b0
22410 F20101221_AAARGR subramanian_v_Page_283.pro
70351bf75ca875aee1da2fdebbe13004
cab715b2e3836cf04a1f531442d382be82f5f344
99752 F20101221_AAAQJZ subramanian_v_Page_210.jpg
969feaf3f67e3124cb4c4aeb486a1a4d
b7f93fa19e4d6db833b5a91f3974022ae557cf2e
28927 F20101221_AAARGS subramanian_v_Page_284.pro
e1e4b46c7846c504397ca7990575b081
962b7a9d0ed7feb65d9529626b7651fe7f7a7e5c
33002 F20101221_AAARGT subramanian_v_Page_286.pro
3b5880bc3ab346f2eba49ce26b6b90c5
c396904a72d814a0ad0c22574368d75d9a6c1a01
F20101221_AAAPSA subramanian_v_Page_033.tif
394b50a97765b71ec62e2077e38f5ee7
8fa83a5f4995ffdb59c7ce58b91baca72c332451
29367 F20101221_AAARGU subramanian_v_Page_287.pro
73e872a4f9650f2925dbb5eff5e8f208
d9337fee487598e5aa1647071c5d05178c06f81b
F20101221_AAAPSB subramanian_v_Page_132.txt
83f34f9d9217b525f6bd1fa2e38ad710
64e0f4c79a948da04a51be185968eaa9fbe328b5
23158 F20101221_AAARGV subramanian_v_Page_289.pro
c8289cb28999910ab1e3d2ede7a5596e
d2940b4b1eb4acfa8ac2df3ee827830f6f74393f
2456 F20101221_AAAPSC subramanian_v_Page_312.txt
c01e08e90476220157816d33df499d35
adc18b16c80fa3df988173244b8e928f8354f4b2
20765 F20101221_AAARGW subramanian_v_Page_292.pro
aa2a9080788fb1d22f074e1112edc7da
3692f54f115163fc770b7804ad83734bd8746e09
29863 F20101221_AAARZA subramanian_v_Page_199.QC.jpg
0592e5a5d1a480b68bcc7b6729cede0d
63a44b4e1b854c74d81c44e69066f91f7cf63615
F20101221_AAAPSD subramanian_v_Page_057.tif
6b8ab468b597e0ac0d24aecb2be5ef50
bbcb6666435dfb348366764be5091b4e9b0a96ed
29996 F20101221_AAARGX subramanian_v_Page_295.pro
882ff42af7fb0fe5f8a2d89d9c692544
f698a1e03a209adc2d034c33c36154493a22da1d
6703 F20101221_AAARZB subramanian_v_Page_200thm.jpg
b82d83cfd7ae245af95cf9802d6488ee
25d6035b0afef1abbad713d87fafb27c2583c2f0
F20101221_AAAPSE subramanian_v_Page_313.tif
649ef7b17d1c7c766f7f7a6c061110b9
6ffae3d31c6518bb5735cfa59be8542d9676f904
24412 F20101221_AAARGY subramanian_v_Page_296.pro
22fa7abd186502289c74f6cdfd074336
25a184aa58b0e13c564b22f6b1e7343a92ca1999
35114 F20101221_AAARZC subramanian_v_Page_203.QC.jpg
c2823da14c60e60ac8daecfe56c1007e
93263ad290c1ef59e64f9d467ea431a4bcd941fe
36665 F20101221_AAAPSF subramanian_v_Page_172.QC.jpg
8a6d1a011a1f536694a3864fbb1777ee
fcb6379124faf14029674d859b02f02df8a47081
32845 F20101221_AAARZD subramanian_v_Page_204.QC.jpg
3c518d810d6fbc2615968272a06851fa
0ca1809363bffeeecc081efc76fdb92bd38352eb
7947 F20101221_AAAPSG subramanian_v_Page_044thm.jpg
d1ffa6eccab10cbc7c8f57c8d414c195
e8080fcfba2a50e77381508e57f5440b4cedaf68
32929 F20101221_AAARGZ subramanian_v_Page_299.pro
4fb961b67aa41560c0331e8564c2af05
06cc7ff1cdaeb3d7c4b7ee5fcff78ad0fee2eda3
8714 F20101221_AAARZE subramanian_v_Page_204thm.jpg
96d64718019296794b76d0e90b0b4aaa
3188a947fba1e083d05b39b0070c37813f74131e
1051905 F20101221_AAAQPA subramanian_v_Page_134.jp2
92e7f056456f009580057f6ee7ddd7e5
3569e99c49dfddeba306097396179231e1ea7852
30240 F20101221_AAARZF subramanian_v_Page_205.QC.jpg
f19a17326edcaac309369773bcd0378a
650ecea55364378381ee978a77b5f3e550b0cc8c
24858 F20101221_AAAPSH subramanian_v_Page_151.QC.jpg
98df210c3ccd7f2808db32e7eb3edd24
b3b9629bf49190b727f7635300a8794cf1fe5647
648860 F20101221_AAAQPB subramanian_v_Page_135.jp2
63821fdb491c4cd22b5a2c0d9e032212
35cad56189586a0c0dcb97bd94e8b7df6a435d7d
8058 F20101221_AAARZG subramanian_v_Page_207thm.jpg
6382d1136aa6858e7e65e08a8dc49d4d
444c55b971d5be42bee89af08800f8c988d5d00c
F20101221_AAAPSI subramanian_v_Page_229.jp2
F20101221_AAAQUL subramanian_v_Page_069.tif
fda191210dd7b5e9927acc6139a6ef09
2552cb8ca4e8ead38fdb8b372b024b5cd3fa9672
13747 F20101221_AAAPKH subramanian_v_Page_092.pro
59139c7359301e47c4dc2dbcbf7002f8
cbdf990a963f9ac34037acc953693cf5d95ea16a
63575 F20101221_AAAQHA subramanian_v_Page_083.jpg
4f4c9fb01cca6ccc45fed898f48a4a8d
7f259f6309f1bf9f9f05346a33b88dde7f0c9e4c
22885 F20101221_AAAPXS subramanian_v_Page_288.pro
de609aa367c0acd0328d2e89bf2f4364
de43853682b614a3161ddb2fb00efd7f314fd429
15678 F20101221_AAARRF subramanian_v_Page_277.QC.jpg
e901c8d9ed67e86303a35e24947eefaf
944616b432ae24aff659ee5b53a343f85f13d8a7
F20101221_AAAQUM subramanian_v_Page_070.tif
225ef1f9e0c1e2116a84cb23e1b6fb8e
c5e31d541908e2183c9f23801bd8aca833b11fbf
32693 F20101221_AAAPKI subramanian_v_Page_056.QC.jpg
21c9f01cdfdb159b47303c6432f5336b
f86f1e10f7c0c60b34af4f4983df30cfa1fd2d76
88968 F20101221_AAAQHB subramanian_v_Page_084.jpg
4c69de296ec5376a076cb92ac7fa609b
258611e8dcbe8828c8b97c569e797be7b4708686
66145 F20101221_AAAPXT subramanian_v_Page_299.jpg
c00bcb8704cab9af22419de0db8e8f4c
371ac63e42a6dcd4ff25c4da9d3209f4b90b37eb
6594 F20101221_AAARRG subramanian_v_Page_089thm.jpg
eb08f7a36038f6b1d042361e7a54e966
8b56435e6fcc8c671ff05188b3feba8ece646491
F20101221_AAAQUN subramanian_v_Page_071.tif
70182194563c0d35855a2fbb95c0234b
44c25547c1469b68b0a25a8969ffc44546b4c91d
F20101221_AAAPKJ subramanian_v_Page_147.tif
3056f11908bb513cadf914d9854f6c93
3f90328c695e3ec7a1650bf136f367b33fffb220
102263 F20101221_AAAQHC subramanian_v_Page_085.jpg
a88b8aa669fbccfa0018498cf679e8e3
aa5cb78ca26f4966665c173aaa5cc938a6aa50ac
1051892 F20101221_AAAPXU subramanian_v_Page_238.jp2
8130d20742bbc9e784929d7c944072b1
474bbb6837fc76fc3330945ce39a9d803b3b664b
7981 F20101221_AAARRH subramanian_v_Page_205thm.jpg
e56427e3e6aded806ca2a38beb364091
44ac9822ef2968518f21f053f9addb03ba46edc0
F20101221_AAAQUO subramanian_v_Page_072.tif
60ff9052c4e4383d9f41b31730c049b9
9b1e542f5acb1308a716a8a3c01f43aafbdc430e
29096 F20101221_AAAPKK subramanian_v_Page_095.pro
d2ec9d9fd32ee069446fce2a37bd502d
dcf873fa793f2367e1d6309181ee0b76af934ce1
91345 F20101221_AAAQHD subramanian_v_Page_086.jpg
33e1a2044ed68697f2b6b15c6c3ca6b3
72800614e4f6e078f7461f8dc3a3ebbc27b1ef3f
623978 F20101221_AAAPXV subramanian_v_Page_222.jp2
b6f6c34ca1fa44a031878845846af33c
c9345fad3dfbfd5e4293323bcb9d24c0785939a4
12886 F20101221_AAARRI subramanian_v_Page_009.QC.jpg
2941da0af8045f8d94f54c3664aa4683
7539f68fc7f7746b3c8dbb2c61dd7ad1d9517dd5
F20101221_AAAQUP subramanian_v_Page_074.tif
44a73f4f5d9a2509c1999b87ed46e1b9
c3e08eb90f0fc4809b763bdc6662177588ca29f9
68287 F20101221_AAAPKL subramanian_v_Page_040.jpg
8c4302245c2e9e18f06ff03deb4bd976
6cd2d5158a23a29d2e75775717fc9f02dc4bf173
94644 F20101221_AAAQHE subramanian_v_Page_087.jpg
142aa4f303332725259ed7a6bc439b15
0fd64c61204828d05fd2043abdbd01da2fae51f6
53150 F20101221_AAAPXW subramanian_v_Page_264.pro
2b4db6ab5d3ba083a73acedee04de6dd
47b3087535d3ca22f7bc160092383b5f06f085ba
33838 F20101221_AAARRJ subramanian_v_Page_146.QC.jpg
9ee53167ff2c731be01e539d210c095d
5aa9f3f7a696ead3923594261a98be4a15e36160
F20101221_AAAQUQ subramanian_v_Page_077.tif
481a7474f30efad586477f449b8a494c
2d3190eaabe8a9d1a5133197bb9b7c46ddd127d8
F20101221_AAAPKM subramanian_v_Page_309.jp2
7089ee0209cc170f9568d7b8b945d6f1
4d69ebf511fb41a8fd7faa9799af201d2807937f
75555 F20101221_AAAQHF subramanian_v_Page_088.jpg
1fb9f89a72d5921bf48fc3d4b92093eb
309a0d5f294f81856fcc65db1404848b2a9dd066
F20101221_AAAPXX subramanian_v_Page_141.tif
12aec76d1fa3e27a39257ffaf780410b
7cb4a39d0099d7b9fb29854b0293ff1771335f1b
34882 F20101221_AAARRK subramanian_v_Page_162.QC.jpg
12a2dfea2e7637d30f5f17539d10507d
1dd21b0e1bc14bcf547db26d99e841589465f690
F20101221_AAAQUR subramanian_v_Page_078.tif
07dddc868d25a004a132c60653d5fee1
515a8c2a902b8c1d5987fe7b56a99ba3e48f0347
126816 F20101221_AAAPKN subramanian_v_Page_310.jpg
51276698fd5962f7b70b40b9c74b1b6b
0f5026d523731548003491902696a737c3f16ee4
68533 F20101221_AAAQHG subramanian_v_Page_089.jpg
545a7ded03f50ec3999bab30baf79dd8
aab5419ad2231380787acdcca11bc2ec0cf69cdb
1832 F20101221_AAAPXY subramanian_v_Page_199.txt
dc5d1cf4b55b1b20dc6cfa3a1f965043
ce3ee559459f024c654d5774b2a9872888351471
8573 F20101221_AAARRL subramanian_v_Page_235thm.jpg
bcabb6b0b8ef57e0ed664699ac272467
e38811e45e30df42ebebd217be8d708a4714c042
23338 F20101221_AAAREA subramanian_v_Page_176.pro
6ec01cc539581b7cb4135918c84656dd
174737ca375f75d377616edc1da8c3bfc2fd5ce2
F20101221_AAAQUS subramanian_v_Page_079.tif
27c32a685a0251b9460c1c5e99c0e659
3e044183774af78c13a310b0260f6761a259b2ec
16043 F20101221_AAAPKO subramanian_v_Page_058.pro
9314c36efe4a8e10601d20c2d9be791f
d98fa71ff1027ee413720d149d7f3a21cc22b5c4
77732 F20101221_AAAQHH subramanian_v_Page_090.jpg
b58b5780aa255d34935e5417162216e3
a245d3ac1137d364f0da476b104857a9a407f594
F20101221_AAAPXZ subramanian_v_Page_130.txt
3854fa43b8eaaa71c520227e886faa1d
40bfe9c93168f864312137fdda77076fe9a27958
9009 F20101221_AAARRM subramanian_v_Page_172thm.jpg
69d03976aea8b8936a8d58250a38feda
d385d815dd8c6f6def1bc95c734aa0a10a41a068
37431 F20101221_AAAREB subramanian_v_Page_177.pro
299a47f76b8fb9af1d18e2a3425e788e
8f0a3b749ca838b662cb443543cc54bd47e3366f
F20101221_AAAQUT subramanian_v_Page_080.tif
721ebebb1ea30c8d10a5c0e4eeb071ce
9c4507013ad62a7f3ad0eb77d12f1bc1aa347a4e
7542 F20101221_AAAPKP subramanian_v_Page_243thm.jpg
8e62b68d0428e50b4831dd3777f0da39
b25f333f9cf57dfb6aba0ea08b349891d1f124e2
56281 F20101221_AAAQHI subramanian_v_Page_091.jpg
7750b419d1378be8ba1665b5bd193024
2ba082a2ddc9c9590ca78a8b0f83cc11b22dbc3e
476711 F20101221_AAARRN UFE0022364_00001.xml FULL
5e41e07fef418bbacba2ad3ebc574483
161ade331dd23803a058e2d63d332ab5893b5213
45793 F20101221_AAAREC subramanian_v_Page_179.pro
9abf85bba37c0fe4f945292a9289768d
4ce015ea8640dccb3d469b8b32e0cb57356b0959
F20101221_AAAQUU subramanian_v_Page_082.tif
cda7f0bc7a480cdf03fac2e1ecbc223d
dcdfb1f76fa32d207a6c1236f0d503a33d42875b
921915 F20101221_AAAPKQ subramanian_v_Page_111.jp2
f9180e5329adf5e714269a8005339c78
8315d9bffa67bf0fb08dddf1247ae7f383b876d3
112933 F20101221_AAAQHJ subramanian_v_Page_093.jpg
d109ab51fc0664e90801aac31e3485d0
db7a3049892f4acbb168ff3ed7d8636b4f7a0d6d
F20101221_AAARRO subramanian_v_Page_002thm.jpg
6ee1ff39e12b5240714530c936ab2e10
23e0569fe2512497b81ea5caa267f47f03d081fb
4011 F20101221_AAARED subramanian_v_Page_180.pro
ffdd57157c120932eb040bcc683ffeaf
ae169c436a50be8037ac766790cfa1b52a95a3c1
F20101221_AAAQUV subramanian_v_Page_087.tif
3b3e8216897608d3c1d77496bc301ccc
d2734e09f217c8abeea94b9c7b569316a8d5416e
F20101221_AAAPKR subramanian_v_Page_026.tif
870956c65fc3dd3a6b0f6beebba70ecf
acc7a2fbf0588d1f9c59457c87d3510817efbb3b
113294 F20101221_AAAQHK subramanian_v_Page_094.jpg
31e443a0d2955ef44ecfa2ec4b0b5d44
2d394079a894994b791ce63c34333d11263b78d2
26332 F20101221_AAARRP subramanian_v_Page_005.QC.jpg
1dd33285b1c2867aa0d41698c2774482
5ba517be65fabfeccdc7c6d3ef94958d95403318
27692 F20101221_AAAREE subramanian_v_Page_181.pro
e33a4fe5455e74fe73ffb7f2849f65fa
6c6f6361da53f288b755a388e02fdedc2df23703
F20101221_AAAQUW subramanian_v_Page_089.tif
6caabe47217e1d68fd229ee581badcca
e0c7c3f290a7681cdadf005314e6d78c790768fe
31909 F20101221_AAAPKS subramanian_v_Page_062.pro
69781300b0291c7b4c82c9af3dca630a
9cc42d758db202e2092bec92bbf10b4af8c69a53
90834 F20101221_AAAQHL subramanian_v_Page_095.jpg
6f63192ff5447bff2ded0d2123703375
34af1da177296fbacd6365bae373b379c214f8b4
6146 F20101221_AAARRQ subramanian_v_Page_005thm.jpg
c2714f8cfc5741e249586e2522e0f7f8
8c237516e375c818d46bd65b38a5c344fc2f9fac
40946 F20101221_AAAREF subramanian_v_Page_182.pro
64867ad169defeac2f727fdf2a83ed63
49e648d712d78f828fb65c31ee5a2df58b3a0353
F20101221_AAAQUX subramanian_v_Page_091.tif
e49aa15287ff1586a92a29989a27bd85
838629438b9480f07596475b26bae22e2f5e3c6b
32897 F20101221_AAAPKT subramanian_v_Page_073.pro
f03f4f86c44fac04aba4e0cd9a5e108a
071c2a92f03a93d34c62712b8628994c212eb975
103314 F20101221_AAAQHM subramanian_v_Page_096.jpg
f98340112faf9f7b5d2f0f5eea2a0367
443ab3f6459a1ddac63d3ee0b90210644697e50f
32408 F20101221_AAARRR subramanian_v_Page_006.QC.jpg
49c7b5e4de1986f96c62c2ecee443d29
56a7af74af509d47407b85f2ad1b97c590d98a5e
22919 F20101221_AAAREG subramanian_v_Page_183.pro
0fab86e9f22edb74be5be9a0ddaf0945
d6167f3b9893e97b6b018f601fbb8ea903bec503
F20101221_AAAQUY subramanian_v_Page_093.tif
bed7e286531bf5379d3aff0fdfcbc633
e35fe0f32f3e43bf14cf2fd3cc91fa35e4d27727
F20101221_AAAPKU subramanian_v_Page_084.tif
aa161767977bcae89b556b2ed09f2031
3d766f6c3a2c666cc822c4c1ae70e4ec6febda91
120009 F20101221_AAAQHN subramanian_v_Page_097.jpg
0ff27e1c4f14b7b3b767bac53b90b46e
257e626e80b14e4c07d7713bae00e0682296b765
8701 F20101221_AAASBA subramanian_v_Page_250thm.jpg
341d77195a5148f4bc551b5dc659332c
ab5658275c4b5e0548715edb917e3453d31fbac6
7292 F20101221_AAARRS subramanian_v_Page_006thm.jpg
5e10366b7b3f985ab5fc4241fcd84581
e1ddce9d9097c78793ffa5e09452dd37223911ae
40144 F20101221_AAAREH subramanian_v_Page_184.pro
6be4b884be1d34ce353babd076f74438
5288bd61562e20d7584efc9031d400ae32b7d0f4
F20101221_AAAQUZ subramanian_v_Page_095.tif
437a5b54cc3afd0cc5c013d6e842eb30
b8427e1fda67a2c03f26ba3c9922fae5c37cc5cd
986620 F20101221_AAAPKV subramanian_v_Page_166.jp2
9f3b98aa9c6ad7b366eaf1732aec8faa
73e9aa3cb429ae269a60ce08f17143e9d08e117a
74575 F20101221_AAAQHO subramanian_v_Page_098.jpg
f1e863abd9904ed13d251215d3193362
c7e8b5fe249ee30c6acb31e600537d2d729df65a
5527 F20101221_AAASBB subramanian_v_Page_251thm.jpg
d2a7550d93afa58a58d19ed2fb5d886e
dd416301d26299eb51e4917e9071fb9e4aca1b2d
2926 F20101221_AAARRT subramanian_v_Page_009thm.jpg
8538f0a5c937c0bf111a0a6c01eeb63e
857035c282cf567880ffcb53b3ee3b0f6b77f486
27912 F20101221_AAAREI subramanian_v_Page_188.pro
3a322d275e04387db5cfff770af03c79
c3af6fe8d809d0f7b5e22590f5ee2d31c7e91e96
23387 F20101221_AAAPKW subramanian_v_Page_068.QC.jpg
9fcd5105678b4be0528a230ab7d19394
a92dc0ee61096dd13aca4811fff06f1ccb8e496a
54641 F20101221_AAAQHP subramanian_v_Page_100.jpg
0b49d70365111ea88acb3f365197d1a8
fbdf7b0f565b83726c07191c95cb5b73fca490ad
27169 F20101221_AAASBC subramanian_v_Page_252.QC.jpg
9ef937f57b7532c3a659550a123508db
4087128ecae6c169b1c5f3a3ecb44e8e2ac59ceb
27130 F20101221_AAARRU subramanian_v_Page_010.QC.jpg
8ed0ad77d8b20ee3cbe3192d505b23d2
61c6c28b7db2e0904705f8f69f70fc9ec115906e
49386 F20101221_AAAREJ subramanian_v_Page_189.pro
443bc6789a3cefa0affa0c3999dcd66b
e4d2aef4a00a13b04b7c384db49cede9cff51b51
F20101221_AAAPKX subramanian_v_Page_133.tif
d3dd4b0525ab6ae1c1aec58af45653e6
92b537aab02ffca049ca190adeb9dda834b13877
75323 F20101221_AAAQHQ subramanian_v_Page_102.jpg
c6d84611043d05095edf4f02a4e02f3e
144421e748f3a087cb1472c09bb5596eb0f15276
7084 F20101221_AAASBD subramanian_v_Page_252thm.jpg
e9c1db392ba847387d567e25f1951648
0d3261a476eda0d3c2f54e16b9a6dacfa664bf47
6462 F20101221_AAARRV subramanian_v_Page_010thm.jpg
ca6816952bdcfc0216379c17cbda5538
a2a6a2206143674d3e726fbd68e8efda18f9959a
42444 F20101221_AAAREK subramanian_v_Page_190.pro
667f3ccb8226b1c6319e533409b41c9d
1f7164b037c4dea27f150c819d27567ce7eb7193
9115 F20101221_AAAPKY subramanian_v_Page_309thm.jpg
bd0016795a707fb4fbef6c0cea9c3a40
46ac044a5981409473b049829075795879f4cc43
102037 F20101221_AAAQHR subramanian_v_Page_103.jpg
e5c0bf6fad50c1a932f8970006e65b8e
c1b386629486f986188c2aced81fd9e627d7b998
33087 F20101221_AAASBE subramanian_v_Page_253.QC.jpg
1c23d1540b0c05dc8a66f76b53b926c6
77b53150d2b6a4419f01dd25ce2758fac6ed00e2
7907 F20101221_AAARRW subramanian_v_Page_011thm.jpg
2620fced995d1869401eaf63f5afb2c1
8b3fe6ffc6102004b5f99b3f99274a2ed09d210a
50048 F20101221_AAAREL subramanian_v_Page_191.pro
61b4ab43566df1cac94bed856027cb6c
05f60afa2a686033d2179a4067379b29a510d90b
34159 F20101221_AAAPKZ subramanian_v_Page_044.QC.jpg
0f0bcd349bb0a55c0582e9752facaca0
200c72e7e7317eebe2bf8b9ef02b8279cb183c22
38290 F20101221_AAAQHS subramanian_v_Page_104.jpg
f2d0fc8ef1414b866bcf1ec6bb6b440b
526a2841016d85549503fdf6e9448a644ea07361
8581 F20101221_AAASBF subramanian_v_Page_253thm.jpg
9f57491c74f37b5e42a96aacf6a7d4b5
5c36daaba68a9df2d777a30c7655b86e99aa0263
32396 F20101221_AAARRX subramanian_v_Page_013.QC.jpg
9b7809315de71bc074ee10fb1a8a4a19
f5eea9a108e76d93fe99f04fdb79be3f7ac942b8
54443 F20101221_AAAREM subramanian_v_Page_193.pro
a5019d5ab14d38a63eded6aac7197ed8
7efdc68a740f36255088d1c3d0880aa3170a6dc4
94399 F20101221_AAAQHT subramanian_v_Page_105.jpg
5718f8b635e4306372cd6222fe6ec1dc
3c638bda8a9b34c5b49e3c9942f94db7af9d41f9
35234 F20101221_AAASBG subramanian_v_Page_258.QC.jpg
d05ad411d9bb56e03119d284a4052be8
6e3aea452626f01b4609f66d2d1cab553f1f2401
35284 F20101221_AAARRY subramanian_v_Page_015.QC.jpg
535db6667dba594f5e5ac720580822a4
103d14273290c6da4e3ba1128177c3d2c41f39d1
54500 F20101221_AAAREN subramanian_v_Page_194.pro
d9866e179fcd05ae2ce89479924276b7
d24a8607761e7f2e9e7214aab68f6e368ccec63b
112168 F20101221_AAAQHU subramanian_v_Page_106.jpg
620c2a341751cfd873b54d2dff99531c
c4f411a9ed3b4b90fbdd7983753eb05f032a802f
8843 F20101221_AAASBH subramanian_v_Page_259thm.jpg
5594492e9118fedb3c762aa3a1917fcd
d8b196e65510287dc19b35c63a93ec02c7c747b5
2759 F20101221_AAARRZ subramanian_v_Page_017thm.jpg
aba7affacc541f69482238930882b11e
62865f455575a4d16a169fadccfb30a97d158b73
47107 F20101221_AAAREO subramanian_v_Page_196.pro
f308ae8d5e8c1e7e960c28f27623655f
4752ffaf6f7bc9c0e340b97d55abf43ad0b38441
82889 F20101221_AAAQHV subramanian_v_Page_108.jpg
4f74e73815a93005579251b9b327244e
07187c1c2566ca371cd635a79a78df6f76cdd21e
31421 F20101221_AAASBI subramanian_v_Page_260.QC.jpg
a26a33ea3d916efed7f0cddf686349fb
4e5385d25a7c41b669cf8fd52f021430a77e6b7c
35962 F20101221_AAAREP subramanian_v_Page_200.pro
841d86e41913dd994751a2d707185143
6131ef9907cbc3aa722e9e513fc2a3d107ce0e14
90664 F20101221_AAAQHW subramanian_v_Page_109.jpg
668d79bd690f1a99555f416de771758e
0e2b064a89dfb8d86238735856aa0ca1a85debd5
38189 F20101221_AAASBJ subramanian_v_Page_261.QC.jpg
0f5904d592c6e4c6e026b8e4c57fcaba
0f959f00a6dc9f5758df75a87356368954fca1ac
44662 F20101221_AAAREQ subramanian_v_Page_201.pro
8d97375dc2a75f86814c210103929501
2dbb4705aacdfbbfb15a3765ba31b1399eb2e110
96464 F20101221_AAAQHX subramanian_v_Page_113.jpg
51651273a897cb240afbb6f066f8ec78
d44a18458e220f03fb447ada08ebb4e20bf66d72
9319 F20101221_AAASBK subramanian_v_Page_261thm.jpg
cca9cfaec666973e52aecd7e541c1a0e
b1f994cc694cb3929ebe18d11042236ce98378bd
43069 F20101221_AAARER subramanian_v_Page_202.pro
d17a6b812cec7db64f6df8aa76288a78
bbd1b09ebb81d9496fc6212298104f226c4acad1
99657 F20101221_AAAQHY subramanian_v_Page_114.jpg
e7ee8e6a1c52e398ce2a3f3f8c1e4c73
0f076c812c5c161136a188d95bf0e742659ac0c7
18670 F20101221_AAASBL subramanian_v_Page_262.QC.jpg
7060ea15aeab9c2a1c11f3c4c3cc0dc5
a6e28c14a91964ba4f8ccbb16ae2043679cf57a4
51700 F20101221_AAARES subramanian_v_Page_203.pro
ddef9448e83cb4afeaf844fd16498bb2
f7cc6de3ff107f65a2e3dcf3af0234e50a616337
112682 F20101221_AAAQHZ subramanian_v_Page_115.jpg
b1ea6e6d421115f3d1d8a374413baf6c
646fa5883cd9f09a461c3a87cce5f5e20aa959a5
4569 F20101221_AAASBM subramanian_v_Page_262thm.jpg
e530e490a7d6d004260b7b07b0f03184
64635b916e17783d2291844a81b726480e10ac2f
48305 F20101221_AAARET subramanian_v_Page_204.pro
4281b12da342e8dc065361df251b8a0c
d533979c71f7435ac573b2936ed34bac12239a73
8782 F20101221_AAASBN subramanian_v_Page_263thm.jpg
8d483b016b8a6860deacb1708c48502a
94cd34c77adab78dfa01d00b4a411c87058da75e
40271 F20101221_AAAREU subramanian_v_Page_206.pro
19ae50ede0c1a099e32462e4a1a06032
f931c2fa40884b9597ba77a3824232e11dbc6148
8553 F20101221_AAAPQA subramanian_v_Page_023thm.jpg
b4de51fb01107b6acb48e684688913f1
e360ffebcbf800c89c7dec01d55d31fa0d07e1da
8675 F20101221_AAASBO subramanian_v_Page_264thm.jpg
9bd8a4cd6db7c69dbec13217413e102f
f326cb457e71ebf64d741a4e0520b1b860f3c645
48416 F20101221_AAAREV subramanian_v_Page_207.pro
d8dd5f9c578ee4ce19e149c5b6bfc016
62a0246d191d38f8109135289891925c24dc16e8
36359 F20101221_AAAPQB subramanian_v_Page_263.QC.jpg
829da32e66c04695321f9f79c2964e8f
40ed4fa4c47ccc8fe2f33fb3d0656ec140e0b7c0
32822 F20101221_AAASBP subramanian_v_Page_265.QC.jpg
1cb350b8e85a676cd3c3940477c8906a
fd9035366da18be252a56e1ccb58e0ac61cab5e1
57310 F20101221_AAAPQC subramanian_v_Page_160.jpg
c89065ebeec10cf3bfe9fcc20579f95f
dac7ce7977ee12c748860af925c78c01227ca916
42708 F20101221_AAAREW subramanian_v_Page_208.pro
bc06da7b1d4b731d901ca8a2f93f8651
495acc5c3b1f717a1db00e7148e4500133c6eb4b
8398 F20101221_AAASBQ subramanian_v_Page_265thm.jpg
e742a34eeda61a0697a2725fd7561940
116d588507e1bc16f8d30cee6efc37123d6b21ae
7972 F20101221_AAARXA subramanian_v_Page_141thm.jpg
39253f1d57914f8284132d36be57c24b
bf62d5ca7fd94dd8187816df58aef1afb78c111a
20655 F20101221_AAAPQD subramanian_v_Page_083.QC.jpg
db2b050f451343ed1e759607477c8fd1
263b3f14d9f5b8e41e0c1aa655933e9ccafc380a
8297 F20101221_AAASBR subramanian_v_Page_266thm.jpg
b2416d9112dec0cf90c4dd48d8836bbb
45b8d8cdf7ef5df79e245ff202678a471dca3ead
7419 F20101221_AAARXB subramanian_v_Page_144thm.jpg
9c3f58788366295501c3c77338f83bff
f5f325912033ad7205d992d4cedd3abe285bf35b
101787 F20101221_AAAPQE subramanian_v_Page_229.jpg
aa8f959171850d7f2f460d20edeb70c0
9aa31da0d2c68a09725bcbb08a22a53efa97f34f
41278 F20101221_AAAREX subramanian_v_Page_210.pro
72c1474b9a06d36eae1f5bd889e85e92
05033a4909ce615c211ebcf4a3285f2392fbedc8
33203 F20101221_AAASBS subramanian_v_Page_267.QC.jpg
bcd086c5d53a6b79941e5323cb7f4b9a
f17cb1135164ef4b6ce7ca0f1ffb7f0780464a3c
8883 F20101221_AAARXC subramanian_v_Page_146thm.jpg
d1d3b99d6ce80d2a2323f2f2ed3872fe
538fa030f8635f1c5ffaf8eeadea3091482b4254
47422 F20101221_AAAREY subramanian_v_Page_213.pro
f4781a2ef04d6e0a3b6aa6ffd4bdc43e
f51bcff1a524e35855c24770badec92d7da4aca6
37308 F20101221_AAARXD subramanian_v_Page_147.QC.jpg
9fb895ad58d73ed4dcfd62e76fc0bcd7
ef38cd0041a70f3af59901fa7d9bd938bcf96ac9
8835 F20101221_AAAPQF subramanian_v_Page_312thm.jpg
e8b1715a80b4cc01d3bf26bc701f26e0
d1c791d6587198471d1a28ec9a73fc1407529a24
16777 F20101221_AAAREZ subramanian_v_Page_214.pro
23420aafcc2f5008f8b4df8aba399652
2f659f77acacaec90421ee5fa4bb1d97c14e52c7
8833 F20101221_AAASBT subramanian_v_Page_268thm.jpg
a3fab424b96e19f25e26cf7a0e3c1942
900e3214574c07e76bde2199a9b8266f6e571ecc
36171 F20101221_AAARXE subramanian_v_Page_149.QC.jpg
fd070f14d2414eb703e92fded325d58e
f9691f0643f8b36933b6a8d7397053209d7bb7e9
548911 F20101221_AAAPQG subramanian_v_Page_289.jp2
945f784fa4c2e168871f25225417007a
ae2ae1234fa83654d66fb848fbfcafcd03275823
1051963 F20101221_AAAQNA subramanian_v_Page_044.jp2
a0d796f6e8bd1ce134ac8ae9cf72fd4b
87b373c98da7372130c2b9806dd1eac172b83860
4459 F20101221_AAASBU subramanian_v_Page_271thm.jpg
9224052f39085f8d2207a934d511ef0f
71ba6cee22259e65a185a3fdc4b0dc955df554ee
9173 F20101221_AAARXF subramanian_v_Page_149thm.jpg
8ebafb2709a8353225083104f9d3cff9
8c35a744f1489e873ee56ded779112247087610a
8918 F20101221_AAAPQH subramanian_v_Page_192thm.jpg
2c1d415d1f7f175f7d8471ed83242104
b46b7c0fa59d05b6dd7685e01657a9a3d1c2bf76
3815 F20101221_AAASBV subramanian_v_Page_272thm.jpg
01f1a62a96f5687af555f6bdd3c98397
40aeb79290d7351bac3ccd08a61acc8682947791
22185 F20101221_AAARXG subramanian_v_Page_150.QC.jpg
08948d647ae347f8cd0951f50b44e54e
5a46ccefca8358fb791548108c2683264ad82b82
8950 F20101221_AAAPQI subramanian_v_Page_237thm.jpg
4ba1b7292ef430647c74bccf350d87a9
dd3d2a5f1ffff59f1ae11340e1b1433744b76425
F20101221_AAAQNB subramanian_v_Page_045.jp2
56297a1e8767c9d13a98064ce5438ac9
6f60587f2c8914b5c9b84ec9d7016e7e3682e458
10231 F20101221_AAASBW subramanian_v_Page_274.QC.jpg
921e0e1ddc67622e4be3cb4307e1297c
35ffc4cc38cbc6aec0603007c5e95a1b3ddc78d0
5665 F20101221_AAARXH subramanian_v_Page_150thm.jpg
adced808dca37a54e4890c9c7e889f67
5e90a9bc8ac4f536cea76eeceda1eaac457a155f
6294 F20101221_AAAPQJ subramanian_v_Page_065thm.jpg
ee1643af87a142e00c68b70ea2c0e92f
498cbf777f13f04a0cb84226cad58c9181aa46e6
1051934 F20101221_AAAQNC subramanian_v_Page_046.jp2
5d3f9e1e3ed00aa53dc320beba99daee
71c0986741be0c175a65a91044ecbd9fd8eec9a3
1373 F20101221_AAASBX subramanian_v_Page_275thm.jpg
6e95ba492d983bfe8cdea338929b85e8
f063ba07bc85b8307c9f08161cd0b653cd8758cd
6748 F20101221_AAARXI subramanian_v_Page_151thm.jpg
6a23d7af3b6a98da2c73924bab0db0c9
62cb617b9177238775e350fcb424342909b54f62
F20101221_AAAPQK subramanian_v_Page_270.tif
73e5a43e94f4736bd0cf3e3f43c9855a
72fe03251099e3ba689406a96dff98f059dcbffe
F20101221_AAAQND subramanian_v_Page_047.jp2
4689adc39b29a27707be29deca8127b3
92a99c13cdb5428faf5e10e95ea921f9db03c8cd
4276 F20101221_AAASBY subramanian_v_Page_277thm.jpg
69734039359898e1275b97c33db70ae7
db16c2e7f14419c179faa07d1d0b29a6e348d4fc
6471 F20101221_AAARXJ subramanian_v_Page_154thm.jpg
a5797ecac75efcb41eefd7365a967208
e4807336a9c301879be140761a8d53909c33156b
23751 F20101221_AAAPQL subramanian_v_Page_133.QC.jpg
0a9e6a1b9dd15768b61cbfc1ba55c1df
014c753a8db50f2a8b8e25d3d491bd645b773966
774235 F20101221_AAAQNE subramanian_v_Page_048.jp2
b29d38bd01ff765b1f060614c9213ce6
4c9ec237f91813e1eb374811dbb0e3f677bc738e
34000 F20101221_AAAPDA subramanian_v_Page_228.QC.jpg
b35091c0f654ab3f1beaab5b43d77005
77d9cf944874ab6fc03c32df34596eb37660b155
3619 F20101221_AAASBZ subramanian_v_Page_278thm.jpg
ee8da7e12e63bb98955ca396b2ef2313
7ee24092c496ad14da2b3ca1ea603780dec22a57
89731 F20101221_AAAPQM subramanian_v_Page_062.jpg
a731ecfbcd6ac81fd3676539de602dc4
94cf6fc7dcc6a18625fe2998383f8048f576411f
1051923 F20101221_AAAQNF subramanian_v_Page_049.jp2
9437cc1151645976d5040785ab656da0
2ae6bea14d7824d8d0e2041c8a6cf44a9bfbde19
1431 F20101221_AAAPDB subramanian_v_Page_284.txt
b5a8bb45cb4f14e01f920a6a59dc0977
a229853a015d967b5229c1f99f6f44da5e880d1c
25478 F20101221_AAARXK subramanian_v_Page_157.QC.jpg
c06bd4a5aca25ae0c90ba55c512bcf70
32d7b2e8460c15f23b29b889451603a8e53afb14
122573 F20101221_AAAPQN subramanian_v_Page_309.jpg
51c67033d379406765ec74bba3b1af5b
55c68fda02010407623a255eb82623a01129f7bd
F20101221_AAAQNG subramanian_v_Page_051.jp2
52597871f82fe66d35a489042de1e489
f7d745e8b7c6b33f2a0528523374a228d0896a84
34767 F20101221_AAAPDC subramanian_v_Page_050.pro
d9056c54e04629be7737ce5c922d894e
b995f3fe145a3ed542cab817e760b602b97deeee
3516 F20101221_AAARXL subramanian_v_Page_158thm.jpg
9b07cb84b80fdc7bf66a1dc0a42b3e06
405bde2cd2ce773d16dac918d083d0101bdf46aa
2041 F20101221_AAARKA subramanian_v_Page_112.txt
ce6c9211cd39eb70e6968aa984d6623c
d460b5fffda12a390897bad9499d1c7bf494159f
9137 F20101221_AAAPQO subramanian_v_Page_163thm.jpg
c407bfa70274edface2a79be77274914
6c837242c96a72d9a6b569f80332c22e7b434764
1043439 F20101221_AAAQNH subramanian_v_Page_056.jp2
c3e1d5489b94e6673cb6b26ec08bfc82
f79ea6dd051d2fd595933506ce166e4f0b82dcf6
112076 F20101221_AAAPDD subramanian_v_Page_149.jpg
d5f351e95b43f14e376fd88d63260253
71f538b0a5a7f26f79f36c5b6b100d298b1465c9
7926 F20101221_AAARXM subramanian_v_Page_159thm.jpg
655a88ed7bb15c996ced7aae0ef20a73
8bd599e3ab41ee1d05ab095872aae42e8df0c6aa
1887 F20101221_AAARKB subramanian_v_Page_113.txt
68e6fdf2c530df1458a0416ae68a4ff7
27c554e9c3401f82bfa5cdfc3201ba6af8bafefa
814 F20101221_AAAPQP subramanian_v_Page_278.txt
0b96a5fd67e63cfb70e11a1b4cc80ec4
63ab25dfe2a023b634fd0df987e056c59ad36629
743064 F20101221_AAAQNI subramanian_v_Page_057.jp2
a0b95c1a0164da17f408eed18ee251d6
20f71b5d91923ea23ee05eb381937dd84f5ffd31
467804 F20101221_AAAPDE subramanian_v_Page_281.jp2
3bdc58023854d8a33c3dfef71edb7157
cd096b96c456442b72d4db1f4a10b0588d2fcd72
8697 F20101221_AAARXN subramanian_v_Page_161thm.jpg
3589e891a245c6132f575416e56578e0
1ae16608fe7a99c52127997809c57926087f2ebb
2002 F20101221_AAARKC subramanian_v_Page_114.txt
d4db07902a870383d8b0a12f6eee816e
24a531fd78ffeca814107e93e447f81ffd2fa2ea
30721 F20101221_AAAPQQ subramanian_v_Page_031.QC.jpg
1323b48ced3570f03ec2b9ffe09f0354
e5a9a44bd6399fd632c6056d354abbae3cdba524
770875 F20101221_AAAQNJ subramanian_v_Page_060.jp2
dcf0ff34d00bc2086594e7a86463cc8b
7295f62bb89b5155390fe77dd479c267b81a5e5d
107231 F20101221_AAAPDF subramanian_v_Page_122.jpg
b7942427b3f90bc77759858dd0989e7c
84612f0a66334d51583e6dce61fc2b1e57a68b97
8805 F20101221_AAARXO subramanian_v_Page_162thm.jpg
00d4da8d0f7102437b73b8896acc9906
a4ecdab47f7e02b84c943f2e7cedf915ed6cd12b
2221 F20101221_AAARKD subramanian_v_Page_115.txt
f9f79bf3940110de409cdee0c617d0d8
681759d32fe2142d2df2370061f9b6157dbd2e88
4390 F20101221_AAAPQR subramanian_v_Page_006.txt
df66552e70bbf77aab67c9277a05a3c6
8b5345c26a626bd9e2f6fc7e53eb9a36c9d3cf1a
891831 F20101221_AAAQNK subramanian_v_Page_061.jp2
287be3a49039fb249ff621c7fc38a319
09d76682b33eee1d86a95465f9fdcb7b1da0727a
F20101221_AAAPDG subramanian_v_Page_100.tif
e43c46fabb5b5a3e7ba9470ec89c70a1
7d7f6e71f573ff862a5852a0518bc836bfd70c45
35729 F20101221_AAARXP subramanian_v_Page_164.QC.jpg
d34d0735704081ce340d206ea74db70d
12e1b3cbed70f011b02efa0faa27e43821fa8e3e
2079 F20101221_AAARKE subramanian_v_Page_116.txt
5e4043ef06536904eb1e8b777bffa5d9
c6c531816502279b916a507ca0a0e1f5d7f9257d
76831 F20101221_AAAQAA subramanian_v_Page_063.jpg
4684aa13a435fb449d284073382d9ccd
6f10fa165fb81926eeaf4c2e8a2ef2493fc91a75
7980 F20101221_AAAPQS subramanian_v_Page_105thm.jpg
69059d384e44effdcbdf695107b29c36
3fa7093dac2491832b4131a511454ed446cb9248
949227 F20101221_AAAQNL subramanian_v_Page_063.jp2
31a4d43526c1b33ed4b4ff2b660b2265
4a24dd9838126e05e4e7006ea26a011ddd1bccda
649216 F20101221_AAAPDH subramanian_v_Page_284.jp2
2dea682ca02f28634704a779aab1498e
5dc6a57bed2670f7293bc608a1348b22fabd7b23
36474 F20101221_AAARXQ subramanian_v_Page_165.QC.jpg
6d416dee498e0c8cf48bf3b70fb2727c
e48c1b13090d9cba18e24f1978cb0f76903f8b74
F20101221_AAARKF subramanian_v_Page_118.txt
30092750a3bbab18a106344a65d7ae36
54b6e039ba37254b8f7d57fa790c26675f083016
44807 F20101221_AAAQAB subramanian_v_Page_199.pro
c5cb24add28dc22b83ae7a36c49206bb
b1c5ce6fe793a33d68385b9c0b2b515939549c51
114871 F20101221_AAAPQT subramanian_v_Page_147.jpg
32771efeec4eca25ca22caf1bc67028c
1cfc9f72687224ded8f9f74871ba641f9fbe48ad
693242 F20101221_AAAQNM subramanian_v_Page_065.jp2
9fd4a3f7deb817445557c637a19f087d
a4f41e6e971277eff9eb641ebfc17726a7281ab6
F20101221_AAAPDI subramanian_v_Page_171.jp2
dc696ca08cf90647f3ba52322f50ae0f
881a61d8c4a25e28166cfa951009495d35061055
8996 F20101221_AAARXR subramanian_v_Page_165thm.jpg
542c6f20ba33abf481c8b74219ff16ab
cb4c764970b2ac0a318c1846f3fd5f538d380bce
F20101221_AAARKG subramanian_v_Page_121.txt
6cde517c50e43614a7aa0ba90c62343f
83f395fc04d9850a0ca371021a401658687b13eb
12142 F20101221_AAAQAC subramanian_v_Page_314.pro
9ab34c0c13a47e49b5e19fd4c2fe8e1d
84175cc04161dc10d6f0604106521977b5642551
33455 F20101221_AAAPQU subramanian_v_Page_242.QC.jpg
6d696a23df4d6b9f66b1532f2b4dce66
89f6d3e0ceb6e5e0e40f894c51ceb5410de98676
1051974 F20101221_AAAQNN subramanian_v_Page_066.jp2
194733fe854c7e7f0cec9ab485fb7e2b
6fbaec118b139852100598c4183a14c1c9fa246f
547073 F20101221_AAAPDJ subramanian_v_Page_269.jp2
e6974e113a4276618c7126b023956380
0ec6d44881268832131cd1373a02f3773791f370
8499 F20101221_AAARXS subramanian_v_Page_167thm.jpg
8191ed9e514000b6b83b7924ab6466c5
841d8459a58345dfe8aa8bd030fa7f7c22e18243
1602 F20101221_AAARKH subramanian_v_Page_124.txt
97e6276ca12cea4420d7cfaa089af632
3b54ef5a67bdb5c147503d01c06d2fa3b52dbdea
1204 F20101221_AAAQAD subramanian_v_Page_230.txt
c5566412be476c2575206013ca01799d
ba0795f0bb87ffe127d6e743c9b942b82d673c76
F20101221_AAAPQV subramanian_v_Page_032thm.jpg
b76f9c81e98850dc9ac5e721e46fa57d
b90ab918e2a18c7979bc145a832bdbb8b6114c1b
F20101221_AAAQNO subramanian_v_Page_067.jp2
a5a67c6306323ddc91496e8535a59a21
deb61574182875e9142baca0304bbf14727dbc20
806244 F20101221_AAAPDK subramanian_v_Page_200.jp2
3f6f1205f804b7f6f176b21606100914
55eaa85900aea28dbdba0811d4ef9579b079b58d
38469 F20101221_AAARXT subramanian_v_Page_168.QC.jpg
c1238ec07ce2c17e795812b01f794a98
ee87074a5fc268b1bf1a5610f84dd0d2b07329c0
1633 F20101221_AAARKI subramanian_v_Page_125.txt
f79b5b3bb634fa7d3d67457e058677b3
e12bb1deb014e620ec48e5c06a61c1f0f59edaf9
8065 F20101221_AAAQAE subramanian_v_Page_012thm.jpg
1108660bf5f27b7e981ab4d183664963
de4cfb2a0f6ac261ce459f7eabb38127aa59eb7b
43851 F20101221_AAAPQW subramanian_v_Page_292.jpg
6be2a08579a81128d63f8451b9b16633
5103e286cc9b9105b13f7d6f06892daeebf49d31
824391 F20101221_AAAQNP subramanian_v_Page_070.jp2
e59e89cfec38662bb510f7e03ec0c1c3
1ea082517183803fa0ac0bb0e477b4d9dfcca3ed
36294 F20101221_AAAPDL subramanian_v_Page_161.QC.jpg
d96d66ae30a70db0f6b9b994cc892aa4
75e093eedc7eb3eee7d8cc8b35a01f60f7d58108
32025 F20101221_AAARXU subramanian_v_Page_170.QC.jpg
1cee4e38777ce9ba86adebc81305ed09
97d67b5dea01851924fb7ea0464072596b4dfe6c
1159 F20101221_AAARKJ subramanian_v_Page_127.txt
b2cf1d355d7339cbfee68e7c465ff20d
0aa127bc75c83927d10959ac695c847de8f15bc2
7738 F20101221_AAAQAF subramanian_v_Page_007thm.jpg
32c56572ee9f1fe76ad5a08740ce5ecc
23fd7a883f09f268771f7cf3469e2e0d3a71bb38
18969 F20101221_AAAPQX subramanian_v_Page_291.QC.jpg
39ed9f78c37e47cd952e3aaf870b551f
e56159ee4f36bb27384a4bf49a1181242ffda9a2
904346 F20101221_AAAQNQ subramanian_v_Page_072.jp2
586d258f3f6caac2acfcdb68f233a1d3
6cf9dba41bdee618b33bedcc66176128a062c5f3
F20101221_AAAPDM subramanian_v_Page_169.tif
a9a2c9e6067f687f27e8ee255cada477
f4904cd5cff1c78eef4a9c1e0a6937a1ef7af1a8
F20101221_AAARXV subramanian_v_Page_171.QC.jpg
474170c1529eb53d0e5f842f57dcd0e5
df38ca36b1c3127fc198221f2814bac751bb80c5
2236 F20101221_AAARKK subramanian_v_Page_128.txt
c7dfc36a152ef08fe3faab81be0cc410
3319b44742c880403382e63e9ce4fdb9904d055b
50389 F20101221_AAAQAG subramanian_v_Page_028.pro
d7b224cd426cee2e58ca1243fc31e215
1f051d8a3a6878b8ebbd9373ddb830da69e54bc6
F20101221_AAAPQY subramanian_v_Page_146.pro
d1e9a2a8ef31f47d9b5e5bac0e988e38
3efa6916edf2b243a9f28c6104267d718af8294a
652728 F20101221_AAAQNR subramanian_v_Page_075.jp2
82485c954d31ae98a97d144e50f6d791
fba581383442569110ad1ed9794532943818cc75
120217 F20101221_AAAPDN subramanian_v_Page_164.jpg
cdb4319420897aad04bbadffdfdf8d04
7db1d13156d779a45bc915321feb5ede24c9d515
8375 F20101221_AAARXW subramanian_v_Page_171thm.jpg
8cc1657308966dfb4734477e354c8ef2
56c4fd4ecda18da741e0009b9350052bc33358d6
2068 F20101221_AAARKL subramanian_v_Page_129.txt
67d368b475d2859415878c5fecc438ba
d814f1160ee8e884a23e7a50f0d3dc2daaca5519
77995 F20101221_AAAQAH subramanian_v_Page_121.jpg
d1402d856731b89b58043685db4fd0b2
3904a0341e57e4f84b20541663c30d5ec83b49ec
116331 F20101221_AAAPQZ subramanian_v_Page_033.jpg
dc7ce5e1b2c47d473bb39d59c96b322f
0781c37a3ab0a04c4eb55bca02ec9c4c441d618b
602746 F20101221_AAAQNS subramanian_v_Page_076.jp2
91387368edebab88059d2b617aaa06be
a91b9b2df5fd704af9a53a2581ae5d03f13342bf
512681 F20101221_AAAPDO subramanian_v_Page_256.jp2
5dcefdb1789d9604d5c6a04d41e0ece5
d92b505b021bcad590631848d46091c055601721
34316 F20101221_AAARXX subramanian_v_Page_173.QC.jpg
bea3f8e7b444e87ec7f70feca06d441b
ff8e8b5da84570f2016bfa796802529ec8044e14
1654 F20101221_AAARKM subramanian_v_Page_131.txt
effc7826eeb0c1e4441f30c78d58921a
28a6cc419bc45afe665cc0dc1d3deef795d367f8
7453 F20101221_AAAQAI subramanian_v_Page_246thm.jpg
6ad34d90b625eeb82a004d4902e0160a
e392da87498660d8707958ed649a47efb4f943d0
891337 F20101221_AAAQNT subramanian_v_Page_077.jp2
b64d89379ebc7ff3e3a70c85d407d7b9
b8c128132fecf229ce04be4a3af1da2f674beead
7557 F20101221_AAAPDP subramanian_v_Page_118thm.jpg
2d092475af63edf1d2ba25d79e883e25
14b0ae190a86b86ede3fc5fcab56d4b659280365
F20101221_AAARXY subramanian_v_Page_173thm.jpg
bba448724038e18431709b61fb34a2e5
5915aa96df460b63b530fc5f0b452ef803253ea8
1038 F20101221_AAARKN subramanian_v_Page_135.txt
a02c5eab3e60b312702b9bb18850722f
0519477341be82d6945996a677b0130477fe26ec
1023641 F20101221_AAAQAJ subramanian_v_Page_176.jp2
7837f90cb085148f92dbcd6bedf6d6ef
2cf7fbc4bccf44890a4bfef0ae7067ae9d5e30be
824839 F20101221_AAAQNU subramanian_v_Page_080.jp2
162baa1adaf41e40cc76bc068c3f7abd
091294313ff82eac923076039b985b3943767fb6
F20101221_AAAOZA subramanian_v_Page_007.tif
98dd8bc182ae9c33c5a3784b0efb4f54
626ef3473fd09a3a854db08e266694a0eb4fd2e3
7756 F20101221_AAAPDQ subramanian_v_Page_101thm.jpg
55b8489206929389e27727906be16932
c40db7e1d4dfa57bbffcacc83270b02f2e83318c
30125 F20101221_AAARXZ subramanian_v_Page_174.QC.jpg
5782cf3146ed7af0312a32a9dd7528a6
a1eda1b905aa105195b459cea972924c59caedef
789 F20101221_AAARKO subramanian_v_Page_136.txt
377ab8bbaa9585c5f7be8836fe2d0a98
c7b53cd084a8bfd62e5ccd6ba2c7f99a2318090f
6764 F20101221_AAAQAK subramanian_v_Page_153thm.jpg
bcd178e1b6337337882942fa170813ce
4829739244071b381af28bb2f32284d73fb0e422
581576 F20101221_AAAQNV subramanian_v_Page_082.jp2
f5ac7b64c726db7ca91fad2aea4d3365
6fbcecc1786cddec7e23a24d535e846e8f8190f3
F20101221_AAAOZB subramanian_v_Page_023.tif
a6423187d847823237c1a254b53164a1
9cb42c37c50ad4defae77777add909e75edb5e7e
F20101221_AAAPDR subramanian_v_Page_224.tif
a8b2706ce3439169b911c751f76add7a
fc213770b59e35d2a96ae2db0bd6d54b7d8c620e
1567 F20101221_AAARKP subramanian_v_Page_137.txt
600b8216af135ed208c214f95eef2971
902d1d81285a23bd304f37941364886a173c06d4
22681 F20101221_AAAQAL subramanian_v_Page_135.QC.jpg
03e84725ed45380ad83c831c7fd225c5
a0b3ea8b1f948da3c1d0478f2018ff8ad1e1409e
626918 F20101221_AAAQNW subramanian_v_Page_083.jp2
6d9655524bf40f4a7f6381243b320228
0cdb70d615d44c746e78e4ae8d1dc9d8bcf4374b
F20101221_AAAOZC subramanian_v_Page_168.jp2
109bd84829e06e9adc39d337e2693730
f5982d99417a62cebb1d7e8f8a5dc48fc3a7f914
2010 F20101221_AAAPDS subramanian_v_Page_266.txt
8be8e7f33ad5ce178653392934ec6b92
d39328c500526dc6200d991ae278b81e686e555c
1911 F20101221_AAARKQ subramanian_v_Page_138.txt
78c8c60f0796d742c8d4d601e2026389
de8ba67504669627f51c3884b839b424387dcd68
F20101221_AAAQAM subramanian_v_Page_059.tif
aea538466fe9c169af1b790feab500b6
9460ba48cba9c92aed56d8b71624e37cf204b94d
1033383 F20101221_AAAQNX subramanian_v_Page_084.jp2
1be83120bae046a56474d4067da2a397
48261c6df0b87b101c603f8e3549e40ac8d18265
48653 F20101221_AAAOZD subramanian_v_Page_185.pro
459e6ec5af4e672598e04824d10b0e60
df67dc260cb4008378eb439471f3a63a7f54817f
F20101221_AAAPDT subramanian_v_Page_094.tif
aca8d2598c761427938c452515a11a13
b77924a52b7639ae2c34331e828e7849fab86987
1855 F20101221_AAARKR subramanian_v_Page_141.txt
bd59213ed53c110de8a49dc64d919883
f0e201efee2eb98ca065bc25f2249c0eb918994e
573245 F20101221_AAAQAN subramanian_v_Page_304.jp2
841ff083f812d40f39ee9420d5d9832e
ccd172d0bd330d4c744a9fd558b48accedaabce9
1051801 F20101221_AAAQNY subramanian_v_Page_085.jp2
5d2ceb6f7c62d634e0d3c128071e87da
52a05f8fe7a2112c9795557f47bff9f8602f4a32
6109 F20101221_AAAOZE subramanian_v_Page_057thm.jpg
fb0b6ed79ee3091887b35b03bf1ece5a
b1a4be6031588bf6d70212ae8de37f2fbc90b68a
1051939 F20101221_AAAPDU subramanian_v_Page_266.jp2
89028e4df6774dd22a530af1b8ef959f
85452821e84cf612b1358f41376889a5673ca144
935 F20101221_AAARKS subramanian_v_Page_143.txt
f30634c9275d89838fa8ecdeb554796d
98a685cb3d9bbc19dc5500500a75d83322eb4960
F20101221_AAAQAO subramanian_v_Page_032.jp2
11efe1eee75efe1f34312e681e3beca4
10d95c71083cd6003274e522deae95cef51bcf2e
812649 F20101221_AAAQNZ subramanian_v_Page_088.jp2
8fa78d94897d89e960f972ae1665f35e
d00b9bdf532e95f703c86576c167f82d0f060e97
F20101221_AAAOZF subramanian_v_Page_194.jp2
ace8db5dc6573aed8e792104f871cfa4
eccafb097a699cb0b91f7fdb4df271514212e1d3
F20101221_AAAPDV subramanian_v_Page_127.tif
e6551d97f0b7e5e93856bb0da380b5e3
a29d3f1c109518a4c1aed6174a99f291c80211d3
1269 F20101221_AAARKT subramanian_v_Page_144.txt
748b5c8250a8675d828cb18742127b4e
b8ee191b155663b7ac466b74b83dd5c127c63702
381387 F20101221_AAAQAP subramanian_v_Page_303.jp2
6fc5eeea6e3df7fdcfc6d0b8aa2bb88f
48bd22ad983b924217514573b13042ab0396a595
6552 F20101221_AAAOZG subramanian_v_Page_064thm.jpg
1c6c991e6ccfa836e2716508f412f6c1
bef24a7eea1288db661d2d2eff075c8c2e51e464
18476 F20101221_AAAPDW subramanian_v_Page_290.pro
7e47bf98633dbb16f2633d891a5c0448
d77dfee16dc9b8e7eed8a0ff36c4c8cf3da9a226
1584 F20101221_AAARKU subramanian_v_Page_145.txt
0b6aa92ea3f3207be4012e7bda9d9d72
e6094287f2906aff763e34b87fe16f1fac6a8d8c
7732 F20101221_AAAPWA subramanian_v_Page_013thm.jpg
228907d142f27a443524397fd664fb69
977b9e41e9255c2417212309e25d2acbb0174ac6
F20101221_AAAQAQ subramanian_v_Page_289.tif
baacb3166d0dacd09b7fe04e987f6094
e473a4a39505cc87eb6f7f4a773cd3946b7cd595
927073 F20101221_AAAOZH subramanian_v_Page_137.jp2
cd9c499b9dde28ce26356600ec5cfea9
95bc5ee914e94a85d56c199baf3a53aa775b72b1
571757 F20101221_AAAPDX subramanian_v_Page_160.jp2
ad15a1e1e6931add778761b153623b59
94506aad828acd75b01d7c494e45ada3a66e88b1
2115 F20101221_AAARKV subramanian_v_Page_146.txt
9e338e6fce3d88a8dd205089fd5bf95c
9aa91149ee949432eadc7b5eacdda3dfcd12768a
31310 F20101221_AAAPWB subramanian_v_Page_197.QC.jpg
b86c525f6ca5699693ff570e33856d00
173ce2ddba48ab930d3ef43757d1d54aac1c68bb
16957 F20101221_AAAQAR subramanian_v_Page_289.QC.jpg
af71c81587b95874b8e3ec4b7ada2b18
3c120fcf72ef0f891aa25cd852cdc5d367b13fc2
56971 F20101221_AAAOZI subramanian_v_Page_021.pro
1b45fc4c0ac6eabe4ec7e7e6c74f84d3
74de47d61a1fdad2cc01365c65e21bd666d7efdb
2287 F20101221_AAARKW subramanian_v_Page_147.txt
bef3be18cccee4b68788ee2853c91a67
25905da8ab60c13892b24c98134d479bc2a7046d
915 F20101221_AAAPWC subramanian_v_Page_273.txt
71fa11f726b7a980c7cf911a9d08b8d6
ab68f85c186365a9ec8f5e677300d65fd2da4161
F20101221_AAAQAS subramanian_v_Page_151.tif
9aaa1efe7f354ea3dce27aa2ef0da8d3
6c4dd4887d61bdfd1788989646b5f298ec02da52
F20101221_AAAOZJ subramanian_v_Page_049.tif
989468d169cbfb8ba0c630aeabe4fb33
f973602e2eb44f2c0267e5dfdb20e28445fcd37d
52558 F20101221_AAAPDY subramanian_v_Page_129.pro
8d413d0e5608a91b9a1ef0cdffcfb75e
017ef4f28c01e590e1a9b642c185d0c175af38e8
2100 F20101221_AAARKX subramanian_v_Page_148.txt
29b2983555822c040b994836a6b0d1f5
10130629f43e3895981bbb0a5f67951b8276c886
3111 F20101221_AAAPWD subramanian_v_Page_003.jpg
f9d132a4cffaac0707a285c32dadbc56
511941c0d19a7a4ba2c4db4eaa269ec725760f3c
F20101221_AAAQAT subramanian_v_Page_035.tif
95ad4fb238f8c9355808f78200e416d4
f10b8637ae4fe8dadde1bfcc448261d0bdde43c1
8734 F20101221_AAAOZK subramanian_v_Page_176thm.jpg
987ace9a8cec0bfea476e30bda2a09d8
60bd603e884b6b5b82f6fb1616667d195dbcdd37
548012 F20101221_AAAPDZ subramanian_v_Page_081.jp2
8fdd07e7aa1c793903b37e5c52694688
f978bbf2b53b3e844adaaaf07c0d19e8c9b08b97
2195 F20101221_AAARKY subramanian_v_Page_149.txt
3d15f238be0b20313ce843e398085571
2d5704b9f0b40f9caef1d0636f7f3d54c23023de
9069 F20101221_AAAPWE subramanian_v_Page_049thm.jpg
ab0850c822dce3be69017fc1082a7aa1
b67ae094de98c278b4345667e03161da5019f135
121873 F20101221_AAAOZL subramanian_v_Page_015.jpg
3cf182b156af4748d85e2ccde0c3ffae
96a1c83fa81ececec56fa2baee4544042aa1fbc2
1512 F20101221_AAARKZ subramanian_v_Page_151.txt
69a4b0ecc545af9c78ff38ba03898a60
ca40339d586ba4f44bff1ccb7baf92086e1cfc31
1637 F20101221_AAAPWF subramanian_v_Page_110.txt
35fed6eb0005ef9a83157a63ea65d329
12c451adea2380b62b09834675ad1f9a859c14f9
1051919 F20101221_AAAQAU subramanian_v_Page_116.jp2
38948579d2e7ab7dd987c9f057bc5b1f
968e235e50abd37edf53e828596e7d59f293b813
929320 F20101221_AAAOZM subramanian_v_Page_182.jp2
62a76b2d9a6818fae25fb108c65903be
9bd89e5dc343415cda3f78604597a8a1c052670d
4676 F20101221_AAAPWG subramanian_v_Page_270thm.jpg
0e85dc0606a4018d4f273798fa895272
d9c5d099a90462b841a41850c388e52661ee0c63
952009 F20101221_AAAQAV subramanian_v_Page_117.jp2
47dded8270c6f28c2c89658f82203328
2245ba8d716e8d0422d51b22eddf9735d85807e8
3403 F20101221_AAAOZN subramanian_v_Page_290thm.jpg
2fc63fd210c428dbcb676ae11c696fad
cceb58a5e0e00f85c0e88f7ba89176356fbd8fd1
F20101221_AAAPWH subramanian_v_Page_078.txt
6d6da2b149efc12bde4ca90096a6c762
e7b73963170b6eb6c0595e8fce87003e388760e0
33654 F20101221_AAAQAW subramanian_v_Page_112.QC.jpg
cc20ab079857527d5e214418028dbc7e
00d428fed4dc7fc79491fff0cf5940e7fd31558b
F20101221_AAAQTA subramanian_v_Page_313.jp2
d2ca4040c73318aaeba1c8aabd10a207
3de271f0808ae4cf071b23856f28bc3d2dd8bfc1
7795 F20101221_AAAOZO subramanian_v_Page_240thm.jpg
feb3c81eda5deb7fbd713ba4730dc99c
4f0b9f99e0edf33c1845c10082d60e465e5fafc5
22758 F20101221_AAAPWI subramanian_v_Page_098.pro
41910cbd945223b6392feffea2a1726b
eb0092e1d9e900c2ae5267253e1869d7fc2ed814
8960 F20101221_AAAQAX subramanian_v_Page_074thm.jpg
05219b9a1896e77b88c37c88b5c57f85
765b181373e8db97359ac9bd8c7fdc55da0b4052
F20101221_AAAQTB subramanian_v_Page_001.tif
f086251fa376fc7240ce939088b83e13
a32d7d27d3b588bc2cfc3370abded083d5540d42
23291 F20101221_AAAPWJ subramanian_v_Page_075.QC.jpg
a03a3a25b76d21d6b451bbec2f21f6dd
06fd0f8fae7240bdd6e87905dff1d650c62017c9
F20101221_AAAQAY subramanian_v_Page_232.tif
61397424e2f12dc28b3b146b8a1ff146
15f5140df1871764a618a62b5ff5252e6d29554d
F20101221_AAAQTC subramanian_v_Page_002.tif
bd78a42622291ce2efa81436adc87110
6e3517b7b407036846ab3f115434a1edd285a9e1
707854 F20101221_AAAOZP subramanian_v_Page_078.jp2
f6afbd8bf6372353d3d4a75a450c1afd
4ed6886592ae469a2ecc7fedee8f1638282a5159
229962 F20101221_AAAPWK subramanian_v_Page_001.jp2
69ee856c2a5c4192c1b1ea91dc174c10
8182dc745ef98322b1bf7c1324c3449726a31bde
1171 F20101221_AAAQAZ subramanian_v_Page_150.txt
13328c61f549e050cfc230bb6e6dcdfc
ec9ddca4979fecd52b50dcf521585c81ad98f704
F20101221_AAAQTD subramanian_v_Page_004.tif
caba52e72ff83985186057c585a7f483
5a4e337b9a69c7bff23d44c83344b96778bea32a
920919 F20101221_AAAOZQ subramanian_v_Page_131.jp2
4e9bae3ac7cab3924ba556202113ef0b
ad059acac27a25626cefa0da4eea61ca85f5f449
F20101221_AAAQTE subramanian_v_Page_010.tif
d0c5671c33962fe8ca8a05dc40d018db
302662ec563c3d62f7935ee7c97adbc6d51c0b1e
92540 F20101221_AAAOZR subramanian_v_Page_136.jpg
c0030128f9921bc5272fbac67117d985
b2d0642c1f63c065e80361721eec3b08b05cf45e
F20101221_AAAPWL subramanian_v_Page_253.jp2
6b83af4f0be20e67541a42811cad53d6
85d477fbb9374048eb4b7a4645462ee76cc4c142
F20101221_AAAQTF subramanian_v_Page_011.tif
d116912cc32d65a3dac6c5d47d93d68a
e673a26545f4218f140ddeafa50c8701abf9f4fe
9690 F20101221_AAAPJA subramanian_v_Page_314.QC.jpg
a31d23e5a03ab6abc323bc703dfbfee5
c39f8af158c3fc4ab7c8b64f54fb89e468a2ab5a
55432 F20101221_AAAOZS subramanian_v_Page_192.pro
601bcc5c09850ed717b14ece204248fc
fe4b79547884f3f408cd35819b24fbd1f562d813
26300 F20101221_AAAPWM subramanian_v_Page_180.jpg
d439568526c7d5956a399d3d0eafda4d
5de06060d657d0db24c551178cbc03a0a5a1bfd9
F20101221_AAAQTG subramanian_v_Page_012.tif
55c9758b716b93fd570d28c16913c3ab
340c9419074ec1d6ef743055eec9dfab2b8bfe98
1577 F20101221_AAAPJB subramanian_v_Page_032.txt
da122d818d9ef5059ea317477a2046a7
72a66859b01e1c511bd39ecd3989bfe1ede2d10f
33369 F20101221_AAAOZT subramanian_v_Page_034.QC.jpg
23b9e1e4b162398fef2cd010db4507c4
ec50d7e873cd36d77206022c8851b250e427a757
25303 F20101221_AAARQA subramanian_v_Page_154.QC.jpg
15bf5f7e6860a04d4011023f81534e1e
402697f25214e00ed88078d318615f30d01ee2a7
26997 F20101221_AAAPJC subramanian_v_Page_099.QC.jpg
ffb3c036004bf461e94ed00d22e24b10
0250b1da944c83624b0bd647bf5264a0b3c63e72
16670 F20101221_AAAOZU subramanian_v_Page_017.pro
39959fe5dd2f14f5be3f0b7c3dc39461
5a47e12b3b87719a9540f7cfd7f72104e3ef5220
F20101221_AAAPWN subramanian_v_Page_242.tif
796f8cfd9d37b28e3fb16a77d9ba5faa
41b3504cabe36ebdc36702a98c0b2ae276d918e9
36556 F20101221_AAARQB subramanian_v_Page_097.QC.jpg
1f669786ce77da0b41022e344171d5c8
34032b113462daabc036d056db3a605c2de37edb
F20101221_AAAQTH subramanian_v_Page_014.tif
dc502acda1d5e746ce8cd7d48010d135
e1255754d4538ca9689f627d097626e08aacfb2b
5779 F20101221_AAAPJD subramanian_v_Page_130thm.jpg
0b04247011aefd0e022b149f9493d03c
2027919ca8a9901cabbdb84979c78faef803df18
994822 F20101221_AAAOZV subramanian_v_Page_144.jp2
5e6f0689e7b173628a12d93c6b5c04eb
89234ac62bb41c44ca122b0e9e6c775daee93a42
F20101221_AAAPWO subramanian_v_Page_021.tif
10af2d27dfc60f083bb5f2b02d80edea
f22c379cb8d459121160ca4bc1bde8697fbbf4be
444 F20101221_AAARQC subramanian_v_Page_003thm.jpg
514261a511eade8afe8b6f1261181d75
0da8487f60164e846aa36fc3792251b998263c74
F20101221_AAAQTI subramanian_v_Page_016.tif
fad1dbfc3719c5e1372b6f6d5c0daf3b
76d4f10149e58c50b4f6321575c64668f804873b
29361 F20101221_AAAPJE subramanian_v_Page_150.pro
cc9d940ee0aa2498ebeaee7564949bf7
7bea88319d207473bacaebdec1ace17b82ec55af
105618 F20101221_AAAOZW subramanian_v_Page_054.jpg
f8f204a33378fc67d8c1f4cd7ee9298a
77d5f23dbf1c0a0a27c8f670277b3f1022cc1d1e
32352 F20101221_AAAPWP subramanian_v_Page_207.QC.jpg
53ee6777dce25b4f6db5d8598486669e
1b2ba8f2ec244bd42b521b4569ab96622c232e89
F20101221_AAAQTJ subramanian_v_Page_018.tif
353aacfdbbfe6c296b7f158e7be90844
9972ed6b2ad534a29215373c322335d74881a270
F20101221_AAAPJF subramanian_v_Page_181.tif
434b671797505cde525963131617d3a9
878b18bd0f9754b4a93e9bd662ee81d594b540f9
833210 F20101221_AAAOZX subramanian_v_Page_177.jp2
99e06ac423849176327a10371e8102f0
6e02c8be9bb21688e5965786761dbf52d2f75ed4
67209 F20101221_AAAPWQ subramanian_v_Page_222.jpg
1f1d4d6e2b2fdb91e5cf4e2decff5705
563c07b18b1438b2e5ac56a0ed5d88aeb750c556
7145 F20101221_AAARQD subramanian_v_Page_119thm.jpg
8585154b0478a9c7301eb36b95b72320
afbb7e3355762b47f75b40ba15ac6a8b99d9e953
F20101221_AAAQTK subramanian_v_Page_019.tif
4dcdf9fbbfd4b7712c5e68befd83a056
7d23a22a2dad2d561527964ce5d4d7f7d05fbe44
1032729 F20101221_AAAPJG subramanian_v_Page_039.jp2
3cc28965470633138423fd7caa26aff0
089477db56abe4cb6307de50003b96965f567aa4
1118 F20101221_AAAOZY subramanian_v_Page_088.txt
3d8ea03ddaa937405995ce777058e81f
3a49c43e09fbd177942f914689a382ab714f82ae
9081 F20101221_AAAPWR subramanian_v_Page_311thm.jpg
da7d8b78fe84f2735bb2980924cb0742
eb9ad7e608c76f019a832440f76bce0165cc10fd
32761 F20101221_AAARQE subramanian_v_Page_116.QC.jpg
8453ed7a80869c2894c473901b609e59
cf3105595559bfb6a509e946470a027fb0f5d148
F20101221_AAAQTL subramanian_v_Page_020.tif
5ebd1930df8b6212d17b65f068ff8309
a467990addbeb91a6b79ba85ee14299d064632fe
1560 F20101221_AAAPJH subramanian_v_Page_186.txt
06341101f1112b459e4b75aed397d7ba
3339abb3492ce4ea926a3a8c990cbe7f182dd757
4448 F20101221_AAAOZZ subramanian_v_Page_269thm.jpg
da247cdcda8f6bd7e3a167c086081512
ef9917766323886d92f44e87bd91cfa3dd2ce5e8
93061 F20101221_AAAQGA subramanian_v_Page_036.jpg
6cb5ee31ac456012f716f9338828ccbb
ccc816257d3fde4940b25e83f501e3d2e35680d3
46182 F20101221_AAAPWS subramanian_v_Page_010.pro
c825131e7d918f1d81f95bffb91b4338
1255253158059150809cf1387e7404cb75b71414
14873 F20101221_AAARQF subramanian_v_Page_294.QC.jpg
f7d318eaf3e8f5bdbb08f1b469cf19bf
b0c70cdc686e8a4553ac0894a3853d159ac2867b
F20101221_AAAQTM subramanian_v_Page_024.tif
7b7b0408c2ec545010e468832a914703
0fdcbe95fd54a881e6295fd453081623b005d963
7566 F20101221_AAAPJI subramanian_v_Page_182thm.jpg
2b4fd0bee6a4d75d8464fe512bea821a
8869cd3a6df223eca8a1c1c9c43bb14c3716bc4e
114262 F20101221_AAAQGB subramanian_v_Page_037.jpg
a559b0e9e23f3b972072efcbafed0656
207251fac708805df4fe51e0882e9d04130b962a
F20101221_AAAPWT subramanian_v_Page_237.jp2
2ce783a1ed86200e895a0bc4bfff2265
8199842b087c1b155a1a987738499a13c87104ae
6388 F20101221_AAARQG subramanian_v_Page_198thm.jpg
6362299a0f6086a1c47d38f3bc191bd0
7aa263906add8fa5250d5676615ffa33c41e662f
F20101221_AAAQTN subramanian_v_Page_025.tif
014330086cac2efa7140d44bb14a4bd4
4cab41126de6bc2c8b268723a076d5ba7e8e7cc7
54801 F20101221_AAAPJJ subramanian_v_Page_268.pro
d1918ab455ba03c375d7c2c15b745980
bdd0287c41f5fdddcac5479694ac6d5e4ee87dba
94361 F20101221_AAAQGC subramanian_v_Page_039.jpg
4c149515b1935bf8375a06a6c92a0818
a18cad4a52fd9e218b3f2815568cb42f3fc87130
F20101221_AAAPWU subramanian_v_Page_162.tif
e767bb982699470773800fe6896eed08
679f783dba2cd1e15c09659aa4f91f7fce439b63
7924 F20101221_AAARQH subramanian_v_Page_138thm.jpg
5b8a03c0dc75f4cddecb6c277b6b3e6c
55f3f80bced391e9e652fc85e1c8b6ddbb2ee51a
F20101221_AAAQTO subramanian_v_Page_028.tif
b225dfc1918a7d4511d9c4963c818f7c
b2386023e9805e8a124f73267dd5a69c5d1a7763
1287 F20101221_AAAPJK subramanian_v_Page_227.txt
0eb257b182727558acf066c415a211f2
6d85d1b06a7e5ac5daee92e8ceb61157a3bf1826
98142 F20101221_AAAQGD subramanian_v_Page_041.jpg
7b98e2a707895d5a1b2cf11ddf445350
4c9175b968ea5cf03f30c13f9bb9f5253b42aaec
30010 F20101221_AAAPWV subramanian_v_Page_052.QC.jpg
2d12ba7c7c23334831518a58fadfdd65
a99a1d69d1a1a52aa6b0b751721676dcf58396f4
29405 F20101221_AAARQI subramanian_v_Page_085.QC.jpg
213b537d2106e03c42db7194ca9c8ffc
fa6aaac9fc252f5a667a074e6b6eca679ae77e26
F20101221_AAAQTP subramanian_v_Page_030.tif
c74b3d4e5c53fb4176f1f45460b97a5f
92671ab85dd4ae8a61b0a9a8d086dcb846b20de7
41567 F20101221_AAAPJL subramanian_v_Page_111.pro
ce4affcb03febce894608dc4af890375
ed52a8433243b23e98d037d3be10e7b69d1dc864
105513 F20101221_AAAQGE subramanian_v_Page_043.jpg
c07a19a1ddaa9bc3ced5f999c467a2c6
c5e49519adc144cc8ade1d22c2078db1644fad3c
27805 F20101221_AAAPWW subramanian_v_Page_255.QC.jpg
31eb6fec3f12b48de5226b8fa23840ed
a056babbd3e0fd5e194c7209b85ed3b5cd8e84d4
33399 F20101221_AAARQJ subramanian_v_Page_016.QC.jpg
4af576a1f255849c59e2c04ef16b331a
ce2a2d88dc4a56dc4aa7f2861a9d2e52b1950dd8
F20101221_AAAQTQ subramanian_v_Page_032.tif
9383fb07388fe47def3cb8e555f95fb8
8d82c4808d15ee54e6ac14a9cf6d8d00573f61cb
25149 F20101221_AAAPJM subramanian_v_Page_294.pro
95a5a77ad1fc98fb5308fa3c27d2fbb4
8be2ab0796b1d6e669706fe372f8a4a8baaad3c2
99287 F20101221_AAAQGF subramanian_v_Page_044.jpg
da6e0832ded99e63a7d12e1b5e7b80c5
2f88dab06045579d4c3ad212e1e15e18338e2a52
F20101221_AAAPWX subramanian_v_Page_076.txt
bf021b5ae5bfa5e389871c2823399849
b5aa552278adb17a5b49e52ec3224a56dbd7db83
8511 F20101221_AAARQK subramanian_v_Page_145thm.jpg
548d4d561b1e943eeb9869ab3ad4e75a
ac1cefa05be32ad082d09bb2cf6d220953203493
F20101221_AAAQTR subramanian_v_Page_034.tif
089f97c589999445bf8c40886abc4007
63995e079906e7cb402498f8cb14b267e710354b
368259 F20101221_AAAPJN subramanian_v_Page_104.jp2
6f0c2c246210b5b95f22028f4e67bede
ea47b9c212a5f7fc0d95abf53953f0923867f16e
111450 F20101221_AAAQGG subramanian_v_Page_045.jpg
a1054d6d8f2b678bfe5cb4b5668faed0
367a999744d878586b719993c10a30162f767dc2
73889 F20101221_AAAPWY subramanian_v_Page_151.jpg
936c74ed4d2aafdd71fdb7fd80059e7b
ce0791c7b34ab6be65bff17cbef886ad84c50a3f
16441 F20101221_AAARQL subramanian_v_Page_282.QC.jpg
10b33defb77623960c4fae377a01d42b
ed960b641fb5964d1637535a536ff89cc6586018
39659 F20101221_AAARDA subramanian_v_Page_139.pro
72d8785c3a522ccdc173bff480e86a2c
c59fd4474c4dc09401fb8e129c9f3d4f1db8eaa6
F20101221_AAAQTS subramanian_v_Page_037.tif
7715e462a306729eec053f20165fb21f
37a3eaa37830cf6acc7ec06f0dc51af168df6625
46863 F20101221_AAAPJO subramanian_v_Page_092.jpg
56c6ca668efb3fdc353d282d70b0a678
31873f903bba33b59403cdf94ea12b52457215f9
109146 F20101221_AAAQGH subramanian_v_Page_046.jpg
AUTONOMOUS VEHICLE GUIDANCE
FOR CITRUS GROVE NAVIGATION

By

VIJAY SUBRAMANIAN


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2008

© 2008 Vijay Subramanian

To Mankind

ACKNOWLEDGMENTS

I thank my advisor and dissertation chair, Dr. Thomas Burks, for helping me successfully complete my graduate studies at the University of Florida. His wisdom and advice have helped me sail smoothly through my time at UF.

I am thankful to Dr. Arroyo, Dr. Lee, Dr. Dixon, Dr. Slaton, and Dr. Moskow for serving on my dissertation committee. Their ideas and insights have been valuable. I am grateful to Dr. Arroyo for bringing robotics from the books to my hands through his courses.

I thank Greg Pugh for his help in building the electronic boards and writing the device drivers; without him, this work would have taken much longer to complete. I thank Mike Zingaro for building the mounts on the vehicle and for patiently helping collect data in the field during the Florida summer months.

I thank the graduate committee of the Department of Agricultural and Biological Engineering for the flexibility to take courses from other departments, which allowed me to pursue many courses that interested me. I am thankful to the graduate students and alumni of the department for sharing their insights into getting through graduate research. Finally, I thank all the people who have influenced this work.

TABLE OF CONTENTS

ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION

    Florida Citrus
    Automation and Robotics in Citrus Groves
    Autonomous Vehicles for Citrus Groves
    Objectives

2 LITERATURE REVIEW

    Agricultural Autonomous Navigation Applications
        Guidance Vehicle
        Guidance by Remote Control
        Direct Guidance
        Guidance Using Arm
        Overhead Guidance
        Guidance on Tracks and Leader Cables
        Distance Metering and Triangulation Based Guidance
        Guidance by Dead Reckoning
        Guidance Using Ultrasonic Sensors
        Wireless/Sensor Networks
        Guidance Using Inertial Sensors
        GPS Based Guidance Systems
    Machine Vision for Autonomous Navigation
    Laser Radar (Ladar) for Autonomous Navigation
    Intelligence for Autonomous Navigation
    Vehicle Dynamics Modeling and Steering Control
        Vehicle Modeling
        Steering Control
            PLC control
            ON/OFF control
            PID control
            Adaptive control
            Fuzzy logic
            Neural networks
            Behavior-based control
            Optimal control
            Non-linear control
    Sensor Fusion Applications in Navigation
    Pure Pursuit Steering Control
    Headland Turning Applications
    Path Planning
    Current Commercial Systems

3 EXPERIMENTAL METHODS

    Sensors, Control and Hydraulic Architecture of the Tractor
        Guidance System Hardware Architecture
        Vehicle
        Autonomous Steering
        Servo Valve
        Guidance Sensor: Camera
        Frame Grabber Board
        Guidance Sensor: Laser Radar
        Encoder
        Computer
        Microcontroller
        Amplifier Circuit
        DGPS Receiver
        Power Supply
        Inertial Measurement Unit
        Speed Sensor
    Experimental Approaches and Theoretical Development
        Vehicle Dynamics, Calibration and Steering Control of Tractor
            Valve calibration
            Encoder calibration
            Vehicle dynamics
            Control system
        Vision and Ladar Independent Steering
            Vision for path with hay bales
            Algorithm flowchart
            Vision for citrus grove
            Laser radar (ladar) for path detection
            Serial communication between PC and microcontroller
            Operating speed of the guidance system
        Sensor Fusion Based Autonomous Steering
            Performance of independent sensor based autonomous steering
            Need for sensor fusion based autonomous steering
            Kalman filter for autonomous guidance
            Reliability factor of primary guidance sensors in the Kalman filter and fuzzy logic sensor supervisor
            Divergence detection and fuzzy logic correction
            Sample simulation result of the sensor fusion method
        Headland Turning and Navigation
            Pure pursuit steering
            Headland turning maneuvers
        Open Field Navigation and Obstacle Detection
            Open field navigation
            Obstacle detection

4 SOFTWARE DEVELOPMENT

    Introduction
    Software Architecture for Tractor Guidance
        PID Based Steering Control
        Vision and Ladar Based Path Segmentation
        Sensor Fusion Based Navigation
    Software Architecture for eGator Guidance
        Pure Pursuit Steering
        Headland Turning
        Open Field Navigation and Obstacle Detection
            Open Field Navigation
            Obstacle Detection

5 DEVELOPMENT OF MACHINE VISION AND LASER RADAR BASED AUTONOMOUS VEHICLE GUIDANCE SYSTEMS FOR CITRUS GROVE NAVIGATION

    Introduction
    Materials and Methods
        Experimental Procedure
    Results and Discussion
    Conclusions

6 SENSOR FUSION USING FUZZY LOGIC ENHANCED KALMAN FILTER FOR AUTONOMOUS VEHICLE GUIDANCE

    Introduction
    Materials and Methods
        Kalman Filter
            State transition model
            Measurement model
            Filter gain
        Reliability Factor of Primary Guidance Sensors in the Kalman Filter and Fuzzy Logic Sensor Supervisor
        Divergence Detection and Fuzzy Logic Correction
        Simulation
        Experimental Procedure
    Results and Discussion
    Conclusions

7 HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR

    Introduction
    Materials and Methods
        Pure Pursuit Steering
        Headland Turning Maneuvers
            Determining the approach of the headland
            Precisely establishing the end of the row
            Drawback of the vision based headland detection
            Precisely establishing end of the row using sweeping ladar
            Vision and ladar based headland determination
            Navigating the headland and entering next row
    Experimental Procedures
        Experiments in the Test Track
        Experiments in the Citrus Grove
    Results and Discussion
    Conclusions

8 OPEN FIELD NAVIGATION AND OBSTACLE DETECTION

    Introduction
    Materials and Methods
        Open Field Navigation
        Obstacle Detection
    Experimental Methods
        GPS Based Navigation
        Obstacle Detection
    Results and Discussion
        GPS Based Navigation
        Obstacle Detection
    Conclusions

9 RESEARCH SYNTHESIS AND OVERVIEW

10 CONCLUSION AND FUTURE WORK

    Conclusions
    Future Work

APPENDIX

A CODE SEGMENT FOR PID BASED STEERING CONTROL

B CODE SEGMENT FOR VISION BASED PATH SEGMENTATION

C CODE SEGMENT FOR LADAR BASED PATH NAVIGATION

D CODE SEGMENT FOR SENSOR FUSION BASED ALGORITHMS

E CODE SEGMENT FOR PURE PURSUIT STEERING

F CODE SEGMENT FOR HEADLAND TURNING

G CODE SEGMENT FOR OPEN FIELD NAVIGATION AND OBSTACLE DETECTION

LIST OF REFERENCES

BIOGRAPHICAL SKETCH










LIST OF TABLES

3-1 Specifications of the electro-hydraulic valve

3-2 Open loop frequency response tests at various speeds and frequencies

3-3 Performance criteria of the controller (simulated results)

3-4 Fuzzy logic rule set for sensor supervisor

3-5 Fuzzy logic rule set for divergence correction

3-6 Calibration of radius of curvature of vehicle with steering angle

5-1 Performance measures of the vehicle's guidance system obtained from the experiments conducted in the straight test path

5-2 Performance measures of the vehicle's guidance system obtained from the experiments conducted in the curved test path at 3.1 m/s

6-1 Fuzzy logic rule set for sensor supervisor

6-2 Fuzzy logic rule set for divergence correction

6-4 Performance measures obtained from the experiments conducted in grove alleyway

7-1 Calibration of radius of curvature of vehicle with steering angle

7-2 Performance measures of the eGator based guidance system for path navigation

7-3 U-turn performance measures

7-4 Switch back turn performance measures

8-1 Error of the vehicle for GPS based navigation experiments











LIST OF FIGURES

1-1 Agricultural Robots

1-2 Typical citrus grove alleyway

2-1 Remote controlled vehicle (Murakami et al., 2004)

2-2 Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002)

2-3 Overhead Guidance (Shin et al., 2002)

2-4 Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002)

2-5 Machine vision guidance example (Benson et al., 2001)

2-6 Machine vision guidance example (Rovira-Mas et al., 2002)

2-7 Ladar mounted on top of the tractor (Yokota et al., 2004)

2-8 UAV using pure pursuit (Enomoto et al., 2007)

2-9 Switch back turning implemented using spline functions (Noguchi et al., 2001)

3-1 Architecture of the tractor guidance system

3-2 John Deere 6410 tractor used for autonomous vehicle guidance development

3-3 Hydrostatic steering mechanism of the tractor

3-4 Electro-hydraulic circuit for autonomous steering

3-5 Rear end of tractor showing the supply to the servo valve

3-6 View under the cab showing the servo valve circuit

3-7 Servo valve showing the main parts (Source: Sauer Danfoss PVG 32 valve documentation)

3-8 Video camera, one of the primary path finding sensors

3-9 Camera mounted on top of the tractor cab

3-10 Frame grabber board

3-11 Laser radar and its mount

3-12 Stegmann HD20 encoder

3-13 Output from the encoder shows two waves in quadrature

3-14 Encoder mounted on the tractor axle

3-15 Instruments mounted in the tractor cabin

3-16 Microcontroller

3-17 Amplifier circuit

3-18 DGPS receiver mounted on top of the tractor cab

3-19 Inverter mounted in the tractor cabin

3-20 Inertial measurement unit used in this research

3-21 Voltage and time required to turn 100 degrees

3-22 Encoder calibration

3-23 Vehicle position plotted in ArcView (0.2 Hz, 10 mph)

3-24 Measuring gain from the position plot

3-25 Open loop frequency response for 4 mph, 7 mph and 10 mph

3-26 Theoretical model for the three speeds

3-27 Simulink model for the vehicle dynamics

3-28 Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz)

3-29 Guidance control block diagram

3-30 Simulink model of the vehicle control system

3-31 Simulated sustained oscillation with a gain of 1.8 for the 10 mph model

3-32 Simulated step response with the initial tuning parameters (10 mph, Kp = 1.08, Kd = 1.25)

3-33 Simulated step response with the final tuned parameters (10 mph)

3-34 Camera and ladar mounted on top of the tractor cabin

3-35 Typical images seen while navigating through hay bales

3-36 Image coordinates

3-37 R, G, B color plots of hay bales (X axis: intensity, range 0-255; Y axis: no. of pixels)

3-38 R, G, B color plots of grass path (X axis: intensity, range 0-255; Y axis: no. of pixels)

3-39 Thresholded image

3-40 Cleaned image

3-41 Boundary isolated image

3-42 Lines fit to the boundaries

3-43 Machine vision algorithm flowchart

3-44 Typical images for navigation in citrus grove

3-45 Picture collection in the grove

3-46 R, G, B intensity plot for ground and trees in grove alleyway

3-47 Images and their green intensity profile across the center of the image

3-48 Thresholding. A) Shadow image

3-49 Classified images without and with shadows

3-50 Machine vision algorithm flowchart

3-51 Calculating the required heading of the vehicle

3-52 Radial distance plotted against angle (180 degrees at 0.5 degree increments)

3-53 Ladar algorithm flowchart

3-54 Serial protocol

3-55 Illustration of Kalman filter operation

3-56 Vehicle in the path with the state vector variables

3-57 Fuzzy logic implementation

3-58 Membership function for linguistic variables

3-59 Crisp output membership function

3-60 Illustration of inference for the example

3-61 Fuzzy logic implementation to correct divergence

3-62 Input membership function

3-63 Output membership function

3-64 Simulation result of fusion of error obtained from vision and ladar algorithms

3-65 Pure Pursuit steering method

3-66 Illustration of experiment for steering angle calibration with radius of curvature

3-67 Line fitting for calibrating radius of curvature of vehicle with steering angle

3-68 Flowchart of Pure Pursuit steering algorithm

3-69 Image obtained from the camera when the vehicle is navigating the alleyway

3-70 Segmented images

3-71 Vision algorithm example for headland detection

3-72 Use of boxes in the segmented images

3-73 Image where the vision based headland detection can fail

3-74 Ladar and motor assembly for sweeping

3-75 Illustration of the ladar sweep

3-77 Swept ladar data in Cartesian coordinates

3-78 3-dimensional ladar data projected on the horizontal plane

3-79 Tree clustering and headland determination

3-80 Illustration of vehicle navigating way points

3-81 Angle ranges of steering and vehicle turn

3-82 Path taken for translation of the vehicle to the way point

3-83 Ladar data when obstacle is not present in front of the vehicle

3-84 Ladar scan data with obstacle present in front of the vehicle

3-85 Illustration of the obstacle detection using ladar

3-86 Flowchart of the obstacle detection algorithm

4-1 Software architecture of the tractor guidance system

4-2 Dialog box for high level controls

4-3 Software architecture of the eGator guidance system

5-1 Electro-hydraulic retrofit of the tractor for automatic guidance

5-2 Guidance system architecture of the vehicle

5-3 Camera and ladar mounted on top of the tractor cab

5-4 Machine vision algorithm flowchart

5-5 Machine vision results for citrus grove alleyway

5-6 Ladar algorithm flowchart

5-7 Radial distance measured by the laser radar in the hay bale path

5-8 Open loop theoretical and actual frequency response of the sinusoidal vehicle dynamics test (phase = -180 deg)

5-9 Simulated vehicle control system block diagram

5-10 Simulated 1 m step response of the vehicle at 3.3 m/s

5-11 Guidance system test path

5-12 Vehicle in the citrus grove alleyway

5-13 Performance of the machine vision guidance in the straight path

5-14 Performance of the laser radar guidance in the straight path

5-15 Performance in the curved path at 3.1 m/s

6-1 Fusion based guidance system architecture

6-2 Vehicle in the path with the state vector variables

6-3 Kalman filter operation

6-4 Fuzzy logic for sensor supervisor

6-5 Fuzzy logic for correcting divergence

6-6 Simulation result of fusion of error obtained from vision and ladar algorithms

6-7 Hay bale track profile

6-8 Citrus grove alleyways

6-9 Field of View (FOV) of camera and ladar

6-10 Path navigation error of the vehicle in the test track

6-11 Path navigation error of the vehicle in the grove alleyway

7-1 Pure Pursuit steering method

7-2 Illustration of experiment for steering angle calibration with radius of curvature

7-3 Calibration curve between radius of curvature of vehicle and the steering angle

7-4 Flowchart of Pure Pursuit steering algorithm

7-5 Images obtained from the camera when the vehicle is navigating the alleyway

7-6 Segmented images

7-7 Vision algorithm example for headland detection

7-8 Use of boxes in the segmented images

7-9 Sample image where the vision based headland detection can fail

7-10 Ladar and motor assembly for sweeping

7-11 Illustration of the ladar sweep

7-12 Coordinate systems

7-13 Swept ladar data in Cartesian coordinates

7-14 3-dimensional ladar data projected on the horizontal plane

7-15 Tree clustering and headland determination

7-16 Vehicle in the test track

7-18 Alleyway of the citrus grove where experiments were conducted

7-19 Headland turning maneuvers in the grove

7-20 Headland turning maneuver performance measures

8-1 Illustration of vehicle navigating way points

8-2 Angle ranges of steering and vehicle turn

8-3 Path taken for translation of the vehicle to the way point

8-4 Ladar data when obstacle is not present in front of the vehicle

8-5 Ladar scan data with obstacle present in front of the vehicle

8-6 Illustration of the obstacle detection using ladar

8-7 Flowchart of the obstacle detection algorithm

8-8 Waypoint location and vehicle's path for the GPS based navigation









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

AUTONOMOUS VEHICLE GUIDANCE
FOR CITRUS GROVE NAVIGATION

By

Vijay Subramanian

August 2008

Chair: Thomas F. Burks
Major: Agricultural and Biological Engineering

Automation and robotics is becoming an important part of agricultural operations in the

21st century. Exploration is currently under way to use robotics in citrus groves. An important

part of robotics in citrus groves is the use of autonomous vehicles. The development of guidance

systems for an autonomous vehicle operating in citrus groves is presented in this dissertation.

The guidance system development is based on two different vehicles, a commercial

Tractor and a golf cart type vehicle, eGator. Initial developments were based on the Tractor and

the later developments were based on the eGator. The guidance system development process

consisted of choosing sensors for implementing the guidance system, the mounting of the sensors

and developing hardware and software interface for the sensors with the computer, the

development of algorithms for processing the information from the sensors to determine the path

for the vehicle, the development of steering control system to steer the vehicle in the desired

path; the development of the software for interfacing the sensors and implementing the

algorithms, and conducting experiments to validate the operation of the autonomous vehicle in

citrus groves.

The primary sensors used for determining the path for the vehicle were a video camera and

laser radar. The important methods discussed in this dissertation are the determination of the









vehicle dynamics of the tractor; the development of a PID based steering control system for

steering the tractor; machine vision and laser radar based path segmentation

algorithms for navigating the vehicle through grove alleyways, sensor fusion based path

determination by fusing the information from machine vision and laser radar, Pure Pursuit

method of steering the eGator, algorithms for turning the vehicle at the headlands of the citrus

grove using sweeping ladar, GPS based navigation of open fields between grove blocks and laser

radar based obstacle detection.

The ability of the autonomous vehicle to navigate the paths was experimentally verified in

custom designed test tracks and in citrus groves. The development of the autonomous vehicle

guidance system showed machine vision and ladar based systems as promising and accurate

methods for navigating citrus groves. The autonomous vehicle possesses the ability to navigate

the citrus grove alleyways in the middle of the path, turn around at the headlands and navigate

subsequent alleyways, navigate open fields between grove blocks, and detect and stop in the

presence of obstacles.









CHAPTER 1
INTRODUCTION

This dissertation describes the development of an autonomous vehicle guidance system and methods for navigating citrus groves. The need for automation in citrus groves is first introduced. The objectives of this work in addressing that need are then specified. An overview of the research done by other researchers in similar areas is given. The details of the autonomous vehicle guidance system and algorithm development are discussed. Finally, the results of the research are presented and the areas of further improvement are indicated.

Florida Citrus

Citrus is a genus of flowering plants. For the common man, it includes orange, tangerine, grapefruit, lime, lemon and their many varieties. Florida, where this work was done, supplies more than 80% of the citrus in the USA, and citrus is today a $9 billion industry (Hodges et al., 2001). There are also other major citrus producers in the world, such as Brazil, Mexico, Israel and Italy. For all these producers, there are large numbers of people in the world who consume citrus; the market for citrus, therefore, is huge. Citrus area per farm in Florida was around 0.5 sq. km in 1997 (Hodges et al., 2001), and citrus is currently the state's largest agricultural product. Florida is also the world's leading producer of grapefruit and is the second largest producer of oranges in the world, following Brazil. Ninety-five percent of all Florida oranges are squeezed into orange juice (Brown, 2002). The total economic impact of the citrus industry on the state of Florida exceeds $8 billion annually.

Automation and Robotics in Citrus Groves

Over the years, large scale farms have been replacing smaller ones. More people are moving to other occupations, so one farm owner replaces several owners. This leads to larger fields to plant and harvest. Laborers often work for several weeks on a farm, so facilities have to be provided for them to live near the orchard. This requires a good investment on the part of the farm owner. In Florida, non-U.S. citizens are often employed for harvesting citrus, many of whom face immigration problems. Therefore labor is scarce and decreasing. If the trend continues, fewer people will be available to operate in the field. With large farms replacing several smaller ones, the work that is required of the operator is also increased in scale. Longer durations of operation are necessary to cover huge orchards, and with longer operation the operator's workload grows and consistency in precision is sacrificed. The amount of work per person is also expected to increase. It is also known that long durations of operation can be detrimental to the laborer's health. For applications such as spraying, there is a risk of the operator being exposed to the sprayed chemicals, which would affect the operator's health. Therefore there is a great need for automation of agricultural practices.

A general definition of the term automation, according to the Oxford English Dictionary, is "the use of electronic or mechanical devices to replace human labor". For those wary of automation, it must be quickly pointed out that automation is not intended to render people unemployed. Automation mostly does work that is not enjoyable, is dangerous, or is difficult for humans; for example, the automatic processing and packaging machines which operate in large factories. There are as many varying definitions of robotics as there are researchers in the field of robotics. A reasonable definition would be an automated system possessing enough intelligence to change its behavior in accordance with some level of change in its environment. The level of change is generally not defined. Examples range from the well known walking humanoids to the Mars rovers. Today robotics technology provides intelligent systems that have the potential to be used in many agricultural applications. Robotic systems are precise and can be used repeatedly.

The state of the art technology can provide machines with precision control and intelligence for automatic agriculture using autonomous off-road vehicles. The use of robots is more common today than ever before, and robots are no longer used exclusively by the heavy production industries. The agricultural industry is behind other industries in using robots because the sorts of jobs involved in agriculture are not straightforward, and many repetitive tasks are not exactly the same every time. In most cases, many factors have to be considered (e.g., the size and color of the fruit to be picked) before the commencement of a task. Robotics is usually associated with the manufacturing industry. However, certain similarities between the manufacturing industry and agriculture make robotics a good option for agriculture. These factors include decreasing labor, foreign competition, rising costs, a hazardous environment for the laborer and the need to invest in technology to improve productivity. On the other hand, there are quite a few differences that make robotics difficult to use in agriculture. These factors include the variability in the environment and seasonal and changing crops. Robotics is currently being tested in several areas of agriculture, for example, picking fruits in orchards, spraying crops in greenhouses, animal milking and shearing, planting and harvesting crops, food packaging, disease detection, mapping, etc. Fruit picking robots and sheep shearing robots are designed to replace difficult human labor.










Figure 1-1. Agricultural Robots. A) Fruit picking robot (University of Florida). B) Sheep
shearing robot (University of Western Australia).

In the 2001-2002 season, about 6000 ha of Florida's citrus groves were mechanically harvested (Brown, 2002). The operations performed in a citrus grove include preparing the soil, planting the trees, irrigation, spraying fertilizers and pesticides, monitoring the grove and trees, and harvesting the fruits.

Autonomous Vehicles for Citrus Groves

A major achievement in agricultural robotics would be a vehicle capable of moving through the entire field on its own. The human operator would only be required to supervise the autonomous operation. Since these vehicles can work continuously, they are very efficient. These vehicles could open a new market in agriculture. A project on an autonomous vehicle not only benefits agriculture, but also other areas, such as military operations, rescue operations and use in hazardous environments. Since these vehicles are expected to work in challenging environments independent of the operator, they have to rely on their sensors and navigation system. Sensor electronics and computing facilities must be integrated with the mechanical and hydraulic parts. The present technology in sensors and navigational equipment is advanced enough to realize a vehicle capable of autonomous operation. The economic viability has improved significantly over the last decade.

Fully autonomous technologies are often seen as being too expensive to justify their use compared to conventional machine systems. This suggests that automation technology should be integrated into conventional machines. The increasing availability of low cost sensors is encouraging. Over the course of time, fully automated technology can be expected to slowly replace existing semi-automatic equipment. Tractors are the workhorses of the modern farm. By automating these machines, productivity can be increased, safety can be improved, and costs could be reduced for many agricultural operations. An article on the Institute of Food and Agricultural Sciences (IFAS) extension website at the University of Florida reports that mechanical harvesting is the future for the citrus industry (Rouse and Futch, 2005). In citrus groves, trees are planted in rows with alleys between the rows. The usual alley width is about 2.1 to 2.4 m, and tree heights vary from 4.5 m to 6 m depending on their age (Brown, 2002).











Figure 1-2. Typical citrus grove alleyway

This research is aimed at developing an autonomous vehicle capable of maneuvering through the alleyways of citrus groves, turning around after reaching the headlands, navigating other alleyways, and finally moving from one grove block to another. Such a vehicle is expected to be used for robotic applications in citrus groves, such as acting as a guide vehicle for robotic harvesting carrying several robotic arms, a scout vehicle to look for diseased trees, a guide vehicle for spraying nutrients and pesticides, a guide vehicle for mowing the grass in the alleyways, and a carrier of sensors for operations such as disease detection. The research aims at adding sensors to an existing commercially used vehicle and modifying the vehicle's steering to be operated autonomously. The cost of integrating the technology into existing machines is expected to make it more attractive for a grower to adopt the technology. There are indications that operating at night may also provide benefits; for example, spraying and harvesting operations could be completed at night. However, development of a system operable at night requires research beyond the scope of this dissertation. In the literature, there has been much research on crop harvesting vehicles, but citrus harvesting is a relatively unexplored area in vehicle guidance. To date, there has been minimal success in developing commercial autonomous navigation systems for citrus grove applications.










Objectives

The primary objective of this work was to develop an autonomous vehicle guidance system for navigating citrus groves. The following sub-objectives were also identified:

* Develop a machine vision based path segmentation method for navigating the grove alleyways

* Develop a laser radar (ladar) based path segmentation method for navigating the grove alleyways

* Retrofit the steering system of the vehicle for computer control

* Interface the sensors and computers with a commercial vehicle for making the vehicle autonomous

* Develop a steering control algorithm for steering the vehicle and develop a mathematical model of the vehicle if required

* Develop a sensor fusion method to fuse the information from different sensors to improve reliability in navigation

* Develop a headland detection algorithm and implement headland turning maneuvers to navigate the grove headlands after navigating each row

* Develop obstacle detection and halting ability using laser radar and a DGPS based open field navigation method to move between citrus grove blocks









CHAPTER 2
LITERATURE REVIEW

Agricultural Autonomous Navigation Applications

Vehicles have been used in agriculture for many decades. They have been used for

applications such as plowing, planting, harvesting and spraying. These operations require the

operator to spend long and tedious hours in the field. Modern technology has facilitated the

automation of several agricultural operations thereby reducing operator fatigue and improving

productivity. An important part of this automation process includes the use of autonomous

vehicles.

Research has been conducted on developing autonomous vehicles for agriculture for many years; some of these efforts are mentioned below. Many of them were conducted in research laboratories and have not yet fully evolved into commercial products, for reasons ranging from economics to insufficient computer processing power. However, the advancements in computer processing power and world economics pertaining to agriculture in recent years have given new hope to bringing autonomous vehicles from research laboratories to commercial agricultural operations. Most of the research into developing autonomous vehicles in agriculture is concentrated on vehicles for crop operations; efforts to develop autonomous vehicles for orchards and citrus groves are few.

Guidance Vehicle

The autonomous vehicle can either take the form of a guidance system added on to a commercial agricultural vehicle or a vehicle developed exclusively to aid in autonomous or robotic operations. Blackmore et al. (2002) listed the following requirements of an autonomous vehicle for agricultural applications:

* Small in size
* Light weight










* Exhibit long-term sensible behaviors
* Capable of receiving instructions and communicating information
* Capable of coordinating with other machines
* Capable of working collaboratively with other machines
* Behave in a safe manner, even when partial system failures occur
* Carry out a range of useful tasks

These requirements are also applicable to vehicles navigating citrus groves. While current

research has not progressed to the level of meeting all these criteria, there has been significant

progress. In addition to these criteria, the following requirements may also be expected:

* Low cost
* Attractive for a farmer to adopt
* Off-road vehicle performance and reliability

An autonomous vehicle is expected to possess the following basic components to be

operational: vehicle platform, steering system, guidance system, sensors and a control system.

Guidance by Remote Control

Remote control or tele-operation has already been used in several branches of robotics. Visual information about the operating environment is relayed to a remote location where the operator controls the vehicle. Murakami et al. (2004) used a CCD camera, a gyroscope and a GPS receiver as the sensors, and the wireless IP protocol for communication, for tele-operation of an HST drive agricultural vehicle. A joystick was used for tele-operation.










Figure 2-1. Remote controlled vehicle (Murakami et al., 2004). A) Teleoperated Vehicle. B)
User interface.









The major advantage of using a tele-operated vehicle would be the comfortable environment of operation and safety. Another minor advantage compared to direct guidance is that tele-operation is relatively simple and cheap. A major challenge in using tele-operated vehicles is the time delay in communication. For vehicle guidance, no significant advantages are expected: there is neither much reduction in labor cost nor a significant reduction in the amount of work that a driver is required to do. There is also limited possibility of automation.

Direct Guidance

In this method, the vehicle possesses the ability to control its path. This has the potential for automation and is also economically justifiable on large fields. But the technology is complex. The lack of awareness about the technology, and safety concerns as compared to remote control, are causes of apprehension for a farmer. However, present day technology and demonstrations of safe control of the guidance could overcome these limitations.

Guidance Using Arm

Guidance by mechanical contact has been widely used by earlier researchers in agriculture. This method is simple and straightforward. Often, a mechanical arm senses the crop boundary and the vehicle is moved at a distance from the crop. However, if a bare region is encountered in the contact environment, control is lost. Multiple mechanical contacts have been used to overcome this limitation with moderate success. A major concern in using this method is that there is great potential for the contact element to damage the crop.

Yekutieli and Pegna (2002) used a sensor arm to guide a crawler tractor through a vineyard. The arm sensed the rows and determined the position of the tractor. The arm did not cause any damage to the vines. This method was suitable for a controlled environment like a vineyard where the rows were exactly parallel to each other.



























Figure 2-2. Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002)

Overhead Guidance

In this method, an overhead guide or a track guides the vehicle using a mechanical link.

Shin et al. (2002) used overhead rails to direct a vehicle through an orchard. The vehicle was

track type and was used for spraying applications.

The vehicle's performance was acceptable in straight and curved paths, but the vehicle over-steered at higher speeds. This system was only feasible for low speeds of about 0.4 m/s. Also, the installation cost of overhead rails is very high. For a grower adopting a new technology, the high cost of constructing large structures is very discouraging. Hence such a system is not suitable for this project.




Figure 2-3. Overhead Guidance (Shin et al., 2002)









Guidance on Tracks and Leader Cables

This type of guidance is widely used in industrial transportation, and it was used in very early research on agricultural vehicle guidance. In a leader cable system, the cables carry an AC signal, and coils on the vehicle base detect magnetic fields when the vehicle moves over the cable. The system works very well in orchards. The drawback is the permanent installation of large structures. This also increases the initial cost, and these structures require maintenance over time. There is also little flexibility in the crop grown. Fertilizer application and irrigation require care so that they do not damage the tracks. The drawbacks of the mechanical contact methods and the advancement of sensor technology have drawn researchers to non-contact methods.

Distance Metering and Triangulation Based Guidance

In this method, the sensors operate based on signals received from sources installed in the environment for guidance purposes. One popular approach places markers throughout the field that a range detector can identify. One method employs a photoelectric sensor to detect light from reflectors and measures the angles between reflectors; the position of the vehicle equipped with the photoelectric sensor is then calculated by triangulation. The reflectors are placed in the environment at regular intervals for detection. Another method involves using a laser beam which the vehicle follows: the beam hits a sensor mounted on the vehicle and has to be constantly moved to guide the vehicle.
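As a concrete illustration of the triangulation geometry, the short Python sketch below (with hypothetical reflector positions and bearing values) recovers a 2D vehicle position as the intersection of the bearing lines to two reflectors at known locations; a fielded system would use more reflectors and a least squares fit.

import numpy as np

def triangulate(p1, p2, theta1, theta2):
    """Sketch: estimate the 2D vehicle position from absolute bearings
    theta1 and theta2 (radians, measured in the field frame) to reflectors
    at known positions p1 and p2. The fix is the intersection of the two
    bearing lines."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d1 = np.array([np.cos(theta1), np.sin(theta1)])  # vehicle -> reflector 1
    d2 = np.array([np.cos(theta2), np.sin(theta2)])  # vehicle -> reflector 2
    # The vehicle position x satisfies x + t1*d1 = p1 and x + t2*d2 = p2.
    # Eliminating x gives t1*d1 - t2*d2 = p1 - p2, a 2x2 linear system.
    t1, _ = np.linalg.solve(np.column_stack((d1, -d2)), p1 - p2)
    return p1 - t1 * d1

# Example: reflectors at (0, 0) and (10, 0); a vehicle at (5, 5) would
# measure bearings of 225 and 315 degrees to them.
print(triangulate((0, 0), (10, 0), np.radians(225), np.radians(315)))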

The above methods all require the expensive installation of structures in the field. They also assume that the crops are grown in some order and that they do not block the sensor's field of view. Rapid development of sensor technology makes it possible to guide vehicles using information from state of the art sensors that require no modification of the environment. These are the popular guidance techniques in the first decade of the 21st century.









Guidance by Dead Reckoning

Dead reckoning is the process of estimating the position of a vehicle by advancing from a known position using course, speed, time and distance traveled. This technology is often used in military and marine navigation. The sensors most often used are wheel encoders, which measure the rotation of the wheels; knowledge of the circumference of the wheel then gives the distance traveled. Rotary encoders have been used to detect the wheel position for position location (Kodagoda et al.; Nagasaka et al., 2002). Dead reckoning, a widely used method for positioning unmanned vehicles, has not been a very viable option for agricultural applications due to uneven soil conditions, which facilitate slipping; in wet weather there is often considerable slip in the tractor wheels. Round-off errors also accumulate, causing significantly incorrect distance measurements. These two factors introduce errors in the calculated distance. Hence, using an encoder as an odometer for finding distance has not been reliable.
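A minimal dead-reckoning update is sketched below in Python, under assumed values for the encoder resolution and wheel circumference; the no-slip assumption stated in the comments is precisely what fails on wet grove soil.

import math

TICKS_PER_REV = 360         # hypothetical encoder counts per wheel revolution
WHEEL_CIRCUMFERENCE = 2.0   # hypothetical wheel circumference in meters

def dead_reckon(x, y, heading, encoder_ticks):
    """Advance a known pose (x, y in meters, heading in radians) by the
    distance implied by the encoder counts, assuming the wheel never slips."""
    distance = (encoder_ticks / TICKS_PER_REV) * WHEEL_CIRCUMFERENCE
    return (x + distance * math.cos(heading),
            y + distance * math.sin(heading),
            heading)

# Example: starting at the origin heading east, 540 ticks imply 3 m traveled.
print(dead_reckon(0.0, 0.0, 0.0, 540))   # -> (3.0, 0.0, 0.0)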

Guidance Using Ultrasonic Sensors

Ultrasonic sensors send out a high frequency sound wave; the reflected wave is received at the transducer and, based on the time of flight, the distance is calculated. Ultrasonic sensors have been very popular for sensing distance and are able to measure tree canopies while the vehicle travels at a speed of 1.8 m/s, with an accuracy of 1 to 3 cm (Iida and Burks, 2002). The accuracy was found to be higher at lower speeds.
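The time-of-flight relationship itself is simple; the Python sketch below assumes sound travels at roughly 343 m/s in air and halves the round-trip time to obtain the one-way distance.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C; varies with temperature

def ultrasonic_distance(time_of_flight_s):
    """One-way distance from echo time: the pulse travels to the target
    and back, so the range is half the round-trip path."""
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

# Example: an echo received 11.66 ms after transmission is about 2 m away.
print(round(ultrasonic_distance(0.01166), 2))   # -> 2.0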

These sensors require the target to be perpendicular to the sensor for the waves to be reflected back properly (Burks et al., 2004). However, the grove environment is unstructured and such surfaces cannot be expected, so their use is limited.





















Figure 2-4. Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002)

Wireless/Sensor Networks

Sensor networks are a relatively new development for vehicle positioning. A set of sensors is located at several places along the path, and they communicate with the vehicle's computer to decide the path to be traveled. This requires permanent positioning of sensors at regular intervals in the field. It is similar to using a beacon or landmark, but is not advisable in an orchard, since the foliage may block the field of view. Wireless LAN has been tried as a method of communication between the vehicle and a base station to get feedback on position accuracy (Nagasaka et al., 2002). Microwave systems can perform the same job effectively (Matsuo et al., 2002), but they require frequent calibration. Wireless LAN could be used more effectively for real time path planning; an operator sitting in the office could send information about the next alleyways to be traversed.

Guidance Using Inertial Sensors

An inertial measurement unit, or IMU, usually consists of six inertial sensors: three linear accelerometers and three rate gyros. These sensors combine to give the vehicle's pitch, roll and yaw. Often a magnetic compass is also included in the system to locate the direction. The geomagnetic direction sensor (GDS) was the direction tracker before GPS was available. GDS has given errors due to the high magnetic field present around the vehicle (Mizushima et al., 2002), so proper shielding is necessary. Mizushima et al. (2002) used an adaptive line enhancer (ALE), an adaptive filter method, to eliminate noise in GDS.

Gyros have been widely used for inclination measurements (Mizushima et al., 2002). The fiber optic gyro (FOG) has been reported to give the best performance among the different types of gyros for positioning (Nagasaka et al., 2002). IMUs are widely used for vehicle guidance research at present. With the combination of RTK GPS and FOG, accuracies of up to 5 cm have been achieved (Noguchi et al., 2002). Accelerometers and inclinometers (Matsuo et al., 2002) have also been tried with fairly positive results. At present, gyros and inclinometers are available together as inertial measurement units (IMUs) for pitch, roll, yaw and linear velocity measurements. Inertial measurements are very useful for navigating the vehicle on uneven ground, as encountered in groves.

GPS Based Guidance Systems

Guidance using GPS has been the preferred method of many researchers in agriculture for guiding autonomous vehicles on crop rows, and some of the commercial developments have also concentrated on this method. Global Positioning System (GPS) receivers locate four or more of the GPS satellites and calculate the distance to each, using this information to deduce their own location. GPS, in combination with inertial navigation systems, has been a promising positioning approach. Both Real Time Kinematic (RTK) GPS and Differential GPS (DGPS) have been satisfactory, allowing real time measurement. However, there is a tradeoff between accuracy and cost in the selection of DGPS and RTK GPS receivers, with the latter being more accurate but also more expensive. Nagasaka et al. (2002), Benson et al. (2001) and Noguchi et al. (2002) found that RTK GPS receivers give very accurate results. Accuracies of up to 7 cm with DGPS (Iida and Burks, 2002) and up to 2 cm with RTK GPS receivers have been reported. More accuracy has been observed when the vehicle is stationary or at slower speeds than at higher speeds. Stombaugh et al. (1999) found that the location of a GPS receiver used as a position sensor is critical for calculating the phase in frequency response tests of the control system. They found that mounting a GPS receiver above the front wheels gives more accurate navigation, whereas most research results report mounting a GPS receiver on top of the cab.
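To make the range-based positioning principle concrete, the Python sketch below solves a toy 2D version of the problem: given measured distances to beacons at known positions, it linearizes the range equations and solves for position by least squares. Real GPS solves the analogous 3D system and must also estimate the receiver clock bias, so this illustrates only the geometry.

import numpy as np

def multilaterate(anchors, ranges):
    """Recover a 2D position from distances to points at known locations.
    Subtracting the first range equation from the others cancels the
    quadratic term |x|^2 and leaves a linear system in x."""
    a = np.asarray(anchors, float)
    r = np.asarray(ranges, float)
    A = 2.0 * (a[1:] - a[0])
    b = (r[0]**2 - r[1:]**2) + np.sum(a[1:]**2, axis=1) - np.sum(a[0]**2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: three beacons and the exact distances measured from (3, 4).
anchors = [(0, 0), (10, 0), (0, 10)]
truth = np.array([3.0, 4.0])
ranges = [float(np.linalg.norm(truth - np.array(p))) for p in anchors]
print(multilaterate(anchors, ranges))   # -> approximately [3. 4.]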

GPS based guidance alone cannot be used for positioning in citrus applications, as it gives errors when the vehicle moves under tree canopies, which block the satellite signals to the receiver. Moreover, a system using GPS for guidance requires that a predetermined path be given to the computer for it to follow, which requires initial mapping. Hence, when a GPS receiver is the sole sensor for positioning, it cannot be used instantly in any grove; significant time has to be spent mapping its path.

Machine Vision for Autonomous Navigation

Machine vision is the ability of a computer to "see." A machine-vision system employs one

or more video cameras for obtaining images for the computer to interpret. Machine vision is one

of the popular sensing technologies for robotics applications today. Machine vision technology

can be used to automatically guide a vehicle when the path to be traversed is visually

distinguishable from the background. Vision guidance has the advantage of using local features

to adjust the vehicle navigation course.

The vision system's performance in guiding the vehicle is comparable to highly accurate

laser radar (Burks et al., 2004). Typical applications in vehicle guidance include guiding tractors

for crop cultivation and guiding an autonomous vehicle in a greenhouse. This sensing system has

been found to work well for fully grown crops, but it has not performed well for crop harvesting

with low or sparse crops.





















Figure 2-5. Machine vision guidance example (Benson et al., 2001). A) Camera mounted on top
of the combine cab. B) Crop image. C) Processed image.

The vision system's reliability is reduced by low lighting and the presence of dust and fog. Benson et al. (2001) overcame this problem by using artificial lighting, but they experienced a vehicle shadow problem. They achieved speeds of up to 1.3 m/s with an accuracy of 0.6 cm. Positioning of the camera is also an important factor in the performance of the vision system. Mounting the camera on the top front of the tractor cab, pointing downwards, is an optimal position for many agricultural applications. NIR filters can be used to process images from a CCD camera (Nishwaki et al., 2002); this sometimes gives better information depending on the application.

Vision involves many complicated algorithms for image processing and recognition. It therefore increases the amount of computation required, resulting in dedicated processors being used. However, as faster computers are developed, this concern will no longer prevent vision from being used for sensing.






























Figure 2-6. Machine vision guidance example (Rovira-Mas et al., 2002). A) Camera mounted on
the front of the tractor for detecting crop rows. B) Thresholded image. C) After
Hough transform.

With computer vision, there is always the need for some physical feature or color

difference for the vision system to be able to sense effectively. A number of image processing

techniques have been investigated to find the guidance course for different applications. Han et

al. (2002) developed a row segmentation algorithm based on k-means clustering to segment crop

rows. This information was used to guide the vehicle. Stereo vision has been used for navigating

through an orchard (Takahashi et al., 2002). This method used two cameras to obtain the stereo

information and required considerable time for processing the images. Vision algorithms based

on the Hough transform and blob analysis have been used for row crop guidance (Rovira-Mas et al.,

2002) with positive results. Gerrish et al. (1997) used the ratio of the individual primary colors to

the sum of the primary colors to segment images and eliminate shadow problems with success.

Often hue, saturation and intensity values are used for thresholding (Morimoto et al., 2002),

which has worked well in several applications. Computer vision has been used in several









research studies. In practice, many problems exist in the real environment, in which light conditions

change often and contrasts shift, as is often the case in agriculture.

Laser Radar (Ladar) for Autonomous Navigation

The operating principle of non-contact laser radar is based on time-of-flight measurement.

The ladar calculates the distance to the object using the time of flight of pulsed light, i.e. the

length of time between sending and receiving the beam of light. An extremely short pulse of

light (infrared laser beam) is transmitted towards an object. Part of the light is then reflected back

to the unit, a fraction of a second later. A rotating mirror deflects the pulsed light beam to many

points in a semi-circle. The precise direction is given by an angular sensor on the mirror in the

ladar. A large number of coordinates measured in this manner are put together to form a model

of the surrounding area's contours. A planar scanning ladar gives information in only one plane; this

limitation can be overcome by rotating the laser source to obtain a three-dimensional view. Laser range sensors are

very accurate in measuring distance, with accuracy as high as 1 cm over a distance of 80 m. This

makes them suitable for guidance applications. Laser radar has been used for ranging and

obstacle avoidance. It has higher resolution than ultrasonic sensors, so it is more accurate and

requires less computation than vision. However its performance degrades with dust and rain, like

vision, and it is more expensive than an ultrasonic sensor.
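As a minimal illustration of this time-of-flight principle (a sketch for explanation only, not software from this project; commercial ladar units perform these calculations internally), the range and planar coordinates can be computed as follows:

    # Sketch of time-of-flight ranging (illustrative only).
    import math

    C = 299_792_458.0  # speed of light (m/s)

    def range_from_time_of_flight(round_trip_s):
        """Distance to the object from the pulse's round-trip time."""
        return C * round_trip_s / 2.0  # divide by 2: out and back

    def point_from_sweep(distance_m, mirror_angle_deg):
        """One (distance, mirror angle) reading of a planar sweep,
        converted to x-y coordinates in the scanning plane."""
        a = math.radians(mirror_angle_deg)
        return (distance_m * math.cos(a), distance_m * math.sin(a))

    # A pulse returning after about 533 ns corresponds to about 80 m.
    print(round(range_from_time_of_flight(533.6e-9), 1))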

Carmer and Peterson (1996) discussed the use of laser radar (ladar) for various

applications in robotics. Autonomous vehicle navigation was discussed as a promising

application for ladar, due to its ability to accurately measure position. Gordon and Holmes

(1998) developed a custom-built ladar system to continuously monitor a moving vehicle's

position. The system's performance was not very reliable due to underdeveloped technology at

that time. Current technologies have advanced to the point that ladar systems are readily

available, although they are relatively expensive compared to other alternatives. Ahamed et al.










(2004) used a ladar for developing a positioning method using reflectors for infield road

navigation. They tested differently shaped reflectors to determine the accuracy in positioning.

Ladar can serve the dual purpose of mapping the tree canopies simultaneously along with

navigation. This could be used for grove mapping applications. One such application is for a

harvesting robot arm.











Figure 2-7. Ladar Mounted on top of the tractor (Yokota et al., 2004)

Yokota et al. (2004) had a ladar mounted on top of a tractor and generated local maps of

the surroundings and integrated it with GPS data. Ladar has been used for navigating a small

vehicle through an orchard (Tsubota et al., 2004). The guidance system using the ladar was found to

be more stable than the one using GPS.

Intelligence for Autonomous Navigation

Artificial Intelligence (AI) could be useful in solving complex tasks in which the solution

is ambiguous or large computations are necessary. A popular use of AI in robotics is the heuristic

search, often used in path planning. Heuristics or empirical rules could be used to shorten the

search, as the number of solutions could be very large. To reach an optimal level of autonomous

operation, the robot should know how to deal with uncertainties in the environment. Robots

should be capable of changing their behaviors independently. Artificial intelligence can

immensely aid in this endeavor.









A knowledge-based system could be used in assimilating mission information with internal

models to supervise navigation (Speigle et al., 1995). A multiple-layer path planning approach combined

with a guidance system using artificial intelligence is presented in Ortiz (1993). A combination

of global and local navigation schemes is discussed. Voronoi diagrams have been used in this

research for navigating different terrain features. Global navigation was used to establish a path

from the source to a destination based on fixed objects in the path and other moving vehicles.

Local navigation is also used to avoid immediate obstacles, without taking into account the

location of the destination. This approach is similar to the one used by a human driver, and the

guidance is expected to perform well.

Vehicle Dynamics Modeling and Steering Control

Steering control is a major factor for accurate guidance. The controller gets the error

information from the sensors and computes or decides the signal to be given to the steering

actuator, thereby steering the vehicle to get back to the correct position in the path.

Vehicle Modeling

Adequate modeling of the vehicle is often required for efficient control. A model of the

vehicle dynamics or steering gives the option of conducting simulations of the vehicle while

developing or tuning the steering controllers. Sophisticated models covering more details of the

vehicle's behavior can enable software simulations of the vehicle to be developed for studying

the vehicle's behaviors for different terrain changes. Models can help understand the stability of

the control system being developed. The most common model for describing vehicles is the

bicycle model. It is given by the equation










$$\frac{Y(s)}{\delta(s)} = \frac{C_{sf}V_y^2(mL_1d_s + I_{zz})s^2 + C_{sf}C_{sr}LV_y(d_s + L_2)s + C_{sf}C_{sr}LV_y^2}{s^2\left[I_{zz}mV_y^2s^2 + V_y\left(I_{zz}(C_{sf}+C_{sr}) + m(C_{sf}L_1^2 + C_{sr}L_2^2)\right)s + mV_y^2(C_{sr}L_2 - C_{sf}L_1) + C_{sf}C_{sr}L^2\right]} \quad \text{(Eq. 2-1)}$$

where

Csf = Cornering stiffness of front wheels

Csr = Cornering stiffness of rear wheels

ds = Distance between position sensor and center of gravity

Izz = Yaw moment of inertia

L = Wheel base

L1 = Distance from center of gravity to front axle, along vehicle axis

L2 = Distance from center of gravity to rear axle

m = Vehicle mass

Vy = Vehicle forward velocity

Y = Lateral position of vehicle

δ = Steering angle.

This is a single-input single-output (SISO) model with steering angle input and lateral

position output. This model takes into account the tire slip effects. The assumptions made are

that the forward speed is constant and that lateral forces are proportional to side slip angle.
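As an illustration of how Eq. 2-1 can be evaluated, the following sketch computes the magnitude of the frequency response for assumed, purely illustrative parameter values; it is not based on the identified parameters of any vehicle in this work:

    # Sketch: magnitude response of the bicycle model in Eq. 2-1.
    # All parameter values are illustrative placeholders.
    import numpy as np

    Csf = Csr = 60e3         # cornering stiffness, front/rear (N/rad)
    m, Izz = 4500.0, 6000.0  # mass (kg) and yaw moment of inertia (kg m^2)
    L1, L2 = 1.2, 1.3        # CG to front/rear axle (m)
    L, ds = L1 + L2, 1.0     # wheel base and sensor offset from CG (m)
    Vy = 3.0                 # forward velocity (m/s)

    # Polynomial coefficients in s, highest power first (Eq. 2-1).
    num = [Csf * Vy**2 * (m * L1 * ds + Izz),
           Csf * Csr * L * Vy * (ds + L2),
           Csf * Csr * L * Vy**2]
    den = [Izz * m * Vy**2,
           Vy * (Izz * (Csf + Csr) + m * (Csf * L1**2 + Csr * L2**2)),
           m * Vy**2 * (Csr * L2 - Csf * L1) + Csf * Csr * L**2,
           0.0, 0.0]  # trailing zeros: the s^2 factor (two poles at origin)

    def gain_db(f_hz):
        s = 1j * 2.0 * np.pi * f_hz
        return 20.0 * np.log10(abs(np.polyval(num, s) / np.polyval(den, s)))

    for f in (0.07, 0.1, 0.2, 0.3):
        print(f, round(gain_db(f), 1))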

Alleyne et al. (1997) found that the bicycle model is easier to implement because it is

linear. However, the model fails at higher speeds due to higher lateral

accelerations. Stombaugh et al. (1999) showed that a classical model-based controller could

guide a two-wheel-drive tractor to within 16 cm of the desired path at high speeds, and that









a double-integrator transfer function can adequately model the lateral deviation of the vehicle in

response to the steering angle. They also found the dead band of the steering valve to be crucial for control

system design. Zhang et al. (2004) used the bicycle model to model the dynamics of the tractor.

Kise et al. (2002) modified the bicycle model to account for nonlinearities, including the slip and

cornering forces thereby obtaining a non-linear equation. The vehicle was guided using an

optimal controller. Qiu et al. (1999) have modeled an electro-hydraulic steering system using

single-rod double-acting steering cylinders and other components for a Case IH Magnum tractor.

The model was developed for a wheel type agricultural vehicle. Computer simulations and field

tests validated the model. To sidestep the complexity of modeling the vehicle and then designing

the control system, some researchers have used non-modeled techniques, for example, neural

networks and non-modeled PID control systems.

Neural networks have also been used for modeling the vehicle dynamics (Nijhuis et al.,

1992). A neural network has the advantage of not requiring complex model equations, and it

also handles the non-linearities. However, designing traditional control systems is not

straightforward for such a model.

Steering Control

A steering controller is used for controlling the vehicle steering. The input to the

controller can be of various forms such as the lateral error of the vehicle in the path, the position

of the vehicle at some future instance, the required steering angle or the curvature the vehicle has

to follow. The steering controller can take the input and turn the steering to achieve the desired

result while continuously monitoring the steering changes. Several types of steering controllers

have been used in the literature by various researchers.









PLC control

PLC (Programmable Logic Controller) is a highly reliable special-purpose computer

widely used in industrial monitoring and control applications. PLCs typically have proprietary

programming and networking protocols, and special-purpose digital and analog I/O ports. A PLC

uses programmed logic instructions to control banks of inputs and outputs which interface timed

switch actuation to external electro-mechanical devices. PLCs are very fast at processing discrete

signals (like a switch condition). A PLC has been used for steering control (Nagasaka et al., 2002),

controlling all the actuators in the vehicle. The PLC control loop took 2 ms to execute. However,

PLCs have not received as much attention as the other control systems.

ON/OFF control

As the name implies, the control signals are turned on and off to achieve the required

control. Yekutieli et al. (2002) experimented with an ON/OFF control to guide a crawler tractor

through a vineyard. They found this control to act quickly, but they also observed a lot of overshoot. A

much smoother control could be achieved with more sophisticated control systems.

PID control

The PID (Proportional Integral Derivative) control algorithm is used for the control of

almost all loops in the process industries, and is also the basis for many advanced control

algorithms and strategies. PID has given satisfactory performance in tractor guidance.

Feedforward + PID control worked well for guiding a tractor through crop rows (Kodagoda et

al., 2002). Some time delay has been observed when using PID. However, overall, PID control

has performed better than proportional and PI control systems. PID control has been used for

guiding a grain harvesting tractor (Benson et al., 2003). In that research, PID was used to

calculate the actuator command signal based on the heading offset. The performance of the

controller was comparable to that of manual steering.
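The discrete PID law underlying these controllers can be sketched as follows; the gains, sampling time and error source are illustrative assumptions, not the values used in the research cited above:

    # Sketch of a discrete PID steering controller (illustrative gains).

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error):
            """error: offset from the desired path; returns a steering command."""
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)

    controller = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.05)  # assumed values
    command = controller.update(error=0.15)  # vehicle 15 cm off the path
    print(round(command, 3))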










Adaptive control

Nonlinearity is observed in electro-hydraulic steering valves. PID control is unable to

handle nonlinearities effectively. Tsubota et al. (2004) compared the experimental results from a

fuzzy adaptive controller and a neural network-based adaptive controller to those from a classical

PID controller in terms of velocity tracking. Experiments were conducted on a test bed and it

was found that both adaptive control algorithms compensated for the non-linearity effectively.

Fuzzy logic

Fuzzy logic attempts to create artificial boundaries for analog states, which makes it

discrete enough for a digital computer to process. The boundaries are made fuzzy so that the

controller does not appear discontinuous. The processing stage usually consists of a set of logic

rules defined by the designer. Fuzzy control has been used to guide a tractor to follow crop rows

(Benson et al., 2001). The results were comparable with PID. Senoo et al. (1992) have pointed

out that the fuzzy controller could achieve better tracking performance than the PI controller.

They used a vehicle that tracked a magnetic tape on the floor. Fuzzy control has wider adaptability to all

kinds of inputs. Qiu et al. (1999) verified that the fuzzy steering control provided a fast and

accurate steering rate control on the tractor. Kodagoda et al. (2002) designed fuzzy PD and fuzzy

PI controllers to navigate an electric golf cart. These controllers were compared with the

traditional PID controllers, and were found to be insensitive to load fluctuations. They

determined that fuzzy control was better than PID for longitudinal control. PID was also found to

have large chatter and high saturation.

A combination of fuzzy and PID control holds a lot of promise (Burks et al., 2004). While

designing these controllers, the designer has to anticipate all the situations that the vehicle might

encounter. This can be achieved most of the time with a careful design and the experience of the










designer. However, if the transition between different scenarios is not properly accounted

for, the controller could be discontinuous.

Neural networks

An Artificial Neural Network (ANN) is an information processing paradigm that is

inspired by the way biological nervous systems, such as the brain, process information. It is

composed of a large number of highly interconnected processing elements (neurons) working in

unison to solve specific problems. ANNs, like people, learn by example. They can learn

adaptively by the way they are trained. Nijhuis et al. (1992) were successful in using neural

networks for collision avoidance. The vehicle used an infrared range sensor to detect obstacles.

But neural networks have the inherent disadvantage of learning only what the driver does; hence

they are not robust. To work well, they have to be trained for every possible situation that will be

encountered.

Behavior-based control

Behavior-based control is a new development which has been successfully used in small

mobile robots. It works on the basis of addressing the most appropriate next action in a particular

situation. This type of control is distributed among a set of behaviors, in which each behavior

takes care of one aspect of control. Sensors trigger these behaviors. There is the inherent

necessity to coordinate different behaviors for proper operation. Behavior-based systems, in

combination with real time control system (RTCS), are expected to do well in vehicle guidance.

Optimal control

Kise et al. (2002) developed an optimal controller to guide a tractor. The tractor used an

RTKGPS and an IMU for sensing position. They compared the optimal control with PI control.

The optimal controller could perform high-speed guidance more precisely than the PI controller.

The performance was acceptable in 90-degree turns also.









Non-linear control

A new field of non-linear control is emerging, particularly for controlling mobile robots.

Earlier, researchers used linearized models of non-linear phenomena. Now, with the availability

of fast processors, non-linear equations can be solved numerically and quickly. Therefore

non-linear control systems are being developed for improved control. However, more research is

needed before such systems are widely adopted.

Sensor Fusion Applications in Navigation

Sensor fusion is a major component of sensor integration, merging multiple inputs into a

common representation. A multi-sensor environment generates large amounts of data at different

resolutions. These data are often corrupted by a variety of noise sources, which continually vary because of

changes in the environment. Sensor fusion would extract meaningful information from these data

to make the optimal decision for navigation. Several methods, such as Dempster-Shafer theory

and mapping learning methods, have been used for sensor fusion. Kalman filters have been the

ideal candidates for sensor fusion and have been widely used. Several methods for fusion have

been reported in the literature. These include, but are not limited to, neural networks (Davis and

Stentz, 1995; Rasmussen, 2002), variations of Kalman filter (Paul and Wan, 2005), statistical

methods (Wu et al., 2002), behavior based methods (Gage and Murphy, 2000), voting, fuzzy

logic (Runkler et al., 1998) and combinations of these (Mobus and Kolbe, 2004). The process of

fusion might be performed at various levels ranging from sensor level to decision level (Klein,

1999). Rasmussen (2002) used a neural network for combining image and laser data to segment

roadways from background objects. He fused features such as color, texture and laser range in

various combinations using a neural network. He found that

such fused combinations performed better at segmenting roads than the individual models.

Wu et al. (2002) used Dempster-Shafer theory for sensor fusion to combine images and sounds










to interpret human behavior. This probabilistic approach was appropriate for such non-

deterministic applications. However, probabilistic approaches may reduce the

accuracy of deterministic applications if accurate statistical models of the process are not created.

Runkler et al. (1998) considered the Kalman filter to linearize non-linear models; they instead

developed fuzzy models of the signals and then projected the signals onto these models to fuse the

data. This method of fusion helped reduce the noise in their experimental data.

The Kalman filter has been the subject of extensive research in navigation. It is a set of

mathematical equations that provides an efficient computational means to estimate the state of a

process. It can give an accurate estimate of the present, past and future states of the system.

It is useful even when a precise model is not available. The Kalman filter is often used in

vehicle navigation for processing sensor signals. Information from different sensors is processed

using the Kalman filter to compute reliable position information for the vehicle.
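A minimal sketch of such a fusion step, assuming a one-dimensional state (the vehicle's lateral position) and illustrative noise variances for two sensors, is shown below; actual navigation filters carry full state vectors and covariance matrices:

    # Sketch: scalar Kalman update fusing two noisy measurements of the
    # vehicle's lateral position (variances are assumptions).

    def kalman_update(x, p, z, r):
        """Fuse estimate (x, variance p) with measurement z (variance r)."""
        k = p / (p + r)  # Kalman gain: weight on the measurement
        return x + k * (z - x), (1.0 - k) * p

    x, p = 0.0, 1.0                             # prior estimate and variance
    x, p = kalman_update(x, p, z=0.12, r=0.05)  # e.g., vision measurement
    x, p = kalman_update(x, p, z=0.08, r=0.02)  # e.g., ladar measurement
    print(round(x, 3), round(p, 4))  # fused estimate favors the better sensor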

In much of this research, the sensor measurements can be fairly noisy, so the choice of method

can help in reducing the noise as well as in fusing the information from the sensors.

Methods such as neural networks, behavior-based methods, fuzzy logic and voting are not widely

used primarily for reducing noise. Kalman filtering is a widely used method for eliminating

noisy measurements from sensor data and also for sensor fusion. Kalman filter can be considered

as a subset of the statistical methods because of the use of statistical models for noise. Paul and

Wan (2005) used two Kalman filters for accurate state estimation and for terrain mapping for

navigating a vehicle through unknown environments. The state estimation process fused the

information from three onboard sensors to estimate the vehicle location. Simulated results

showed the feasibility of the method. Han et al. (2002) used a Kalman filter to filter DGPS










(Differential Global Positioning System) data for improving positioning accuracy for parallel

tracking applications. The Kalman filter smoothed the data and reduced the cross tracking error.

Based on the good results obtained in the previous research for fusion and noise reduction, a

Kalman filter was selected as the method to perform the fusion and filter the noise in sensor

measurements.

The use of a Kalman filter with fixed parameters has drawbacks. Divergence of the

estimates is a problem which is sometimes encountered with a Kalman filter, wherein the filter

continually tries to fit a wrong process. Divergence may be attributed to system modeling errors,

incorrect noise variances, ignored biases and computational round-off errors (Fitzgerald, 1971). Another

problem in using a simple Kalman filter is that the reliability of the information from the sensors

may depend on the type of path. Therefore, if a white noise model is assumed for the process and

measurements, the reliability of a sensor in the Kalman filter has to be constantly updated.

Abdelnour et al. (1993) used fuzzy logic in detecting and correcting the divergence. Sasladek and

Wang (1999) used fuzzy logic with an extended Kalman filter to tackle the problem of

divergence for an autonomous ground vehicle. The extended Kalman filter reduced the position

and velocity error when the filter diverged. The use of fuzzy logic also allowed a lower order

state model to be used, and good simulation results were presented.

Pure Pursuit Steering Control

Pure pursuit is a method for steering vehicles (land, sea or air). In this method, the vehicle

pursues a point in front of it and is therefore steered such that the

vehicle's nose points to the point being pursued. Pure pursuit is similar to human driving, as

humans tend to drive towards some imaginary point in front of the vehicle at each instant,

i.e., they pursue a point. It is therefore very intuitive to implement in an autonomous vehicle. In this










method, a look-ahead point is chosen and an arc is constructed joining the present vehicle

location and the look-ahead point. Then the vehicle is steered to follow the arc. The method has

been in use for several years in different guidance systems. Some of the early work in using pure

pursuit for autonomous vehicle steering was in the Carnegie Mellon Terragator and NavLab

projects (Coulter, 1992).
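A minimal sketch of the pure pursuit steering law follows, assuming the look-ahead point is already expressed in the vehicle frame; the goal point and wheel base values are illustrative only:

    # Sketch of pure pursuit: the curvature of the arc through a look-ahead
    # point (x lateral, y forward, vehicle frame) is 2x / L^2 (Coulter, 1992).
    import math

    def pursuit_curvature(x_lateral, y_forward):
        look_ahead_sq = x_lateral**2 + y_forward**2  # L^2
        return 2.0 * x_lateral / look_ahead_sq

    def steering_angle_rad(curvature, wheel_base):
        """Bicycle-model steering angle that follows the arc."""
        return math.atan(wheel_base * curvature)

    # Goal point 0.5 m to the side and 3 m ahead; 2.5 m wheel base (assumed).
    kappa = pursuit_curvature(0.5, 3.0)
    print(round(math.degrees(steering_angle_rad(kappa, 2.5)), 1))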

Pure pursuit guidance has been used for unmanned aerial vehicles (UAV) by Enomoto et

al. (2007). In their research, a UAV chased an aircraft using pure pursuit.











Figure 2-8. UAV using pure pursuit (Enomoto et al., 2007)

Petrinec et al. (2003) used pure pursuit based path following for autonomous vehicles in a

multi-vehicle simulator of a factory shop floor. They point out that the look-ahead

distance is crucial for proper operation of the vehicle when using pure pursuit. Shorter look-

ahead distances may cause oscillations, whereas longer look-ahead distances, though smoother, may take the

vehicle a longer time to get back onto the path. As they had to deal with varying look-ahead

distances, they approximated large look-ahead distances using piecewise linear smaller look-

ahead distances. This reduced the lag due to larger look-ahead distances.









Headland Turning Applications

The headland in citrus groves is the space available at the end of each tree row. This space is

generally used for turning the vehicles navigating the rows. Headlands are characterized by a

ground space of at least 4.25m after the last tree in each row. The ground is usually covered with

grass. Beyond this space, there may be trees planted to act as boundaries of the grove or there

could be more open ground or some combination of both. An autonomous vehicle operating in

citrus groves needs to navigate the headlands to turn and navigate other rows.

Miller et al. (2004) have reported that the space required to turn at the headland and

the time spent in turning significantly affect the field efficiency. They also report that damage to

crops due to improper headland turns affects productivity. Kise et al. (2002) tested headland turning

using a spline-function path planning method on a tractor with a GPS based guidance system. The

method guided the vehicle along the desired path with a lateral deviation error of less than 20 cm.

Hague et al. used vision based guidance in combination with dead reckoning to guide a vehicle

through crop rows. The guidance system detected the end of crop rows using the absence of crop row

information from vision and distance information from dead reckoning. The vehicle was able to

execute a u-turn at the headland. Hansen et al. (2005) analyzed headland turns for a combine in a

field, the loop turn or u-turn in particular. The turns were executed for entering

successive crop rows or for entering some row after skipping a few rows. They also developed a

model to represent the travel behavior and verified it with experimental headland turns. Using this model,

the turns were optimized for speed, time spent in turning, and the number of rows to be

skipped for optimal turning. Noguchi et al. (2001) implemented a switch-back turning maneuver

for a tractor to turn at the headlands of crops. The turning maneuver was implemented using

DGPS guidance and a spline function for the turn path. Precise turns were observed in simulation and

experimentation. Oksanen et al. (2004) studied the behavior of tractor-trailer systems in









crop headlands. They approached the system's turning using optimal control and tried both

u-turns and switch-back turns. The behaviors were analyzed by simulation alone, and

improvements using numerical methods were suggested for better solutions.



















Figure 2-9. Switch back turning implemented using spline functions (Noguchi et al., 2001)

Path Planning

Path planning aids the mobile robot in traveling from its starting point to its final destination.

During navigation, conditions might be encountered that require the shortest distance to travel, a

safe path, etc. Path planning makes it possible to take care of all these options. Motion

trajectories for robots are usually obtained by teaching from a human operator or by path

planning on a computer. Geographic Information Systems (GIS) are being widely used for

mapping (Noguchi et al., 2002) and path planning. Use of teaching-playback robots is thus

restricted to controlled environments, like mass-production factories. Traditional path planners

normally use heuristics or a performance index for path specification. A digital image of the

normally use heuristics or a performance index for path specification. A Digital image of the

route map could be used for path planning (Speigle et al., 1995). The path is segmented prior to

the actual navigation and marked for landmarks. Then an optimal path is calculated. Fu et al.










(2004) developed a path planning algorithm based on Dijkstra's algorithm for searching the

shortest path. The algorithm searches a restricted area and makes use of the spatial feature of the

road network.

Current Commercial Systems

There are several commercial autonomous vehicles available for field applications. Most of

these systems use GPS positioning to aid in straight driving. Large farms employ these for large-

scale spraying and harvesting in crops. These vehicles provide visual feedback of the vehicle's

deviation from the GPS guided path, so the farmer can get feedback about his driving directions.

This method has increased the efficiency of operation in large farms. Most of these systems are

limited to straight path guidance in open fields. They are also not suitable for citrus groves, since

the receiver in a grove does not properly receive satellite signals, which can be blocked by the

tree canopy.

John Deere, Moline, IL and iRobot Corp., Boston, MA are developing semi-autonomous

vehicles for military use, using a John Deere Gator vehicle. John Deere is also developing an

autonomous tractor without a cab and steering wheel. They could save a lot of weight by

eliminating these two elements. Omnitech Robotics, Englewood, CO, has developed several tele-

operated off-road vehicles, including a tractor.

The present commercial systems are opening a new path for advanced autonomous

vehicles. The farmers employing these systems at present are more likely to be receptive to a

fully autonomous system. A lot of research is required in developing a vehicle capable of

autonomous behavior in orchards and citrus groves. Present commercial systems often employ

GPS as the lone sensor. The use of additional sensors could make these vehicles more

autonomous and applicable in several areas ranging from crops to orchards.









CHAPTER 3
EXPERIMENTAL METHODS

This chapter describes the hardware architecture and the theoretical development of the

various components of the autonomous vehicle guidance system for navigating citrus groves.

Sensors, Control and Hydraulic Architecture of the Tractor

Guidance System Hardware Architecture

The major components of the autonomous navigation system are the sensors for

monitoring the environment of the autonomous vehicle, the computer for collecting information

from the sensors and providing the autonomous operation, and the actuators for actuating the

steering.

From the literature, it is observed that machine vision has been very effective in vehicle

navigation. With the availability of fast computers, vision is a viable and promising option. For

this research, a vision system is used to segment the path in the citrus grove alleyway and guide

the vehicle. Laser radar is a highly accurate sensor for measuring the distance to objects. Such

accuracy cannot always be expected with vision. Laser radar was available for this project from a

previous research effort and has been reported as a good distance-measuring sensor in the literature. Therefore it is

also used as one of the guidance sensors. A common PC is used for processing the information

from the sensors and to implement high level guidance algorithms. A microcontroller is used for

performing low level controls and high speed operations which may not be possible using the PC.

To control the steering of the vehicle, a servo valve was used to actuate the hydraulic

steering. For good control of the steering, feedback of the steering angle is essential. A rotary

encoder was available in the laboratory at the start of the project and was used as the

steering angle feedback sensor.









The architecture of the vehicle consists of the vehicle and the sensors and actuators

interfaced with the PC and the microcontroller. The ladar and the vision sensors are interfaced

with the PC whereas the hydraulic valve and the encoder are interfaced with the microcontroller.

The PC processes the vision and ladar information. It then sends required vehicle lateral

displacement information to the low level controller. The low level controller gets the desired

displacement information from the PC, computes and sends the necessary signal for the

hydraulic valve using the control algorithm. It also gets steering angle information from the

encoder for the control algorithm.




Figure 3-1. Architecture of the tractor guidance system

Vehicle

For navigating citrus groves, the autonomous vehicle was expected to be operated in harsh

outdoor environments most of the time, so the platform should have good off-road qualities.









Uneven paths are encountered in a citrus grove, so strong driving motors are also necessary. The

capacity of the vehicle power supply should provide for long hours of operation. The vehicle

platform is expected to meet these requirements. Adding the new technology to vehicles

currently being produced is expected to be less expensive than the cost incurred to design a

totally new vehicle. Moreover, a farmer is more likely to adopt an addition to an existing

technology rather than a totally new design. Present day tractors already possess the off-road

capabilities, so retrofitting an existing tractor seems to be a viable option. The development of

the autonomous vehicle guidance system was started in 2003. At that time, a John Deere 6410

tractor was readily available in the field automation laboratory of the Department of Agricultural

and Biological Engineering. It was decided that the tractor already possessed the off-road ability

required for the autonomous vehicle. Significant cost and time could be saved by starting the

development on the tractor, rather than custom designing a new vehicle. A tractor has many

advantages in that it is a commercially manufactured, reliable, multi-purpose, technically

supported product. All the mechanical and hydraulic systems conform to recognized standards. The

autonomous guidance system can be retrofitted and the human control interfaces could be

modified to include actuators. Therefore, development of the guidance system started with

converting the tractor to operate autonomously.














Figure 3-2. John Deere 6410 tractor used for autonomous vehicle guidance development










Autonomous Steering

The tractor uses a hydrostatic steering mechanism. An

illustration of the steering mechanism is shown below (Figure 3-3).




Figure 3-3. Hydrostatic steering mechanism of the tractor

In this steering mechanism, the human operator turns the steering wheel. The steering

wheel is mechanically coupled to the steering valve. The steering valve can be thought of as a

direction control valve. When the steering wheel is turned, the steering valve direction control is

changed from no steering to flow in one direction or the other. The hydraulic fluid flows through

the steering valve and tries to move the piston in the steering cylinder. The steering cylinder can

be thought of as a double acting cylinder. Depending on the flow of fluid controlled by the

steering valve, the piston in the steering cylinder is moved to the left or to the right. The wheels

of the tractor are mechanically coupled to the shaft of the steering cylinder on either side. This

motion of the steering cylinder turns the tractor steering wheels to the right or to the left.

For the tractor to operate autonomously, it should be capable of steering by receiving

electrical signals from a computer. At the same time, it is important to have the ability to switch

to manual control whenever necessary. Since most of the hydraulic circuits in the tractor are

proprietary information, the major concern was that the existing system should not be tampered

with so that manual capability is not lost. Therefore a steering mechanism was developed to

make the steering automatic. The illustration of the hydraulic circuit for this conversion is shown









below. In Figure 3-4, the hydrostatic steering mechanism of the tractor is shown on the left.

For autonomous steering capability, an electro-hydraulic servo valve was connected in parallel to

the existing circuit. This is illustrated in the right half of Figure 3-4. The specifications of the

servo valve are given in the next section. The servo valve is similar to the steering valve

described above. The difference is that the servo valve takes analog voltage signals to control the

flow of fluids, instead of the mechanical turning of the steering wheel by the driver. An electrical

switch has been connected to the servo valve. The switch when turned on, makes the servo valve

operational. Once the servo valve is turned on, it is ready to accept electrical signals from the

computer to control the flow of fluid. The direction of fluid flow to the steering changes the

steering to the left or right as described for the hydrostatic steering. Since the valves are closed

center, when one valve is in operation, flow of fluid through the other valve is blocked. Further,

the existing steering circuit includes check valves which prevent reverse flow of fluid through

the valve when it is not operational.

It should be noted that making the servo valve operational by turning on the switch

transfers complete control to autonomous steering. Only when the switch is turned off is manual

steering made operational again. Therefore the human driver can choose manual or

autonomous steering as required.

The supply and return fluid lines for the servo valve are tapped from the front end loader

supply system (Figure 3-5). Figure 3-6 shows the retrofitted circuit with the servo valve. As

observed, the output from the servo valve is directed to the steering cylinder using a T-joint,

along with the existing steering lines.


































Figure 3-4. Electro-hydraulic circuit for autonomous steering



Figure 3-5. Rear end of tractor showing the supply to the servo valve
















Figure 3-6. View under the cab showing the servo valve circuit

Servo Valve

The servo valve described in the previous section is an electrically controlled load-sensing

servo valve from Sauer-Danfoss. It is a direction control servo valve with important features such as:

* Low weight

* Built-in pressure relief

* 11-32 V supply voltage

* 3-position, closed center

The valve is also manually operable using a lever key.









Table 3-1. Specifications of the electro-hydraulic valve

Supply voltage range (5% ripple): 11 V to 32 V
Current consumption at rated voltage: 0.57 A at 12 V supply
Signal voltage (neutral): 0.5 x supply voltage
Signal voltage (A port <-> B port): 0.25 to 0.75 x supply voltage
Power consumption: 7 W
Hysteresis at rated voltage for one cycle: 4%
Reaction time from neutral to maximum spool at constant voltage: 0.2 s
Oil viscosity range: 12-75 mm²/s
Maximum startup viscosity: 2500 mm²/s
Oil temperature range: 30-60 °C










Figure 3-7. Sauer-Danfoss PVG 32 servo valve showing the main parts, including the main spool, pressure compensator, pressure relief valve, pressure reduction valve for pilot oil supply, LS connections for ports A and B, shock and suction valves (PVLP, PVLA), load drop check valve, pilot oil supply for the PVE, and maximum oil flow adjustment screws for ports A and B (Source: Sauer-Danfoss documentation)









Guidance Sensor: Camera

A video camera was chosen as one of the primary guidance sensors. The camera would

enable the guidance system to look at the environment in front of the vehicle and segment the

path from the background. From this information, the vehicle could navigate the desired path.

The video camera selected for this application was the Sony FCB-EX780S "block" camera that

is commonly used in commercial "camcorders." Its analog video output is standard NTSC with

both composite and s-video signal output, and is classified as high resolution with 470 TV lines.

The camera has a vast number of features that can be adjusted, including a 25x optical zoom,

focus, white balance, iris, shutter speed, etc. All of the functions can be controlled via RS-232

serial communications, and many functions can be used in manual or fully automatic mode. The

camera was purchased without a commercial casing, so it was enclosed in a

custom-built aluminum casing with mounts for attachment to the vehicle.


Figure 3-8. Video Camera, one of the primary path finding sensors





























Figure 3-9. Camera mounted on top of the tractor cab

Figure 3-9 shows the camera mounted on top of the tractor cab. Mounting the camera

at a high position gives the vision system a good view of the path for segmentation. The

camera is mounted at an angle of 30° to the horizontal. The field of view of the camera with this

mounting configuration ranged from a distance of 5 m on the ground in front of the vehicle to the

distant horizon of the path. The camera always initialized to the automatic settings on startup.

The automatic settings included automatic shutter speed, automatic aperture control and 1x

zoom.

Frame Grabber Board

The frame grabber is a PCI version of the FlashBus MV Pro, which was added to the

computer managing the overall guidance system. The frame grabber converts the analog NTSC

video signal received from the camera described above, to a digital 640 x 480 RGB bitmap

image. The frame grabber is equipped with a video scaling processor that allows for PCI direct

memory access (DMA). This allows for the digital image to be placed directly into system

memory without the need for any CPU cycles. The frame grabber also has several features that










include RS-232 serial communications, 12 V power, a digital-to-analog converter, and

programmable look-up-tables for manipulation of the RGB image.

















Figure 3-10. Frame grabber board

A custom cable was attached to the video camera with the connector to interface with the

frame grabber. The software was written to communicate with the frame grabber and obtain

images to be processed to determine the path for the vehicle navigation.

Guidance Sensor: Laser Radar

The laser radar (ladar) system used in this study consisted of a Sick LMS-200 ladar sensor

(Sick AG, Waldkirch, Germany). The LMS-200 is a 180 degree one-dimensional sweeping laser

which can measure at 1.0/0.5/0.25 degree increments with a 10 mm range resolution and a

maximum range of up to 80 m. The LMS-200 can be operated at 38.4 kbaud using the RS232

protocol, giving a sweep refresh rate of approximately 7 Hz, or at 500 kbaud using the RS422

protocol, giving a sweep refresh rate of approximately 35 Hz. For operation at 500 kbaud, a

special high speed serial communication card was used. The ladar also requires a 24 V external

power supply to power the unit.









Figure 3-11. Laser radar and its mount. A) Laser radar. B) Laser mounted on top of the tractor.

The laser radar is mounted on top of the tractor cab just below the camera. It is also

positioned at 30° to the horizontal, like the camera. This position allows for the laser radar to scan

the front of the tractor for obstacles. The laser radar was used as the second guidance sensor to

determine the path for navigation.

Encoder

For the purpose of feeding back the steered wheel angle to the control system, a rotary

encoder was used. A Stegmann HD20 Heavy Duty encoder was selected.














Figure 3-12. Stegmann HD20 Encoder

The major features are:

* 1024 pulses/revolution

* 8-24 V input/output

* 0.23 kg, 2-inch diameter

* Maximum speed 3000 rpm at high load

* Output: two square waves in quadrature




Figure 3-13. Output from the encoder shows two waves in quadrature

As shown in Figure 3-13, by measuring the phase shift between the two square waves,

the direction of rotation can be calculated. The number of pulses after the last counted pulse

gives the amount of rotation.
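A software sketch of this quadrature decoding logic is given below for illustration; in practice the microcontroller's hardware quadrature decoders perform this function, and the transition table here is the standard one, not code from this project:

    # Sketch of quadrature decoding: each valid (A, B) transition adds or
    # subtracts one count depending on which channel leads.

    # (previous A, previous B, current A, current B) -> count change
    QUAD_STEP = {
        (0, 0, 1, 0): +1, (1, 0, 1, 1): +1, (1, 1, 0, 1): +1, (0, 1, 0, 0): +1,
        (0, 0, 0, 1): -1, (0, 1, 1, 1): -1, (1, 1, 1, 0): -1, (1, 0, 0, 0): -1,
    }

    def decode(samples):
        """samples: sequence of (A, B) logic levels; returns a signed count."""
        count, prev = 0, None
        for a, b in samples:
            if prev is not None:
                count += QUAD_STEP.get((prev[0], prev[1], a, b), 0)
            prev = (a, b)
        return count

    # One full forward quadrature cycle yields +4 counts.
    print(decode([(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]))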



















Figure 3-14. Encoder mounted on the tractor axle










The encoder was mounted on the shaft of the left wheel using a linkage assembly, as shown

in Figure 3-14. When the steering is actuated, the encoder assembly rotates, and the

pulses from the encoder can be counted to measure the steered angle. The encoder, along with

the linkage, was calibrated. The amount of rotation of the encoder shaft is different from the

amount of rotation of the wheel because of the linkage assembly connecting the encoder and the

wheel.

Computer

An industrial PC was used for performing high level algorithm processing and managing

the overall guidance system. The PC used in this research was the Advantech PCA-6106P3-C,

Pentium 4, 2.4GHz processor containing two RS232 serial ports. It was mounted in the cabin

next to the driver's seat along with the keyboard and monitor. The operating system used was

Windows 2000 Professional. The software used for implementing the algorithms for vision and

laser radar is Microsoft Visual C++ .NET. Use of this software allows for the creation of a GUI for

analyzing the algorithm behavior in real time. One of the drawbacks of using Windows is that it

requires a time of about 15 msec between successive runs of the algorithms, for housekeeping.

This processing speed is sufficient for the vision and ladar algorithms, but it is too slow to

process the information from higher speed sensors such as rotary encoders.


Figure 3-15. Instruments mounted in the tractor cabin. A) Computer. B) Computer, monitor and
keyboard mounted in the cabin.









Microcontroller

To measure the rate of steering turn of the vehicle, a high sampling rate of the encoder was

anticipated. Initially, the encoder was interfaced with the computer using a National Instruments

data acquisition board. When initial laboratory tests were conducted to measure the

output from the encoder, the encoder shaft was manually rotated and the pulses were read. It was

found that high sampling rates could not be achieved using this interface setup. It was also

anticipated that the operating speed of the Windows operating system may not be fast enough for

high speed control. Therefore, a TERN microcontroller was used for the high speed, low level

control operations. The

microcontroller used in this research was the 586 Engine controller board with a P50 expansion

board from TERN Inc. It is a C++ programmable controller board based on a 32-bit

100/133 MHz AMD Elan SC520 microprocessor. Its important features are as follows: 32 PIOs,

7 timers, a 19-channel 12-bit ADC, an 8-channel 12-bit DAC, 2 quadrature decoders, 8 opto-couplers,

14 high voltage I/O lines, a serial port and Ethernet. The microcontroller was used for processing

encoder data and for controlling the servo valve for steering, by sending analog voltages. The

microcontroller was mounted in the tractor cabin below the PC in a closed box.
















Figure 3-16. Microcontroller










Amplifier Circuit


The servo valve requires a control voltage in the range of Vs/2 - 3 V to Vs/2 + 3 V for a supply

voltage of Vs; that is, if the supply voltage is 12 V, then the control voltage is expected in the range

of 3 V to 9 V. Therefore the off (closed) position of the valve is at 6 V, which is the

middle value. Voltages from 6 V to 9 V correspond to flow of fluid in one direction and 6 V to 3 V

for the opposite direction. The microcontroller analog output has a range of 0-2.5 V. So to

amplify and shift the voltage from 0-2.5 V to 3-9 V, an amplifier circuit was necessary. The

circuit consisted of an LM324 Op Amp as the amplifier. The 12 V supply for the circuit is taken

from the tractor's battery unit. The input to the circuit is the 0-2.5 V analog output from the

microcontroller. This 0-2.5 V range is scaled to 3 to 9 V analog voltage in the output of the

circuit. This is the control voltage for the servo valve. It is observed that the tractor battery

voltage varies from 12 V to 13 V after starting the tractor or when the throttle is increased. The

amplifier circuit takes the same supply power as the hydraulic valve. Therefore whatever the

supply voltage is from the tractor battery, the circuit scales it accordingly.
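The mapping performed by the amplifier circuit can be summarized by the following sketch (illustrative only; the actual circuit implements this scaling in analog hardware with the LM324):

    # Sketch of the amplifier's voltage mapping: 0-2.5 V in, Vs/4 to 3Vs/4
    # out (3-9 V for a 12 V supply), so mid-scale holds the valve closed.

    def valve_voltage(dac_v, supply_v=12.0):
        """Map a 0-2.5 V microcontroller output to the valve control range."""
        if not 0.0 <= dac_v <= 2.5:
            raise ValueError("DAC output must be within 0-2.5 V")
        return supply_v / 4.0 + (dac_v / 2.5) * (supply_v / 2.0)

    print(valve_voltage(0.0))   # 3.0 V: full flow in one direction
    print(valve_voltage(1.25))  # 6.0 V: valve closed (neutral)
    print(valve_voltage(2.5))   # 9.0 V: full flow in the other direction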



















Figure 3-17. Amplifier Circuit










DGPS Receiver

A DGPS receiver was used to measure the vehicle displacement while conducting tests to

determine the dynamics of the vehicle. A John Deere Starfire Differential GPS was used.























Figure 3-18. DGPS receiver mounted on top of the tractor cab

RS-232 serial communication was used to communicate with the DGPS receiver and

position information was obtained at a rate of 5 Hz. According to the manufacturer's

specifications, it has real-time accuracy better than 1 ft. A 24 V external battery was used to power

the DGPS receiver.

Power Supply

The PC, the monitor and the laser radar require a supply voltage of 120 V AC and the

hydraulic valve and the microcontroller require a 12 V DC. Power from the tractor battery is

tapped and given as input to an inverter. The inverter supplies the PC, the monitor and the laser

radar. The supply for the microcontroller and the hydraulic valve is taken from the cigarette

lighter power source in the tractor cabin.

























Figure 3-19. Inverter mounted in the tractor cabin

Inertial Measurement Unit

In this research, an Inertial Measurement Unit (IMU) was used to measure the direction of

travel of the vehicle/tractor. The IMU used was the 3DM tri-axial IMU from Microstrain, USA.














Figure 3-20. Inertial Measurement Unit used in this research

This IMU is a 3-axis sensor measuring 360° of yaw, 360° of pitch and 140° of roll. The

measurement is done using arrays of magnetometers and accelerometers. RS-232 standard of

serial communication was used to obtain the measurement values from the sensor. The IMU

requires a supply voltage of 5.3V to 12V. Calibration of the IMU was performed before using it

in the vehicle. Calibrations should be repeated when mounting the IMU directly over metal. The

IMU was mounted on the computer inside the cabin of the tractor. The IMU was mounted such










that the yaw measurement is zero when the rear of the vehicle points towards the geographic

north. The sampling rate obtained from the sensor using the guidance program was about 40 Hz.

Speed Sensor

An ultrasonic speed sensor was used to measure the forward speed of the tractor. The

sensor used was the Trak-Star from Micro-Trak systems, Inc., USA. The sensor was mounted

under the tractor body such that the ultrasonic waves emitted from the sensor hit the ground. The

output from the sensor was a digital voltage, calibrated against speed by the manufacturer. The

sensor was interfaced using a PIC microcontroller to convert the digital signal and give an output

in the RS-232 serial communication standard. The speed information was read over serial

communication by the guidance program.

Experimental Approaches and Theoretical Development

Vehicle Dynamics, Calibration and Steering Control of Tractor

Valve calibration

The PVG 32 proportional valve used for steering control is a high performance servo

valve. The amount of opening of the valve, and thereby the wheel turn, is dependent upon the

control voltage applied and the time for which it is applied. The effect of a phenomenon like

hysteresis is compensated by the built-in servo control in the valve to position the spool exactly

at the commanded position. To understand how the wheels turned with voltage and time,

calibration tests were conducted. These calibrations are important for the control system

implementation. Tests were conducted to make a look-up table of voltage, time and the

corresponding wheel angle. In this test, the tractor engine was started and the vehicle was kept

stationary. A program was written in the microcontroller to send a constant analog voltage to the

servo valve. The wheels were allowed to turn 10 degrees and then the valve was turned off (closed).

The time required to turn this angle was measured. Tests were conducted at various constant










voltages. The voltage versus time plot is shown in Figure 3-21. From the plot, it can be

observed that the true closed position of the valve is at 6.8 V rather than exactly at 6 V. This could

be due to variation of the supply voltage from the tractor from 12 V to 13 V, or due to the dynamics of

the valve itself. It can also be observed from the plot that for voltages further from the center value,

the time required to turn is less, as expected. The shortest time is 0.5 s, at 8.5 V and 5.5 V. More extreme

voltages were not applied, as such high turning speeds are not expected of the tractor

during autonomous guidance. For voltages between 6.5 V and 7.2 V, there was no

significant turn of the tractor wheels. It is also observed that the right turn has a larger range of

voltages compared to the left turn. This could be due to the dynamics of the valve and the dynamics

of the various components of the hydraulic circuit.


Figure 3-21. Voltage and time required to turn 10 degrees (valve voltage versus time in seconds, right and left turns)

It is to be noted that the flow of hydraulic oil in the lines is dependent on the viscosity of

the oil. The viscosity of the oil is in turn dependent on its temperature. The viscosity of the oil

varies from 12 mm²/s to 75 mm²/s as the oil heats up. During the course of the experiments,

it was observed that an initial period of about 15 min is required after starting, for the oil to reach










a constant viscosity. For experiments conducted in the field, the tractor was driven out to the

field. This time was sufficient to heat the oil. As a result, no startup time was required before

each experiment.

Further, the valve dynamics frequency is much higher than the vehicle dynamics

frequency. Therefore the effect of valve dynamics on the vehicle control can be ignored without

severely degrading the performance.
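A sketch of how such a calibration can be used as a look-up table with linear interpolation is shown below; the voltage-time pairs are illustrative placeholders, not the measured calibration values:

    # Sketch: using a voltage/time calibration as a look-up table with
    # linear interpolation. The pairs below are illustrative placeholders.

    # Time (s) needed at each constant voltage (V) for a fixed wheel turn.
    RIGHT_TURN_TABLE = [(7.4, 1.3), (7.8, 0.9), (8.1, 0.75), (8.5, 0.5)]

    def turn_time(voltage, table=RIGHT_TURN_TABLE):
        """Interpolate the time to apply a commanded valve voltage."""
        table = sorted(table)
        if voltage <= table[0][0]:
            return table[0][1]
        if voltage >= table[-1][0]:
            return table[-1][1]
        for (v0, t0), (v1, t1) in zip(table, table[1:]):
            if v0 <= voltage <= v1:
                return t0 + (t1 - t0) * (voltage - v0) / (v1 - v0)

    print(round(turn_time(8.0), 2))  # interpolated between 7.8 V and 8.1 V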

Encoder calibration

The next step was the calibration of the encoder and the linkage assembly. As discussed

earlier, the encoder outputs 1024 pulses per revolution. The encoder position had to be calibrated

against the wheel angle. This calibration is necessary for the steering control.



















Figure 3-22. Encoder calibration

The tractor was positioned in the laboratory. Angular positions were marked on

the ground, as shown in Figure 3-22. From the center position, the steering wheel was rotated

to position the wheels at different angles. The number of pulses to reach different angles was

noted using a program written in the microcontroller to decode the pulses from the encoder. The

angle range for this calibration was from 0 to 45 degrees. The wheel center was treated as zero









and turns were to a maximum of 45 degrees on either side from the center. It is to be noted that

the wheel center was calibrated by trial and error. The maximum angle was 45 degrees as this

was the mechanical turn limitation of the wheels. From this calibration, it was determined that a

rotation of 1 degree caused the encoder to output 7 pulses.
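This calibration result can be applied as a simple conversion, sketched below for illustration:

    # Applying the calibration: roughly 7 encoder pulses per degree of
    # wheel rotation, with a 45 degree mechanical limit on either side.

    PULSES_PER_DEGREE = 7.0

    def wheel_angle_deg(pulse_count):
        """Signed wheel angle from the signed quadrature pulse count."""
        angle = pulse_count / PULSES_PER_DEGREE
        return max(-45.0, min(45.0, angle))  # mechanical turn limit

    print(wheel_angle_deg(210))   # 30.0 degrees
    print(wheel_angle_deg(-70))   # -10.0 degrees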

Vehicle dynamics

The vehicle dynamics has to be determined for proper design of the control system. The

vehicle dynamics relates the lateral motion of the vehicle to steering inputs. Open loop frequency

response tests were conducted to explore the vehicle dynamics. The open loop frequency

response test is good for modeling transfer functions from physical data.

Open loop frequency response. The frequency response is a representation of the

system's response to sinusoidal inputs of varying frequency. For the open loop frequency

response, an algorithm was implemented in the microcontroller to move the wheel angle as a

time based sinusoid. The computed voltage for the required frequency of turn was sent to the

valve, from the microcontroller, which turned the wheels at various instances of time. The

encoder fed back the wheel angle information. This closed loop control caused the vehicle to

execute a sinusoidal motion. For the tests, the engine was set at a constant speed of 2200 rpm.

Tests were conducted in an open field at three different vehicle forward speeds of 4 mph, 7 mph

and 10 mph. These speeds were representative of operating speeds expected while navigating

citrus groves. The frequency was varied from 0.07 Hz to 0.3 Hz. The lowest frequency was

chosen as 0.07 Hz because at frequencies below this, at the maximum speed, the vehicle veered out

of the test area. Above 0.3 Hz, the period of the wheel turn was so short that a very

high turning velocity was required, which is not suitable for the tractor. These constraints limited

the bandwidth of tests that could be conducted. Time, wheel angle and vehicle position were

recorded. The DGPS receiver was used to record the vehicle position. There were no tall trees or










buildings near the vehicle in the test area to affect the GPS signals to the receiver. Using the

position information collected with the DGPS receiver, the position of the vehicle was plotted in

ArcView, software commonly used for analyzing GPS data.

Table 3-2. Open loop frequency response tests at various speeds and frequencies

Speed (mph)   0.07 Hz   0.08 Hz   0.09 Hz   0.1 Hz   0.2 Hz   0.3 Hz
4                X         X         X        X        X        X
7                X         X         X        X        X        X
10               X         X         X        X        X        X


Figure 3-23. Vehicle position plotted in ArcView (0.2 Hz, 10 mph)

The vehicle lateral deviation plot in Figure 3-23 shows the sinusoidal movement of the

vehicle. It was observed that the vehicle's lateral response contains a sinusoidal component at the

frequency of the steering excitation. ArcView has a tool to measure distances on its plots. Using

this tool, the gain or the change in the position of the vehicle from a positive peak to the

successive negative peak was measured, as shown in Figure 3-24.




























Figure 3-24. Measuring gain from the position plot

This gain corresponds to a particular frequency at which the tractor was steered. This

calculation was performed for all the frequencies and speeds used in the test. The gains were

plotted against the frequencies (Figure 3-25).

The Figure 3-25 shows the open loop frequency response for 4 mph, 7 mph and 10 mph.

As observed, the frequency response was relatively linear. The gain relates the lateral deviation,

d(s) of the vehicle to the steering angle change, a(s).


[Figure: gain magnitude (dB) versus frequency (Hz) curves for 4 mph, 7 mph and 10 mph]

Figure 3-25. Open loop frequency response for 4 mph, 7 mph and 10 mph









This is represented by the transfer function

G(s) = d(s) / a(s) (Eq. 3-1)

Determining the Theoretical Model. From the open loop frequency response, it was

observed that the vehicle lateral response has the same frequency as the steering excitation. It is

also known that when a constant input is given, that is, if the steering angle is kept constant, the

lateral response is initially exponential. An exponential response to a constant input is exhibited

by a double integrator function. Double integrator also introduces a 180 degree phase shift in the

response. Therefore, if the input is a sine function, the output is a cosine function shifted by 180

degrees, or otherwise, a negative sine function. From the bicycle model discussed in the

literature review, it is known that the vehicle dynamics model has two poles. This information

also corroborated that the vehicle model could be represented by a double integrator.

Hence, the transfer function is determined to be of the form


G(s) = K / s^2 (Eq. 3-2)

The value of gain K depends on the forward speed of the vehicle. The frequency response

of the transfer function, G(s), was plotted using different values of K, to fit the actual response

(Figure 3-26). K was found to be 0.02, 0.07 and 0.13 for 4 mph, 7 mph and 10 mph respectively.

It can be concluded that an increase in velocity can be represented by an increase in the gain K,

under identical conditions. This model is similar to the bicycle model, except that the bicycle

model has two additional pole-zero pairs.
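The fitted model can be checked numerically: for a double integrator, the gain magnitude at a steering frequency f is |G(j2πf)| = K/(2πf)^2. The short sketch below evaluates this for the K values reported above; it is an illustration of the fitted model, not part of the original analysis:

    import numpy as np

    K_BY_SPEED = {"4 mph": 0.02, "7 mph": 0.07, "10 mph": 0.13}   # fitted gains
    test_freqs_hz = np.array([0.07, 0.08, 0.09, 0.1, 0.2, 0.3])   # test band

    for speed, K in K_BY_SPEED.items():
        w = 2 * np.pi * test_freqs_hz       # angular frequency, rad/s
        mag_db = 20 * np.log10(K / w**2)    # |G(jw)| = K / w^2, in dB
        print(speed, np.round(mag_db, 1))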



















[Figure: magnitude (dB) versus frequency (Hz) for the fitted double integrator models]

Figure 3-26. Theoretical model for the three speeds

To test the validity of the theoretical model proposed, the theoretical model was created in Simulink and simulated. The model is shown in the Figure 3-27 and consists of two integrator blocks connected to a gain. The input is a signal generator and the output is a scope. Sinusoidal wheel angle inputs were given at various frequencies. The experimental results from the open loop frequency response tests were found to agree with the simulated results, as shown in Figure 3-28.

Figure 3-27. Simulink model for the vehicle dynamics





























Figure 3-28. Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz)

Control system

Based on the open loop frequency response tests described previously, a control system

was designed to control the steering angle. A PID control system has been found to work well for vehicle guidance in the literature, and it is a simple and reliable method. Hence a PID control system was chosen for implementation (Figure 3-29).



[Figure: closed loop block diagram relating the desired displacement, displacement error, desired angle, actual angle and actual lateral displacement]

Figure 3-29. Guidance control block diagram










The steering control system was developed to interface the PC with the servo valve and was implemented in the microcontroller. The input to the PID controller would be the

lateral displacement error of the vehicle from the center of the path to be navigated. The output

of the PID controller would be the desired steering angle to reduce this error. This desired angle

is input as analog voltage to the servo valve. The actual steering angle is fed back to the PID

controller using the rotary encoder. This forms a closed loop control for steering. The lateral

error of the vehicle from the path center is fed back using the primary guidance sensors namely

vision and ladar. This feedback of the error forms the overall closed loop of the guidance system.

The control system was first designed and simulated in Simulink. Simulations of the

control system can give good estimates of the controller gains, which can then be further tuned in

the actual vehicle. The Simulink model is shown in the Figure 3-30. A rate limiter was added to

simulate the fact that the wheel turning is not instantaneous. A saturation block was added to

simulate the maximum voltage that can be given as input to the valve.





[Figure: Simulink diagram with proportional, derivative and integral gain blocks, a rate limiter, a saturation block and a scope]

Figure 3-30. Simulink model of the vehicle control system










The Ziegler-Nichols method of tuning for closed loop systems was adopted for initial tuning of

the PID controller. In this tuning process,

* The system was given a small proportional gain. Integral and derivative gains were
removed.

* The gain was increased until sustained oscillations were obtained as shown in the Figure 3-31.


Figure 3-31. Simulated sustained oscillation with a gain of 1.8 for the 10 mph model

* The gain Gu and time period of oscillation, Tu were recorded.

* The required proportional gain Kp and derivative gain Kd were calculated using the Ziegler-Nichols formulas

a. Kp = 0.6Gu (Eq. 3-3)

b. Kd = Tu / 8 (Eq. 3-4)
































Figure 3-32. Simulated step response with the initial tuning parameters (10mph, Kp=1.08, Kd =
1.25)

* The parameters were further tuned and an integral gain was added for removing the steady state error.

* The controller was tuned for zero steady state error and low overshoot. A large settling time of the order of 20 sec was acceptable for the vehicle. This is because, if the controller were tuned for a small settling time, the steering became very sensitive to very small errors and caused the wheels to chatter.

The optimum PID gains were found to be:

Proportional Gain, Kp = 1

Derivative Gain, Kd = 4

Integral Gain, Ki = 0.09

When the control system was implemented in the tractor, the same parameters determined

for the simulation were used to control the vehicle and no significant further tuning was required.
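A minimal sketch of the resulting controller is shown below (Python). The Ziegler-Nichols seed gains follow Eq. 3-3 and 3-4, and the final gains are those listed above; the 50 Hz control period and the simple rectangular integration are illustrative assumptions:

    def ziegler_nichols_pd(Gu, Tu):
        """Initial gains from the ultimate gain Gu and oscillation period Tu."""
        return 0.6 * Gu, Tu / 8.0            # Kp (Eq. 3-3), Kd (Eq. 3-4)

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error):
            """Lateral displacement error in -> desired steering angle out."""
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    controller = PID(kp=1.0, ki=0.09, kd=4.0, dt=0.02)  # final tuned gains, ~50 Hz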




























Figure 3-33. Simulated step response with the final tuned parameters (10mph)

Table 3-3. Performance criteria of the controller (Simulated results)
Model     Rise time (sec)   Settling time (sec)   Delay time (sec)   Overshoot (cm)   Steady state error
4 mph     15                10                    8                  1                0
7 mph     12                15                    6                  1                0
10 mph    12                20                    5                  2                0


Vision and Ladar Independent Steering

The primary purpose of this project was to guide the vehicle through the alleyways of a citrus grove. However, the performance of the vehicle guidance needed to be tested well before testing in the grove. In addition, the groves were not located close to the laboratory, so the tractor had to be towed a long distance; if a modification to the equipment or algorithm was required, a lot of time was wasted in towing the tractor. As a result, there was a need for the test area to be in the vicinity of the lab.









It was decided that the tests could be carried out in an open area in the university campus.

Hay bales were used to form an artificial boundary of a path and tests were conducted there. Hay

bales were selected because it was expected that the tractor might not navigate very well during

the initial stages and might run over any artificial boundary. The hay bales provided a compliant,

yet 3D boundary that could be sensed by the vision and ladar sensors.

Computer vision algorithms had to be developed to segment the path from the

surroundings. Two algorithms had to be developed, one for the citrus grove environment and one

for testing with hay bales. As discussed earlier, the camera was mounted on top of the tractor at

an angle of 45° with the horizontal. The camera was pointed just in front of the tractor hood. This aided in looking right in front of the tractor. The height of the camera gave a good field of view for the vision algorithms. The mount of the camera and ladar are shown in the Figure 3-34.


Figure 3-34. Camera and Ladar mounted on top of the tractor cabin









Vision for path with hay bales

Typical images as seen with the video camera mounted on the tractor are shown below.

The images show hay bales placed in parallel rows separated by a distance of about 3 m. These

two hay bale rows are representative of tree rows in the grove alleyway. The path is the region

between the hay bale rows.


Figure 3-35. Typical images seen while navigating through hay bales

The images were taken with the tractor standing in the middle of the path. The vision

algorithm should be capable of segmenting hay bales as boundaries and the region between the

hay bale rows as the path. It is observed that the ground could have grass in some places, but not

in all regions of the path. Also hay bales cast a shadow as does the tractor cabin. These shadows

pose a challenge to developing the vision algorithms.










The image size obtained from the video camera is 640 pixels by 480 pixels. The image was

not resized. It is an RGB image with three values for each pixel, corresponding to red, green and

blue values. Also the image is mapped from the bottom left as the reference. This reference is

based on the Windows bitmap standard adopted in the software. The Figure 3-36 shows the

image coordinates used in the algorithm.

[Figure: image coordinate frame with origin (0,0) at the bottom left and (640,480) at the top right]

Figure 3-36. Image Coordinates

The vision algorithm is required to work effectively in real time. The operating speed of

the algorithm is dependent on the speed of the vehicle. For this project, only one PC with a 2.4 GHz processor was used to process all the high level algorithms. It was also anticipated that

ladar based segmentation would also be required to run in parallel with vision. In the future,

more algorithms using sensor fusion, headland turning, path planning and obstacle detection

were also anticipated to be implemented. Therefore to run all these algorithms in parallel in a

single processor and to have a real time vision algorithm, color based segmentation was selected

as the method of segmentation, as it has been found to work well in other research for

navigating tractors through crop rows. Color based segmentation is also relatively fast compared

to other complex methods such as texture based classification methods.

Thresholding. Thresholding was the first process in the vision algorithm. No

preprocessing such as filtering was done to the base image. This was because preprocessing

increased the processing time without offering any significant improvement in segmentation.










Several images of the path in the hay bale track were captured. Then the Red(R), Green(G), and

Blue(B) intensity color plots of different regions in the image were plotted. The plots are shown

in the Figure 3-37 and Figure 3-38 corresponding to hay bales and path in between.






Figure 3-37. R, G, B color plots of hay bales (X axis: Intensity-Range: 0-255, Y axis: no. of
pixels)





Figure 3-38. R, G, B color plots of grass path (X axis: Intensity-Range: 0-255, Y axis: no. of
pixels)

From the intensity plots, it is observed that segmentation based on color is feasible as the

hay bale and path are clearly separable in the three color channels. These plots also give an

estimate of the threshold value to be chosen. From the plots, it is observed that the path has color

intensities below the threshold whereas the hay bales have color intensities above the threshold

value.

Color plots of several images were observed and the threshold values for red, green and

blue were chosen by comparing pixel values of hay bales and ground. Once the threshold values

were chosen, all the pixels in an image are classified. The classification is performed such that a

pixel with R, G, B intensities corresponding to hay bale or intensities higher than the threshold is

changed to white whereas a pixel with path color intensity values or intensities below the

threshold value is changed to black. The result of thresholding is a binary image. The result of

thresholding a sample image and converting to binary is shown below.

























Figure 3-39. Thresholded image
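A minimal sketch of this thresholding step is given below. The threshold triple is illustrative; the actual values were chosen from the color plots as described above:

    import numpy as np

    def threshold_image(rgb, thresh=(120, 120, 120)):
        """rgb: H x W x 3 uint8 image -> H x W binary image (255 = hay bale).

        Pixels with R, G and B at or above the thresholds are marked white
        (hay bale); all other pixels are marked black (path).
        """
        mask = np.all(rgb >= np.array(thresh), axis=2)
        return np.where(mask, 255, 0).astype(np.uint8)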

Cleaning or filtering the image. As observed, the thresholded image has a lot of noise

where lots of isolated pixels on the path are classified as hay bale because of the higher intensity

of some regions without grass. This isolated pixel noise needs to be cleaned. For this, a 20 pixel

by 20 pixel window is created and moved through the image. In each window, the number of the

white pixels is counted. If more than 75% of pixels falling inside the window are classified as

white, then the window is determined to fall on a hay bale. Therefore, the remaining black pixels

in that window are changed to white. If less than 75% of pixels falling in the window are white,

then the pixels in the window are determined to be path and all the white pixels are changed to

black. This process is performed for the entire image. A cleaned image is shown below. The

cleaning process has removed the noise and formed clearly defined hay bale objects or path.












Figure 3-40. Cleaned image
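The cleaning step described above can be sketched as follows; the 20 pixel window and the 75% threshold are the values from the text:

    import numpy as np

    def clean_binary(binary, win=20, frac=0.75):
        """Force each win x win window entirely white or black depending on
        whether more than `frac` of its pixels were classified white."""
        out = binary.copy()
        h, w = binary.shape
        for r in range(0, h, win):
            for c in range(0, w, win):
                block = binary[r:r + win, c:c + win]
                white_fraction = np.count_nonzero(block) / block.size
                out[r:r + win, c:c + win] = 255 if white_fraction > frac else 0
        return out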










Isolating the left and right boundaries. As observed in the previous step, a part of the

path region is still classified as hay bales because of its color. For removing these incorrectly

classified regions, each row in the image is scanned from left to right, as well as right to left. The

width of the hay bale region or white region in each row is calculated for both left and right

boundaries. The mean width of the hay bale is chosen a priori. Based on this information, any hay

bale region that is much wider than the mean width is shrunk to the mean width, as shown in the

Figure 3-41.
















Figure 3-41. Boundary Isolated image

Line fitting. As observed in the boundary isolated image in the Figure 3-41, the

boundaries are j agged. To clearly define the path, lines could be fit to indicate the path

boundaries. For this, each row of the image is scanned and the interface between the path and

hay bales is identified for each row. This is determined by a transition from white to black or

vice versa. The transition point of each row is stored in an array. Then a line is fit using the least

squares method.

The equation of a straight line is given by

y = a + bx (Eq. 3-5)










a = [(Σy)(Σx^2) - (Σx)(Σxy)] / [n(Σx^2) - (Σx)^2] (Eq. 3-6)

b = [n(Σxy) - (Σx)(Σy)] / [n(Σx^2) - (Σx)^2] (Eq. 3-7)

where (x, y) is the pixel coordinate, 'a' and 'b' are constants, and n is the number of points for which the line is fit.
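A direct implementation of Eq. 3-5 to 3-7 on the collected transition points might look like the following sketch:

    import numpy as np

    def fit_line(points):
        """points: list of (x, y) pixel coordinates -> (a, b) with y = a + b*x."""
        pts = np.asarray(points, dtype=float)
        x, y = pts[:, 0], pts[:, 1]
        n = len(pts)
        denom = n * np.sum(x**2) - np.sum(x)**2
        a = (np.sum(y) * np.sum(x**2) - np.sum(x) * np.sum(x * y)) / denom  # Eq. 3-6
        b = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / denom             # Eq. 3-7
        return a, b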



















Figure 3-42. Lines fit to the boundaries

The original image after processing and line fitting is shown above. Note that the line fit

for the left boundary has a shift to the left of the true boundary due to the presence of shadows on

the hay bale. The flowchart of the developed vision algorithm is shown below.

Path Center. Once the lines are fit for both left and right boundaries, the next step is to determine the path center. The region between the lines is determined as the path from the

previous steps. The entire image is scanned row by row. At a row 'n', the position of the left and

right boundaries are known based on the line fitting as Xleft[n] and Xright[n]. The center point

of the path in that row, Xcenter[n] is determined as the mid point between the line locations in

that row as









Xcenter[n] = (Xright[n] + Xleft[n]) / 2 (Eq. 3-8)

This process is repeated for each row and all the center points are stored in an array. The

average of the center points is chosen as the current center of the path in the given field of view




Xcurrentcenter = (Σ_i Xcenter[i]) / n (Eq. 3-9)


The current position of the vehicle center is chosen as the mid point of the image at column 240, i.e., half of the image width (480). The error of the vehicle from the path center is determined as

Error = Xcurrentcenter - 240 (Eq. 3-10)

This error in pixels is calibrated against real distances using prior calibration of the field of

view. This transformation is explained in a subsequent section. Once the error is determined, the

error is passed to the control system in the microcontroller using RS232 serial communication.

The flowchart of the entire vision algorithm is shown below.

Algorithm flowchart









































Figure 3-43. Machine vision algorithm flowchart

Vision for citrus grove

Once the vision algorithm was developed for navigating the test track of hay bales, the

next step was to adopt the vision algorithm for navigating the citrus grove alleyway. The vision

algorithm for citrus grove navigation follows the same procedure as for the hay bales path.

However a special procedure has to be followed for thresholding effectively.

Typical images of the alleyway seen by the camera in the grove are shown below:





































Figure 3-44. Typical images for navigation in citrus grove

It was expected that during the course of the day, as the sun's position changes, the amount

of shadow on the trees and the path varies. Also the color saturation of the trees and the path

varies to an extent. It was felt that an analysis of this variation is necessary to ensure proper

design of the algorithm.

In the months of July and October of 2004, trips were made to the grove for three

consecutive days. Two cameras were mounted on a tripod on the hood and top of a truck to

simulate the camera being mounted on the tractor. Two cameras were used to study the effect of

the height of the camera on the image. Pictures were taken from 9 am to 6 pm at half-hour

intervals. These pictures helped to analyze the intensity variation in the images over an entire

day. The effects of clouds were also included. These images helped rule out concerns about

varying light conditions.



































Figure 3-45. Picture collection in the grove

As observed with the sample images shown earlier, in some cases, the shadow tends to

cover a large portion of the path. This type of shadowing occurred mainly in the images taken at

early mornings till about 10am and evenings after 4pm. During the middle of the day, the

brightness in the image was evenly spread without much shadow. This variation can be attributed

to the motion of the sun through the day at various angles to the horizon. Another variation of the

brightness or shadow occurs between summer and winter because of the change in the sun's declination to the earth. Another cause of variation is the direction in which the rows are planted, i.e., north-south, east-west, or any other direction. Among all these variations, the

common factor is that the brightness can change during the day and the shadows may or may not

be present.

Segmentation. The field of view of the camera in the grove contains the grove alleyway. This scene consists of trees and path or ground. This can be seen in the images shown before. As with










the hay bale track, the R, G, and B intensities of trees and ground can be plotted and is shown in

the Figure 3-46. This plot was created using images taken from the database corresponding to

both trees and ground. It is observed that for a threshold value of 100, trees and ground can be

segmented well based on color.


[Figure: number of pixels versus intensity (0-255) for the R, G and B channels of trees and ground]

Figure 3-46. R, G, B intensity plot for ground and trees in grove alleyway

However, to accommodate the changing shadow or brightness scenario, the thresholding

algorithm has to be modified. This method uses the prior information that an image with a lot of

shadows tends to have dark trees on one side and bright trees on the other. In Figure 3-47, the two extreme images are shown, where one image corresponds to uniform brightness and the

other corresponds to image with shadows. In the images with shadows, trees on one side are

darker than the trees on the other. That is, they have very low intensities on one side of the trees.










[Figure: two alleyway images and their green channel intensity profiles plotted against pixel position across the image]

Figure 3-47. Images and their green intensity profiles across the center of the image. A) Non-shadow image. B) Shadow image.

In the Figure 3-47, the plots show the green channel intensity profiles of the non-shadow and shadow images, respectively. In non-shadow images, there is low intensity on the left side

corresponding to the trees on the left, high intensities in the middle corresponding to the path and


again low intensities in the right corresponding to the trees. For the shadow image, there is high

intensity on the left corresponding to one side of the trees and half the path, and low intensities

on the right corresponding to half the path and the right-side trees. It should also be noted that in

the shadow images, the low intensities are mostly below 50 and in non-shadow images, the low

intensities are mostly below 100.

In this method, it is initially assumed that there are many shadows in the image. A

threshold value corresponding to this assumption, of the order of 50, is used and the image is thresholded.
































Figure 3-48. Thresholding. A) Shadow image. B) Non-shadow image. C) Thresholded shadow image. D) Thresholded non-shadow image.

Figure 3-48 shows how a shadow image and non-shadow image are segmented by using a

low threshold. After thresholding, the number of pixels classified and the locations of classified

pixels in the image are calculated. If the assumption that it is a shadow image was true, a lot of

pixels are classified as trees in one half of the image, corresponding to the shadow image profile.

This way, trees on one side have been found. Then the algorithm proceeds as with the hay bales,

but finds trees only on one side. Then the vision algorithm tries to keep the vehicle at a fixed

distance of half the aisle width from one side which has been identified as trees.

If the assumption was not true, then not many pixels are classified as trees on one side.

Then it is concluded that there is not much shadow in the image and a new threshold value,

higher than the previous value, is used. Now the trees on the two sides are found leaving out the

path (Figure 3-49). The algorithm follows the rest of the procedure as with hay bales. The

algorithm finds the path center between the trees on either side.
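A minimal sketch of this two-stage decision is given below, using a single intensity channel for brevity; the 50 and 100 thresholds are from the text, while the one-sided pixel-fraction test (0.3) is an illustrative assumption:

    import numpy as np

    def segment_trees(gray):
        """gray: H x W intensity image -> (tree mask, 'shadow' or 'no_shadow')."""
        dark = gray < 50                      # first assume a shadow image
        h, w = gray.shape
        left = np.count_nonzero(dark[:, :w // 2]) / dark[:, :w // 2].size
        right = np.count_nonzero(dark[:, w // 2:]) / dark[:, w // 2:].size
        if max(left, right) > 0.3:            # many tree pixels on one half
            return dark, "shadow"
        return gray < 100, "no_shadow"        # re-threshold to find both tree rows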








































Figure 3-49. Classified images without and with shadows

In the Figure 3-49, the left image is a segmented non shadow image and the image on the

right is a segmented shadow image. Based on the positioning of the camera, it is observed that

the lower half of the image contains information corresponding to a distance from 10 ft to 20 ft in

front of the vehicle. It was decided that the useful information for keeping the vehicle in the

center of the path is in this lower half of the image. Hence processing is restricted to the lower

half of the image. This reduces the processing time considerably.

Error calculation. The camera is mounted on the center of the vehicle's chassis. The current lateral position of the vehicle corresponds to the image center. That is, the image coordinate of the current position is 320.









Desired position = (Right side tree boundary + Left side tree boundary) / 2, for non-shadow images (Eq. 3-11)

Desired position = Right side tree boundary - Fixed distance, for images with shadow (Eq. 3-12)

Error = Desired position - Current position (measured in pixels) (Eq. 3-13)

Pixel distance to world distance conversion. It is necessary to determine the path errors

in real world distances by converting pixel distances to real world distances. The camera is

positioned at a fixed angle, and so the look-ahead distance is fixed. A meter ruler was placed in

front of the tractor in the camera' s field of view. From the pixel length of the ruler seen in the

image, the pixel to world conversion was determined as

100 cm = 177 pixels

However, an object in the lower half of the image looks bigger than if it were in the upper

half of the image. A correction factor of 10 cm is necessary depending on where the distance is

measured in the image. This variation was taken into account in the algorithm.

Error in position = Error in pixels × 100 / 177 (cm) (Eq. 3-14)

The real distance error information is then passed to the control system implemented in the

microcontroller. The control system uses this error to steer the vehicle to the center of the path.

The flowchart of the overall vision algorithms is given in the Figure 3-50.

The operating speed of the vision algorithm, that is, the rate at which an image is acquired and processed to obtain the path navigation error, is about 15 Hz to 20 Hz.
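Putting Eq. 3-11 to 3-14 together, the per-frame error computation can be sketched as below. The image center column (320) and the 177 pixels per 100 cm calibration are from the text; the fixed distance of half the aisle width, taken here as 265 pixels (about 1.5 m), is an illustrative assumption:

    IMAGE_CENTER_X = 320          # current lateral position of the vehicle
    PIXELS_PER_100_CM = 177.0     # from the ruler calibration

    def path_error_cm(left_px, right_px, shadow, fixed_distance_px=265):
        if shadow:
            # Eq. 3-12: keep a fixed distance from the visible tree row
            desired = right_px - fixed_distance_px
        else:
            # Eq. 3-11: midpoint between the two tree boundaries
            desired = (right_px + left_px) / 2.0
        error_px = desired - IMAGE_CENTER_X            # Eq. 3-13
        return error_px * 100.0 / PIXELS_PER_100_CM    # Eq. 3-14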
























































[Flowchart: vision processing steps ending with sending the error to the microcontroller via serial communication]

Figure 3-50. Machine vision algorithm flowchart

Required heading of the vehicle using vision. The vision algorithm uses color based

segmentation of trees and ground followed by morphological operations to clean the segmented













image. The path center is determined as the center between the tree boundaries. The error is

calculated as the difference between the center of the image and the path center identified in the

image. The error is converted to real world distance using prior pixel to distance calibration. To

calculate the required heading of the vehicle, a line is fit for the path center in the image

representing the entire alleyway. The angle between this line and the image center line is

determined as the required heading for the vehicle to navigate the alleyway with low lateral

error.
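A small sketch of the heading computation follows. It assumes the path center line is parameterized as x = a + b*y down the image rows (near-vertical lines fit poorly as y = f(x)), which is an assumption about the implementation rather than a detail stated in the text:

    import math

    def required_heading_deg(b):
        """Slope b (lateral pixel shift per image row) -> heading angle in degrees
        between the fitted path center line and the vertical image center line."""
        return math.degrees(math.atan(b))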


[Figure: segmented image with the vehicle center line, the projected true path center line and the required heading angle between them]

Figure 3-51. Calculating the required heading of the vehicle

The Figure 3-51 shows the result of vision based segmentation. The solid line (blue) shows the true center of the path and the dashed line (red) shows the current center line of the vehicle.

The angle between the true path center line and the vehicle center line is the required heading of

the vehicle in the path for long term accurate navigation.

Laser radar (Ladar) for path detection

The ladar is a centimeter level accurate planar distance measuring sensor. It is well suited

for navigation problems in which the vehicle must navigate through narrow passages. The path










width in an orange grove is not big enough for the tractor to avoid an obstacle by going around it.

However, ladar can be used for navigating through the pathway.

Data Obtained from ladar. The ladar used in this study was a SICK LMS 200, which

scans horizontally through an angle of 180° or 100°, depending on the mode selected, giving a distance value corresponding to each angle. The ladar was operated at 38,400 baud using RS232 serial communication for guiding the tractor. The ladar can be operated in several modes, e.g. 180° at 1 degree or 0.5 degree increments, and 100° at 1 degree or 0.5 degree increments. For navigating the test path, a scan of 100° at 0.5 degree increments was used. This was because the path and the boundaries could be detected within the 100 degree range and 0.5 degree increments give good accuracy. The ladar transmitted data at a rate of 7 Hz using RS232.

[Figure: radial distance versus scan angle (degrees); the two depressions in the curve correspond to the left and right hay bales, and a window slides across the scan]

Figure 3-52. Radial distance plotted against angle (180 degree at 0.5 degree increments)









For the explanation of the ladar based path finding, consider the ladar mounted on the

tractor and scanning the hay bale track in front of the tractor from 0 to 180 degrees at 0.5 degree

increments. The Figure 3-52 shows the radial distance as a function of the angle obtained from

the ladar when scanning the test path with hay bale boundaries. The two major discontinuities or

depressions in the plot correspond to the radial position of the hay bales with respect to the ladar.

Error Calculation. For determining the exact location of the hay bales, a data window of

length 4 distance measurements is moved across the distance data from 0 to 180 degrees. When

all of the four positions in the window encounter a drop below a threshold value, it is decided

that there is a hay bale corresponding to the left boundary. After finding one hay bale, the

window is moved until it finds another hay bale, which corresponds to the right side boundary.

The sliding window also eliminates any point noise. Since the ladar is placed at the center of the

tractor, the distance of the tractor center from the hay bales can be calculated as

Distance = Radial distance at the hay bale × cosine(angle at that point) (Eq. 3-15)

Using the above formula, the error of the tractor position in the path is calculated as

Error = (Distance from right boundary - Distance from left boundary) / 2, in cm (Eq. 3-16)

As with the vision algorithm, the error information determined above is sent to the control

system implemented in the microcontroller using RS232. Using this error, the steering control

system tries to get the tractor back to the center of the path. The flowchart of the ladar based

guidance algorithm is shown in the Figure 3-53. The operating speed of the process of collecting

the ladar data and processing the data to find the path navigation error is about 7 Hz. This speed

is limited by the RS-232 serial communication standard.
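The sliding window and the error computation of Eq. 3-15 and 3-16 can be sketched as below; the 300 cm range threshold is an illustrative assumption:

    import math

    def find_boundaries(ranges_cm, angles_deg, threshold_cm=300, win=4):
        """Scan the ladar data; return ((range, angle) left, (range, angle) right).

        A hay bale is declared when all `win` consecutive readings drop below
        the threshold; the scan then skips ahead until the range rises again."""
        boundaries = []
        i = 0
        while i <= len(ranges_cm) - win and len(boundaries) < 2:
            if all(r < threshold_cm for r in ranges_cm[i:i + win]):
                boundaries.append((ranges_cm[i], angles_deg[i]))
                while i < len(ranges_cm) and ranges_cm[i] < threshold_cm:
                    i += 1
            else:
                i += 1
        return boundaries[0], boundaries[1]

    def lateral_error_cm(left, right):
        d_left = left[0] * abs(math.cos(math.radians(left[1])))     # Eq. 3-15
        d_right = right[0] * abs(math.cos(math.radians(right[1])))
        return (d_right - d_left) / 2.0                             # Eq. 3-16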






























[Flowchart: slide the window across the scan looking for discontinuities, find the left and right hay bales, and take the path center as the midpoint between them]

Figure 3-53. Ladar algorithm flowchart

Serial communication between PC and microcontroller

RS 232 serial communication was used to send error information from the PC to the

microcontroller based on the calculations from vision and ladar algorithms. Communication was









established at 115,200 baud rate. A protocol was required for communication. The

microcontroller had the capability to read one byte at a time.

A simple protocol of sending a predefined character string followed by 4 bytes of

information was designed.

Character Sign Highest byte Middle byte Lowest byte
Figure 3-54. Serial Protocol

The character byte indicated the start of a new data. The sign byte indicated the sign of the

error, i.e. a negative sign indicated a left turn and a positive sign indicated a right turn. Using a

resolution of 1 cm, three bytes were sufficient to send error information up to 1000 cm or 10 m,

making the range -10 m to +10 m. A higher path error is not expected to occur in the known

environments.
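A sketch of the packet construction is given below. The field order (start character, sign, then three magnitude bytes at 1 cm resolution) is from the text; the choice of 'E' as the start character and the one-digit-per-byte encoding are illustrative assumptions:

    def pack_error(error_cm):
        """Build the 5-byte serial packet for a signed path error in cm."""
        if not -999 <= error_cm <= 999:
            raise ValueError("error outside the +/-10 m protocol range")
        sign = b'-' if error_cm < 0 else b'+'
        magnitude = abs(error_cm)
        # highest, middle, lowest byte (hundreds, tens, ones of the magnitude)
        digits = bytes([magnitude // 100, (magnitude // 10) % 10, magnitude % 10])
        return b'E' + sign + digits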

Operating speed of the guidance system

The operation of the different sensors, vision based guidance and ladar based guidance

system are discussed above. It can be observed that all of these operate at different speeds. What, then, is the operating speed of the overall guidance system? The software for the guidance system was developed such that each sensor process operates independently of the others. The vision process updates the path error at a rate of about 15 Hz. The steering voltage to the servo-valve is updated at a rate of about 50 Hz from the microcontroller. However, the steering update is a filtered and

smoothed update to prevent chattering of the hydraulic valve. The slowest sensor process would

be vision for vision based guidance and would be ladar for ladar based guidance. These two

primary sensors and their accompanying algorithms limit the path update speed of the overall

guidance system. Therefore, it is not a single speed that governs the entire guidance system.










Sensor Fusion Based Autonomous Steering

Performance of independent sensor based autonomous steering

Machine vision based autonomous steering. The development of an autonomous vehicle

guidance system for citrus grove navigation using machine vision based guidance systems was

discussed earlier. In custom designed test tracks, machine vision guided the tractor through the

path with a maximum error of 6.1 cm from the path center when traveling at a speed of 4.4 m/s. From the tests conducted in a test path, it was found that the vision based guidance was able to keep the vehicle in the middle of the path in several types of paths such as straight and curved paths. The machine vision based guidance system also guided the tractor through an alleyway of a citrus grove, keeping the vehicle, visually, in the center of the path with no collision with the trees. In the grove, the uneven tree canopy in citrus grove alleyways poses problems for guidance when using vision alone. With the vision based guidance configuration used in this research, for example, with a forward facing camera, branches closer to the vehicle than the camera's field of view are not accounted for in the vision based guidance. Also, navigating locations near the end of the row is not feasible with the vision system configuration as the last trees go out of the field of view of the camera.

Ladar based autonomous steering. In custom designed test tracks, ladar also guided the

tractor through the path with a maximum error of 6.1 cm from the path center when traveling at a speed of 4.4 m/s. From the tests conducted in a test path, it was found that ladar based guidance was more accurate than vision based guidance in well defined paths within a speed of 3.1 m/s. In citrus grove conditions, the ladar mount configuration used in this research, with a wider field of view and close range, is useful for guiding the vehicle with a higher accuracy than vision. However, in a given alleyway, it is not uncommon to find a few trees absent. In such cases, ladar based guidance gives erroneous paths. Also, the short range accurate navigation does not account for longer range changes in the path and sometimes causes jerky motion of the steering. The ladar based steering algorithm developed in the test track is not directly adaptable to the grove environment. While, in the test track, the path boundaries using hay bales were solid objects, the tree canopy in the grove is not. This generates significant noise in the ladar data, rendering the algorithm inefficient. Therefore, filtering of this noise is required when operating in groves.

Need for sensor fusion based autonomous steering

The machine vision based autonomous steering method is easily adaptable from a custom

designed test track to the citrus grove environment. Its long range aids in faster decision making

before a hard turn is to be performed or for long distance detection and is quite robust to minor

variations in the path boundaries. However, it lacks the centimeter level accuracy of laser radar

in detecting the distance to close range path boundaries or obstacles. The ladar's accuracy is

useful in keeping the vehicle accurately positioned in the path. However, this accuracy makes the

information noisy if there are minor variations in the path boundaries. Fusing the complementary

information from these sensors, namely the long range information from vision and short range

accuracy of the ladar, would be beneficial.

Several methods for fusion have been reported in the literature. In our research, the tree

canopy causes the sensor measurements to be fairly noisy. So the choice of method was

narrowed down to be the one that could help in reducing the noise and also aids in fusing the

information from the sensors. Popular methods such as neural network, behavior based methods;

fuzzy logic and voting are not widely used primarily for reducing noise. Kalman filtering is a

widely used method for eliminating noisy measurements from sensor data and also for sensor

fusion. Based on the good results obtained in the previous research for fusion and noise

reduction, a Kalman filter was selected as the method to perform the fusion and filter the noise in

sensor measurements. During the development of the ladar based guidance system, the speed of









the ladar data update was about 7 Hz and was limited by RS-232 serial communication. At the time of development of this sensor fusion based approach, a high speed serial communication card was available and so the operating speed was about 30 Hz.

Kalman filter for autonomous guidance

Kalman filtering is a recursive solution to the data filtering problem. In other words, it is a set of equations that provides a recursive method to estimate the state of a process such that the mean squared error is minimized. There are several forms of the Kalman filter implementation algorithms. In this research, one simple discrete method of implementation was used. In this form, the Kalman filter estimates a state at a given time and then obtains measurements as feedback to improve future estimations. This could also be called a prediction followed by correction type method. The equations of the filter performing the prediction are called time update equations and the equations of the filter performing the correction are called the measurement update equations (Figure 3-55).


[Figure: initialization step feeding a loop between the time update (predict) and the measurement update (correct)]

Figure 3-55. Illustration of Kalman filter operation

The time update equations are

x(k+1)= A x(k)+ w(k) (Eq. 3-17)

P(k+1) = AP(k)AT + Q(k) (Eq. 3-18)

where x is the state vector, k is the sample number, A is the state transition matrix relating

the state at sample k and k+1, w is the noise vector of the process, Q is the covariance matrix of









the process noise and P is the state estimate error covariance matrix. P gives a measure of the

estimated accuracy of the state estimate x(k+1). The 1st equation of the time update performs the

operation of estimating the state one sample ahead and the 2nd equation projects the error

covariance one sample ahead.

Once the time update equations are executed and the states have been predicted for the

next sample k+1 using the above equations at sample k, the system performs its operations and

reaches the next sample k+1. At sample k+1, measurements are obtained. The measurement can

be related to the state using the equation

z(k) = H x(k) + u(k) (Eq. 3-19)

where z is the vector of measurements, H is the observation matrix and u is the vector of

measurement noise. This equation is not directly used in implementing the filter, but helps in

understanding the operation of the filter. The measurement update equations used in the filter are

G(k) = P(k+1) H^T (H P(k+1) H^T + R)^-1 (Eq. 3-20)

x(k+1) = x(k) + G(k) (z(k) - H x(k)) (Eq. 3-21)

P(k+1) = (I - G(k) H) P(k+1) (Eq. 3-22)

where G is the filter gain and R is the covariance matrix of the measurement noise. The

first step in the measurement update is to compute the filter gain. The filter gain is used to make a

posteriori estimate of the state x at sample k+1, using the measurement z as given in equation 3-

21. The filter gain G is the factor used to minimize the posteriori error covariance P. Finally a

posteriori estimate of the state estimate error covariance P is calculated as given in equation 3-

22. These new estimates for x and P are used to predict the state at sample k+2 and so on.

Further details of the Kalman filter are given in Zarchan (2005).









In the first step to use the filter, a model of the real world must be created by using a set of

equations in the state space form. For this, let us first define some variables for this research.



[Figure: top view of the vehicle between the path boundaries, showing the path center and the state vector variables]

Figure 3-56. Vehicle in the path with the state vector variables

As mentioned before, the aim is to filter the data from both vision and ladar to remove

noise and to use the complementary information obtained from them for better navigation.

Therefore, the error of the vehicle from the path center is the primary variable being considered

for fusion and filtering. In the Figure 3-56, the vehicle is shown in the path at a certain possible

arbitrary position to the left of the path center at time step 't'. From concepts in physics, it is

known that given a position of a vehicle, to predict the position of the vehicle after some time, it

is necessary to know the velocity of the vehicle and the direction of travel. In this project, the

velocity of the vehicle can be obtained using an ultrasonic speed sensor on the tractor and the

direction of the vehicle can be obtained using the IMU mounted in the tractor.

Time update model. From the Figure 3-56, let the path error be represented by d ∈ R, the forward velocity of the vehicle v ∈ R and the change in heading be Δθ ∈ R. The path error at the next time step 't+1' can be written as

d(t+1) = d(t) + v sin(Δθ) (Eq. 3-23)









Depending on the direction of the vehicle, the path error could either increase or decrease.

The operating speed of the vision algorithm is about 15 Hz and the ladar algorithm is about 30 Hz. The sensor fusion based guidance update would be limited by the operating speed of the vision algorithm, which is 15 Hz. The common operating speed of a tractor in the grove is about 3.1 m/s. Comparing these values, for one sample update, the vehicle would travel a distance of 3.1 m / 15 = 0.2 m, or 20 cm. For a forward travel of 20 cm, the heading of the vehicle is not expected to change

significantly. The required heading of the vehicle, θ_R ∈ R, is also not expected to change

significantly. The tractor does not possess speed control and so the tractor was expected to be

operated at a constant speed during the experiments. Therefore, the heading (θ) and the speed (v)

cannot be expected to change significantly between successive time steps. Therefore it would be

reasonable to state that

θ(k+1) = θ(k) (Eq. 3-24)

v(k+1) = v(k) (Eq. 3-25)

θ_R(k+1) = θ_R(k) (Eq. 3-26)

Since the heading and the velocity directly affect the path error 'd', it is reasonable to filter out

noise from these measurements as well. The required heading of the vehicle can be used to limit

the overall direction of travel and so it is an important variable. Therefore, all of these variables

were included in the state vector x ∈ R^4 of the Kalman filter.

x(k) = [d(k) θ(k) θ_R(k) v(k)]^T

Based on the above reasoning, the state transition matrix, A(k) ∈ R^(4x4), is defined as

        | 1 0 0 Δt·sin(Δθ) |
A(k) =  | 0 1 0 0          |
        | 0 0 1 0          |
        | 0 0 0 1          |









The change in heading Δθ is calculated using the measurements from the IMU at each time step. As observed, this is a measurement value obtained from a sensor. The use of a variable sensor input such as Δθ in the state transition matrix is consistent with the method described in

Kobayashi et al. (1998). It is to be noted that only the change in heading is used for estimating

the position error, which affects the overall guidance. The absolute heading is included in the

filter to remove noise from the IMU measurements.

The process noise w is assumed to be white noise with a mean value of zero. Hence w=0.

The estimation process is difficult to directly observe, hence the process noise covariance is

uncertain. Therefore the initial variance values, which are the diagonal elements in the matrix,

were assumed as follows: variance for process position noise 2cm, variance for process current

heading noise 0.01degree, variance for required heading noise 0.01degree and variance for

process velocity noise 10-4 m/S. These values were intuitively assumed to be the amount of noise

variance that the process could add. As discussed further in this paper, the process noise

covariance matrix Q, is updated at each step of the operation of the Kalman filter. Therefore, the

initial assumptions of the matrix values do not significantly affect the overall guidance of the

vehicle. Kalman filter can work even when the precise model of the system is not available.

These sensor measurements were assumed to be independent of the other sensors. This is done

for simplicity and later it is observed from simulations and experiments that the guidance of the

vehicle, with this assumption, is accurate enough to keep the vehicle close to the alleyway center.

Hence the covariance values, which are the off-diagonal elements, are zero.

Q = diag(q_d, q_θ, q_θR, q_v) = diag(2, 0.01, 0.01, 10^-4)










Using this model of the process and the values for the parameters, the time update

equations of the Kalman filter can be implemented. The Kalman filter can be considered a subset of the statistical methods because of the use of statistical models for noise.

Measurement update model. To implement the measurement update equations, the

measurement model is required. The measurements obtained from the various sensors and which affect the variables in the state x are the lateral position error of the vehicle in the path from the vision algorithm, denoted by x_C(k) ∈ R, the lateral position error of the vehicle in the path from the ladar algorithm, denoted by x_L(k) ∈ R, the required heading of the vehicle determined from the vision algorithm, denoted by θ_C(k) ∈ R, the current heading of the vehicle measured using the IMU, denoted by θ_IMU(k) ∈ R, and the vehicle linear velocity measured using the speed sensor, denoted by v(k) ∈ R. Therefore, the measurement vector z ∈ R^5 can be given by

z(k) = [x_C(k) x_L(k) θ_IMU(k) θ_C(k) v(k)]^T

Comparing the variables in x and z, it is observed that the path error d is affected by the measurements x_C and x_L. The current heading of the vehicle θ is affected by the measurement θ_IMU and the required heading θ_R is affected by the measurement θ_C. Therefore the observation matrix H can be defined as

     | 1 0 0 0 |
     | 1 0 0 0 |
H =  | 0 1 0 0 |
     | 0 0 1 0 |
     | 0 0 0 1 |

The measurement noise, denoted by u(k) is assumed to be a Gaussian with zero mean with

standard deviations estimated experimentally with the different sensors, i.e. u =0. To determine

the measurement noise covariance matrix R, the vehicle was set stationary in the middle of the

path, each sensor (except the speed sensor) mounted on the vehicle was turned on independently









and the information from the sensors was collected for 30 seconds. For the camera, the vision

algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error and the heading

angle determined was recorded. For the ladar, the ladar path segmenting algorithm (Subramanian

et al., 2006) was run for 30 seconds and the path error measurements were recorded. For the

IMU, the sensor was run for 30 seconds and the heading angle measured by the sensor was

recorded. For the speed sensor, the vehicle was run at 3 different constant speeds and the speed

sensor measurement was collected for 30 seconds. The measurement noise covariance matrix,

denoted by R, was determined from these recorded measurements and was found to be

R = diag(σ_C², σ_L², σ_θIMU², σ_θC², σ_v²) = diag(1.07, 0.15, 0.0017, 0.0001, 0)

where the variance of the error in lateral position of the vehicle, denoted by σ_C², was determined from the vision algorithm measurements, the variance of the error in lateral position of the vehicle, denoted by σ_L², was determined from the ladar algorithm measurements, the variance of the error in heading, denoted by σ_θC², was determined from the vision algorithm, the variance of the error in heading, denoted by σ_θIMU², was determined from the IMU measurements and the variance of the error in velocity, denoted by σ_v², was determined from the speed sensor measurements. x_C and θ_C are measurements from the same camera. However, the vision algorithm uses different regions of the image to estimate them. x_C, x_L, θ_IMU and v are measurements from different sensors operating independently of each other. These measurements

were assumed to be statistically independent. Therefore, the different sensor measurements were

assumed to be statistically independent of each other. Hence the covariance values are zero. The










variance of the velocity measurements was zero. This can be attributed to the low resolution

(1 mph) of the speed sensor. Using the above parameters, it is possible to implement the

measurement update equations.
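Collecting the time and measurement update equations with the A, Q, H and R defined above gives the following sketch of one filter step (the variable names are illustrative):

    import numpy as np

    Q = np.diag([2.0, 0.01, 0.01, 1e-4])            # process noise covariance
    R = np.diag([1.07, 0.15, 0.0017, 0.0001, 0.0])  # measurement noise covariance
    H = np.array([[1, 0, 0, 0],    # vision lateral error
                  [1, 0, 0, 0],    # ladar lateral error
                  [0, 1, 0, 0],    # IMU heading
                  [0, 0, 1, 0],    # vision required heading
                  [0, 0, 0, 1]], dtype=float)   # speed sensor

    def kalman_step(x, P, z, d_theta, dt):
        """x = [d, theta, theta_R, v]; z = [x_C, x_L, theta_IMU, theta_C, v]."""
        A = np.eye(4)
        A[0, 3] = dt * np.sin(d_theta)       # only d changes, by v*dt*sin(d_theta)
        x_pred = A @ x                       # Eq. 3-17
        P_pred = A @ P @ A.T + Q             # Eq. 3-18
        G = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # Eq. 3-20
        x_new = x_pred + G @ (z - H @ x_pred)                    # Eq. 3-21
        P_new = (np.eye(4) - G @ H) @ P_pred                     # Eq. 3-22
        return x_new, P_new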

Reliability factor of primary guidance sensors in the Kalman filter and fuzzy logic sensor
supervisor

Reliability of the sensors. A problem in using a simple Kalman filter in our research is

that the reliability of the information from the sensors depends on the type of path in the grove.

As discussed previously, ladar is accurate at short range for the given mounting arrangement and

machine vision is good at providing overall heading of the vehicle. By experimentation in a

citrus grove, the following observations were made: tree foliage is highly variable, trees can be

absent on either or both sides of the path and some trees can be small enough for ladar to not

recognize them as trees. Therefore, if a white noise model is assumed for the measurements, the

reliability factor of a sensor in the Kalman filter has to be constantly updated. In such a variable

path condition, it is not feasible to have constant noise values for the ladar and vision i.e. have

constant measurement noise covariance matrix R. The covariance matrix R tells the Kalman

filter about how believable or true the sensor measurements are. If the covariance value of a

measurement is low, the filter believes the measurements to be accurate or reliable and if high,

the filter believes the measurements to be erroneous. This type of update can be used to update

the filter on the nature of the sensor at different locations of the grove. For example, when the ladar does not recognize trees on either side, guidance is more reliable if the measurement noise for the ladar is increased to a high value so that vision takes over as the reliable guidance sensor. Similarly, when the vehicle gets to the end of a tree row, vision is no longer useful as the citrus trees are absent from the field of view of the camera, so ladar can be made the sole guidance sensor to cross the last tree by increasing the measurement noise










corresponding to vision measurements. A method was required to perform this update to R based

on the presence of trees in the field of view of the sensor. A fuzzy logic system was used to

decide the reliability of the sensor. Fuzzy logic was chosen as the method because it is intuitive

to implement and the knowledge to implement fuzzy logic was available from previous research. The reliability is changed in the Kalman filter by changing the measurement noise

covariance matrix R. This is discussed in the following section.

Fuzzy logic inputs. A fuzzy logic based supervisor was used to decide which sensor is

more reliable at different locations in the grove alleyway. The input to the fuzzy logic supervisor

is the horizontal distance of the vehicle center line from the trees on either side of the vehicle.

This distance is in a direction perpendicular to the direction of motion of the vehicle. Both

vision and ladar algorithms input their corresponding distance values to the fuzzy logic

supervisor. Altogether, there are four input values; vision left tree distance, vision right tree

distance, ladar left tree distance and ladar right tree distance. These inputs are sent to the

Fuzzifier. The structure of the overall fuzzy logic supervisor is shown in the Figure 3-57.

[Figure: the four inputs (vision left, vision right, ladar left, ladar right) pass through the fuzzifier, the inference engine with its rule base, and the defuzzifier to yield the decision: vision, ladar or both]

Figure 3-57. Fuzzy logic implementation

Fuzzy quantification of knowledge. These inputs are then divided into three linguistic

variables: reasonable, unreasonable and zero. The input membership functions are shown below.

The horizontal axis is the distance values and the vertical axis is the corresponding normalized

value on a scale of 0 to 1. The linguistic variables were chosen based on the input distance

values.









[Figure: membership functions for the linguistic variables over distances of 1 m, 2 m and 3 m]

Figure 3-58. Membership function for linguistic variables

The input membership function relates the input x, which is the distance value, to the

linguistic variables reasonable, unreasonable and zero. The meaning of the linguistic variables is

literally whether the distance from the tree row is reasonable or not. A zero membership for the

ladar indicates the presence of an obstacle in the path and for vision, the end of the row. x takes the value of zero for 0 to 1 m, 1 m to 3 m as reasonable and any value greater than 2 m as unreasonable. As mentioned earlier, the vision and ladar sensors are mounted along the center line of the vehicle. If the vehicle were properly navigating the alleyway, the distance of the row trees from the sensors should be a minimum of 1 m to account for the vehicle's width and the trees should be at a distance of maximum of 3 m to be present in the row in which the vehicle is navigating. Therefore, the reasonable range for a row tree would be 1 m to 3 m.

1 m < x < 3 m : Reasonable distance of tree (Eq. 3-27)

If the tree is detected beyond a distance of 2 m, the tree is unlikely to correspond to a row tree in the current row and is possibly a tree in an adjacent row. Therefore, this is an unreasonable distance for a row tree.

x > 2 m : Unreasonable distance to tree (Eq. 3-28)

If in the vision algorithm, no tree is detected in the field of view or if a tree is detected in

the middle of the path, then this is likely to be a case of the vehicle coming to the end of the row

or near the headland. If a tree is detected in the middle of the path, it is likely to be a headland









tree not part of the row. Detecting a tree in the middle corresponds to a horizontal distance less than 1 m from the vehicle center line. This corresponds to the linguistic variable zero. Not detecting any tree is also placed in the linguistic variable zero for convenience. If in the ladar algorithm, a tree is detected at a distance less than 1 m, it is likely to be due to the presence of an obstacle in the path. This is also assigned the linguistic variable zero.

x < 1 m : Zero (Eq. 3-29)

Suppose for example, that the values of x at an instant are

x(left vision) = 1.5 m and x(right vision) = 2.5 m

x(left ladar) = 1.5 m and x(right ladar) = 1.5 m

Normalization of the linguistic variables was performed at a scale from 0 to 1. Therefore,

the normalized values are

x(left vision,normal) = [1,reasonable]

x(right vision,normal) = [0.5,reasonable] and [0.5, unreasonable]

x(left ladar,normal) = [1,reasonable]

x(right ladar,normal) = [1,reasonable]

This gives the normalized possible values for each variable after fuzzification. Here we

have 5 possible values. In general, there could be examples where we could have a minimum of

4 possible values or a maximum of 8 possible values. It is complex to specify rules for all these

possibilities. It is also not necessary to analyze all these possibilities separately.

If a sensor performance is degraded, then it should degrade the measurements for both left

and right side measurements. In the above example, x(right vision,normal) has 50% chance of

being reasonable or unreasonable and x(left vision,normal) is 100% reasonable. In such a case,

we make a premise to group the possibilities for both reasonable and unreasonable cases and










reduce the linguistic variable for each sensor as a whole. That is, combining the left and right

variables for vision, we have two possibilities for reasonable (1 and 0.5) and one possibility for

unreasonable (0.5). If we combine the possibilities for reasonable and choose the minimum value

for reasonable, we get

x(vision, normal) = [0.5,unreasonable] and [0.5, reasonable]

If two unreasonable values were present, the higher value of those would be chosen as it is

better to consider the system to be erroneous rather than accurate. Similarly for ladar, we have

x(ladar, normal) = [1,reasonable]

We are now left with 3 possibilities. By performing this operation of grouping the

possibilities for each sensor, we are always guaranteed to obtain only a maximum of 4 linguistic

variables for the four distances. These values are used with the rule set.

Table 3-4. Fuzzy logic rule set for sensor supervisor

Vision Left/Right   Ladar Left/Right          Decision
Reasonable          Reasonable                Both
Reasonable          Reasonable/Zero           Ladar higher
Reasonable          Zero                      Stop
Reasonable          Unreasonable              Vision
Reasonable          Unreasonable/Zero         Ladar
Reasonable          Reasonable/Unreasonable   Vision higher
Zero                Unreasonable              Ladar
Zero                Reasonable/Zero           Ladar higher
Zero                Reasonable                Ladar
Zero                Unreasonable/Zero         Ladar
Zero                Reasonable/Unreasonable   Both










Rule base. The rule set for the supervisor is shown in Table 3-4. Based on the linguistic

variables chosen from vision and ladar, the corresponding rules are fired. For our example, the

rule corresponding to vision-unreasonable and ladar-reasonable is first fired and the decision is

to use ladar. The second rule of vision-reasonable and ladar-reasonable is also fired and the

decision is to use both. From the rules fired, the output membership function is chosen. The

membership value is used to find the crisp value for the output. The output membership function

decision relates the rules fired to the linguistic variables vision, ladar and both. The output

membership function is shown in the Figure 3-59. These variables represent the decision made as

to whether vision or ladar is given higher priority or if both should be given equal priority. As an

exception, during the presence of an obstacle, the ladar takes precedence until the obstacle is avoided. The linguistic answers to the problem cannot be implemented directly on a computer, so a numerical answer is required. For that, the crisp value of the decision is taken as the

measurement noise covariance value for a sensor.



[Figure: crisp output membership functions labeled Vision, Both and Ladar over the decision axis from -3cm to 2cm]

Figure 3-59. Crisp output membership function

Premise quantification. For our example, for one rule, we obtained a decision of ladar

with the values 0.5 and 1 and for another rule, a decision of both sensors with values 0.5 and 1.

That is, we have two possibilities for each decision. We now have to determine which conclusion

should be reached when the rules that are on are applied to the final decision making. To reduce

the possibilities, we then use a premise of minimum to infer the decision. That is,










Min(0.5, 1) of ladar = 0.5 for using ladar

Min(0.5, 1) of both = 0.5 for using both

This leaves just one possibility for each decision. The next step is to obtain the numerical

value.

Inference. Defuzzification was done using the center of gravity method. The center of

gravity method is used to obtain a crisp value of output using the output membership function.

The center of gravity method is given by



u_crisp = ( Σi bi · ∫μi ) / ( Σi ∫μi ) (Eq. 3-30)

where bi is the center of the output membership function where the decision reaches its peak value and ∫μi is the area spanned by the membership function in the decision function. For our example, one decision

of using ladar has been obtained. Therefore, we have to raise the measurement noise for vision

and so we use the output membership function corresponding to vision. For the decision of using

both, we use the output membership function corresponding to 'both'.


[Figure: output membership functions Vision, Both and Ladar with the fired membership values of 0.5 marked on the decision axis]

Figure 3-60. Illustration of inference for the example

The Figure 3-60 shows how the crisp value is obtained. For the first decision of using ladar

with the value of 0.5, the noise covariance value of vision in R has to be increased. The dashed

line illustrates the value of 0.5 in the vision membership. Similarly, the dotted line indicates the

value of 0.5 in the 'both' membership. 'bi' for vision is 1 (or -1) and for both is 0. The positive









or negative values only indicate whether the decision is vision or ladar. Only the absolute values

are used to update R. ∫μi is the area under the dashed line and dotted line for each. Therefore,

u_crisp = [(1)(1.375) + (0)(0.45)] / (1.375 + 0.45) ≈ 0.75

This crisp value is the new value of covariance for vision in determining the path error.

Although the crisp value is positive, since the decision was between vision and both, the crisp

value is used to update vision only. The range of crisp values for the decision was chosen by

performing simulations and determining what values provide good results. Using the Fuzzy logic

supervisor described above, the suitable sensor at each location is chosen and the updates are

made in R.
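The defuzzification arithmetic of this example can be captured in a few lines of C++. This is a sketch of the center of gravity computation only, with the peak centers and clipped areas of the worked example hard-coded as inputs.

    #include <cstdio>

    // One fired output membership function: b is the center where the function
    // peaks, area is the area under the function clipped at the fired value.
    struct FiredOutput { double b; double area; };

    // Center of gravity (Eq. 3-30): u_crisp = sum(b_i * area_i) / sum(area_i).
    double centerOfGravity(const FiredOutput* fired, int n)
    {
        double num = 0.0, den = 0.0;
        for (int i = 0; i < n; ++i) {
            num += fired[i].b * fired[i].area;
            den += fired[i].area;
        }
        return (den != 0.0) ? num / den : 0.0;
    }

    int main()
    {
        // Worked example: |b| = 1 for the vision function, b = 0 for 'both',
        // with the areas under the functions clipped at the value 0.5.
        FiredOutput fired[2] = { { 1.0, 1.375 }, { 0.0, 0.45 } };
        std::printf("u_crisp = %.2f\n", centerOfGravity(fired, 2)); // about 0.75
        return 0;
    }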

Divergence detection and fuzzy logic correction

The use of a Kalman filter with fixed parameters has a drawback. Divergence of the

estimates is a problem which is sometimes encountered with a Kalman filter, wherein the filter

continually tries to fit a wrong process. Divergence occurs due to the inaccuracies in the

covariance values specified in the model. Divergence may be attributed to system modeling

errors, noise variances, ignored bias and computational round off errors (Fitzgerald, 1971).

Often, a white noise model is assumed for the process, while implementing the Kalman filter.

White noise models are not true models of the process for autonomous guidance applications; they are often used to simplify the implementation. If a white noise model is assumed for the process,

divergence, if encountered, is difficult to correct. Mathematical methods such as altering the

rank of the covariance matrix can be used to reduce divergence. Abdelnour et al. (1993) used

fuzzy logic in detecting and correcting the divergence. Sasiadek and Wang (1999) used fuzzy

logic with an extended Kalman filter to tackle the problem of divergence for an autonomous

ground vehicle. They presented simulation results of their approach. The extended Kalman filter










reduced the position and velocity error when the filter diverged. The use of fuzzy logic also allowed a lower order state model to be used. For this research, a fuzzy logic based method is used with Kalman filtering to overcome divergence, as it is a proven method for divergence correction in the previous research mentioned above. It was chosen over mathematical methods of correction,

as it was felt that rigorous mathematical analysis of the filter and model was beyond the scope of

this dissertation.

A common method to detect divergence is using the innovation vector. The innovation

vector z'(k) (Klein, 1999) is defined as the difference between the measured vector and the

estimation of the measurement vector as given by the equation

z'(k) = z(k) − H x̂(k) (Eq. 3-31)

The innovation vector is also called residual. For an optimum Kalman filter without

divergence, the innovation vector is a zero mean Gaussian white noise (Abdelnour et al., 1993).

Therefore, the performance of the Kalman filter can be monitored, using the value of the

innovation vector. A sample mean from 10 successive values of z' was used for detecting

divergence. This sample corresponds to a time interval of approximately 300 ms. At the vehicle

operating speeds of 3.1 m/s, this sample sequence was expected to be sufficient. However,

initially, while starting the vehicle in autonomous mode, the required samples may not be

available. Simulation results for the chosen sample size were found to be satisfactory. The

analysis of the variation in performance with different sample sizes was not performed.

Deviation of z' from zero by more than 5% was taken as the indication of reduction in

performance of the filter leading towards divergence.
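A sliding-window monitor of this kind can be sketched in C++ as follows; the interpretation of the 5% deviation as a fixed threshold on the window mean is an assumption of the sketch.

    #include <cmath>
    #include <cstddef>
    #include <deque>
    #include <numeric>

    // Divergence monitor for one component of the innovation vector z'.
    // A mean over the last 10 samples (about 300 ms of data) that deviates
    // from zero by more than the threshold flags possible divergence.
    class DivergenceMonitor {
    public:
        explicit DivergenceMonitor(std::size_t window = 10, double threshold = 0.05)
            : window_(window), threshold_(threshold) {}

        // Add the newest innovation sample; returns true if divergence is suspected.
        bool update(double innovation)
        {
            samples_.push_back(innovation);
            if (samples_.size() > window_) samples_.pop_front();
            if (samples_.size() < window_) return false;   // not enough data yet
            double mean = std::accumulate(samples_.begin(), samples_.end(), 0.0)
                          / static_cast<double>(samples_.size());
            return std::fabs(mean) > threshold_;
        }

    private:
        std::size_t window_;
        double threshold_;
        std::deque<double> samples_;
    };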

Divergence can be corrected by updating the process noise covariance matrix Q, based on

the innovation vector. Fuzzy logic (Passino and Yurkovich ,1998; Subramanian et al., 2005) was










used to perform this update for two parameters of Q, namely q_x and q_θR. The fuzzy logic system

is shown in the Figure 3-61.


[Figure: block diagram in which the innovations pass through a fuzzifier, an inference engine with a rule base, and a defuzzifier to produce the updates to Q]

Figure 3-61. Fuzzy logic implementation to correct divergence

The x_C, x_L and θ_C parameters of the innovation vector z' are the inputs to the fuzzy logic

system. Among the various parameters used in the filter, these are the three parameters whose

process noise is less precisely known as they depend entirely on the algorithm developed. The

other two parameters of heading from IMU and the velocity are obtained directly from the

sensors and so their process noise is not expected to change significantly during operation. The

outputs of the fuzzy logic system are the updates to Q, namely q_x and q_θR, which correspond to

path error and required heading.


[Figure: input membership functions Negative and Positive over the innovation range of -15% to 15%]

Figure 3-62. Input membership function

The Figure 3-62 shows the input membership function. The ranges of the input are from -

15% to 15% of the values of the innovation vector change from zero. These maximum and

minimum values were chosen by trial and error by performing simulation. The inputs are

converted to linguistic variables negative and positive as shown. These linguistic variables

indicate the nature of the input which is either negative or positive. Normalization of the










linguistic variables was performed at a scale from 0 to 1. A set of if-then rules were developed

by experimentation. Each rule received a corresponding normalized membership value from the

input membership function and thereby the linguistic variable. Based on the linguistic variables

chosen from each z', the corresponding rules are fired. The rule set is shown in the Table 3-5.

Table 3-5. Fuzzy logic rule set for divergence correction
z'_x        z'_θC       q_x        q_θR
Negative    Negative    Zero       Zero
Negative    Positive    Zero       Positive
Positive    Negative    Positive   Zero
Positive    Positive    Positive   Positive



For example, consider the second row in the Table 3-5 where z'_x is negative and z'_θC is positive: this indicates that the error in position has reduced compared to the previous time step but the vehicle is heading in the wrong direction; then the process noise for position q_x is zero and the process noise for heading q_θR is positive. In the rule set, a single measurement value for

position, z'_x, is given instead of the two different values, z'_xC and z'_xL. This is because the

sensor supervisor described earlier, picks one of the sensors for use in the filter when the other is

unreasonable. In the cases where one sensor is given higher priority, the position measurement of

that sensor alone is used in the rule set and in the cases where both sensors are selected by the

supervisor, the position measurement of ladar alone is used, as it has lower measurement noise.

This reduction was performed to reduce the complexity of the rule set.
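Because each input only takes the linguistic values negative and positive, the rule set of Table 3-5 reduces to a simple sign test, as the following C++ sketch shows; the crisp magnitudes QX_POS and QTHETA_POS are placeholders standing in for the defuzzified values obtained from the output membership functions.

    // Rule set of Table 3-5 mapped directly to code: the signs of the position
    // innovation z'_x and heading innovation z'_thetaC select the corrections
    // applied to the process noise terms q_x and q_thetaR.
    struct QUpdate { double qx; double qThetaR; };

    // Placeholder crisp magnitudes; the output membership functions span
    // 0 to 4 for q_x and 0 to 0.1 for q_thetaR (Figure 3-63).
    const double QX_POS     = 4.0;
    const double QTHETA_POS = 0.1;

    QUpdate divergenceCorrection(double zx, double zThetaC)
    {
        QUpdate u = { 0.0, 0.0 };                   // both corrections "Zero"
        if (zx > 0.0)      u.qx      = QX_POS;      // position innovation growing
        if (zThetaC > 0.0) u.qThetaR = QTHETA_POS;  // heading innovation growing
        return u;
    }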

















[Figure: output membership functions for q(x, θR); q_x ranges from 0 to 4 and q_θR from 0 to 0.1]

Figure 3-63. Output membership function

From the rules fired for a given set of data, the output membership function is chosen. The

membership function is shown in the Figure 3-63. The membership value was used to find the

crisp value for the output. Defuzzification was performed using the center of gravity method

(Passino and Yurkovich, 1998). The crisp values of the process noise covariance are then

updated in the process noise covariance matrix Q. The range of crisp values was chosen by trial

and error by performing simulations and determining what values provide good results.

For the parameters q_θC and q_v, a linear relationship was followed to update these covariances, such that q_θC and q_v were increased in proportion to the innovation values z' of θ_C and v respectively. This was performed independently of the fuzzy logic correction.

Sample simulation result of the sensor fusion method

The developed fuzzy logic enhanced Kalman filter algorithm was simulated in MATLAB

to check the functionality and accuracy. The simulations also aided in tuning the parameters of

the filter to improve the performance. The data required for simulation was obtained by manually

driving the vehicle through a citrus grove alleyway and collecting all the sensor information in a

text file. The data was then imported into MATLAB to run the simulation. A program was written

to simulate the filter. For the simulation, the data from the sensors collected at each instant was

input to the program one set at a time to mimic the time varying nature of the data. The output of

the simulation was the path error of the vehicle obtained from the fusion method by simulation.










This path error was compared with the path error calculated by the vision and ladar algorithms.

All three data sets were plotted. The Figure 3-64 shows one case of the simulated result of fusing

the error obtained from vision and ladar algorithms. The error is the distance between the center

of the vehicle in the path and the path center.







[Figure: plot of path error against distance traveled (m) for the vision, ladar and fused estimates]

Figure 3-64. Simulation result of fusion of errors obtained from vision and ladar algorithms

The Figure 3-64 consists of three different plots superimposed for comparison. The plots are of vision based path error, ladar based path error and simulated fusion based path error. For clarity in comparing the plots, only a section of the path from a distance of 16.2m to 16.8m of travel is shown. It is observed that from 16.2m to 16.35m, only vision provides a reasonable estimate of the path error. For the last 25cm of the section, from 16.55m to 16.8m, both vision and ladar determine the same amount of path error. Comparing the vision based path error, the ladar based path error and the simulated fusion result shows the improvement in path tracking using fusion based guidance as compared with vision or ladar based guidance.









Headland Turning and Navigation

The development of autonomous vehicle guidance systems using machine vision and laser

radar for navigating citrus grove alleyways using independent sensors and using fusion of the

sensors was discussed in previous sections. This section discusses the methods developed for

navigating the headlands of citrus groves.

The headland in a citrus grove is the space available at the end of each tree row. This space is

generally used for turning the vehicles after navigating the rows or alleyways. Headlands are

characterized by a ground space of at least 4.25m after the last tree in each row. The ground is

usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of

the grove or there could be more open ground or some combination of both. Complete navigation

of a citrus grove requires the vehicle to turn at headlands in addition to navigating the alleyways.

The vehicle may not perform any task on the citrus trees but the time spent in navigating the

headland is important for the overall operation. Miller et al. (2004) have reported that the space

required to turn at the headland and the time spent in turning significantly affect the field

efficiency. If a lot of space is used by the vehicle at the headland for turning, the useful tree

space is reduced. This reduces the overall yield of the grove. If a lot of time is spent turning at the headland, the efficiency of operation of the autonomous vehicle is reduced. Therefore, time

and space are important factors for executing headland turning maneuvers.

The vehicle used for developing the headland turning algorithms was the John Deere

eGator. The sensors used for developing the headland navigation algorithms are the camera, the

ladar, the IMU and the wheel encoder. The details of these sensors and their mounting location

were described earlier. The camera is mounted looking down at an angle of 10 degrees with the

horizontal and the ladar is mounted looking down at an angle of 15 degrees with the horizontal.

The camera, ladar and the IMU are interfaced with the 2.4GHz CPU running Windows operating










system. This is the same computer that was used on the tractor. All high level algorithms for

processing guidance sensor information and guidance algorithms are implemented in this

shoebox computer. The speed control and the steering control are implemented in the PC 104

computer described earlier. The steering control system used in the autonomous steering of the

tractor was the PID control, in which lateral position error of the vehicle was given as input to

the controller and steering was suitably controlled to minimize the error. In alleyway navigation,

the specification of the lateral position of the vehicle in the path can be described as error of the

vehicle from the path center. For headland navigation, there is no specified path that the vehicle

must take. Therefore, it is not practical to specify a path error in all conditions. Also, for the development of classical controllers such as PID, a mathematical model of the system is useful

for tuning the controller parameters. For the tractor, this model was experimentally determined.

Development of a similar model for the eGator is time consuming. Therefore, a new type of

steering controller, Pure Pursuit steering, was explored, which does not require error specification or a mathematical model.

Pure pursuit steering

The Pure Pursuit steering mechanism has been used for many years in the Carnegie Mellon NavLab autonomous vehicle projects and has been reported to be robust. The method is simple, intuitive and easy to implement. It is analogous to human driving in that humans look a certain distance ahead of the vehicle and steer such that the vehicle would reach a point at the look-ahead distance. Pure Pursuit is a method for geometrically calculating the arc necessary for getting a vehicle onto a

path (Coulter, 1992). Consider the illustration of the vehicle with the X-Y coordinate system

shown in the Figure 3-65.










[Figure: vehicle coordinate system with a goal point (p, q) and the arc joining the origin to it]

Figure 3-65. Pure Pursuit steering method

Let the origin of the coordinate system be the center of the vehicle. Let (p, q) be a point

some distance ahead of the vehicle. Let an arc join the origin and the point (p, q). Let L be the length of the chord of the arc connecting the origin to the point (p, q) and let R be the radius of curvature of the arc. Let R be expressed as a sum of the distance p and some distance D.

R = D + p  =>  D = R − p (Eq. 3-32)

L, p and q form a right triangle. Using the Pythagorean theorem for a right triangle,

p² + q² = L² (Eq. 3-33)

q, D and R form a right triangle. Using the Pythagorean theorem,

q² + D² = R² (Eq. 3-34)

=> q² + (R − p)² = R²

Simplifying this equation,

R = L²/(2p) = p/2 + q²/(2p) (Eq. 3-35)

As this method is analogous to human driving, we can choose a predetermined look-ahead

distance 'p'. For this project, the look-ahead distance was chosen to be 2.89m from the front of










the vehicle. For alleyway navigation, the error of the vehicle from the path center is 'q' at the

look-ahead distance 'p'. For non-alleyway navigation, 'q' can be the required lateral movement of

the vehicle at the look-ahead distance. This value of 'q' is obtained from the machine vision and

ladar algorithms for path segmentation. Using these values of 'p' and 'q', R can be calculated

from the above equation. R is the required radius of curvature of the vehicle to get the vehicle to

the required point. In the CAN based steering system of the eGator, steering has to be specified

as steering angles and not as radius. Therefore, R has to be converted to steering angle.

A calibration experiment was conducted to find a relationship between the radius of

curvature and the steering angle. In the experiment, the eGator was taken to an open area.

Steering angle was kept constant by sending CAN commands to the steering controller. The

vehicle was moved forward by manually stepping on the accelerator. For a constant steering

angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner

wheel radius at different locations. The diameter of the circle was manually measured using a

tape measure. The steering angle was varied to obtain different radii of curvature. The

illustration of the experiment is shown in the Figure 3-66 and the experimental results are shown

in the Table 3-6.





[Figure: vehicle driven in a constant circle with markers placed at the inner wheel radius]

Figure 3-66. Illustration of experiment for steering angle calibration with radius of curvature










Table 3-6. Calibration of radius of curvature of vehicle with steering angle
                        Steering Angle (Degrees)
                   10      15      20      25      30
Radius (m)  Left   10.6    9.9     7.46    5.8     4.4
            Right  7.9     6.4     5.33    4.6     3.8




Experiments were conducted for steering angles from 10 degrees to 30 degrees at 5 degree intervals. The lowest angle chosen was 10 degrees because smaller angles had a much larger radius

that caused the vehicle to travel out of the experimental area. The maximum angle of 30 degrees

is the physical limit of the vehicle's steering system. Experiments were conducted for both left

and right turns. The experimental values were plotted in the chart shown below.


[Figure: radius of curvature plotted against steering angle for left and right turns, with fitted lines y = -0.33x + 14.232 (left) and y = -0.2x + 9.606 (right)]

Figure 3-67. Line fitting for calibrating radius of curvature of vehicle with steering angle










From the chart, linear equations were fit to calibrate steering angle with radius of

curvature. The equations are:

Left turn: Radius = -0.33 × Steering angle + 14.232 (Eq. 3-36)

Right turn: Radius = -0.2 × Steering angle + 9.606 (Eq. 3-37)

Rearranging the terms,

Left turn: Steering angle = -3.03 × Radius + 43.12 (Eq. 3-38)

Right turn: Steering angle = -5 × Radius + 48.03 (Eq. 3-39)

Using these calibration equations, the steering angle required for various radii of

curvature was determined. The steering angles were sent to the steering controller. It is observed

that the minimum radius of curvature of the vehicle was 4m. This minimum value limits the

smallest turn that the vehicle can execute in the headlands. The flowchart of the Pure Pursuit

steering algorithm is shown in the Figure 3-68.


Obtain required vehicle offset from vision or ladar → Use look-ahead distance and offset to calculate radius of curvature → Use calibration equations to determine steering angle from radius of curvature → Send steering angle commands to steering controller

Figure 3-68. Flowchart of Pure Pursuit steering algorithm
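The flowchart can be summarized in a short C++ sketch. The symbol naming follows the standard form of the Pure Pursuit geometry (radius from the look-ahead distance and the lateral offset of the goal point), and the clamping limits and the near-zero offset threshold are assumptions based on the calibrated range of 10 to 30 degrees, not constants from the actual implementation.

    #include <algorithm>
    #include <cmath>

    const double LOOK_AHEAD_M = 2.89;  // look-ahead distance used in this work

    // Radius of curvature of the arc reaching a goal point with the given
    // lateral offset at the look-ahead distance (cf. Eq. 3-35).
    double radiusFromOffset(double lateralOffsetM)
    {
        double L2 = LOOK_AHEAD_M * LOOK_AHEAD_M + lateralOffsetM * lateralOffsetM;
        return L2 / (2.0 * std::fabs(lateralOffsetM));
    }

    // Calibration lines (Eq. 3-38 and Eq. 3-39) rearranged for steering angle.
    // The calibration covers 10 to 30 degrees; clamping outside that range is
    // an assumption of this sketch.
    double steeringAngleDeg(double radiusM, bool turnLeft)
    {
        double angle = turnLeft ? (-3.03 * radiusM + 43.12)
                                : (-5.0  * radiusM + 48.03);
        return std::clamp(angle, 0.0, 30.0);
    }

    // Full step: offset from vision or ladar to a signed steering command.
    double steerToOffset(double lateralOffsetM)
    {
        if (std::fabs(lateralOffsetM) < 0.05)  // essentially on path: go straight
            return 0.0;
        bool left = lateralOffsetM < 0.0;
        double a = steeringAngleDeg(radiusFromOffset(lateralOffsetM), left);
        return left ? -a : a;                  // sign encodes the turn direction
    }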










Headland turning maneuvers

As mentioned before, the amount of time spent and the area required by the vehicle to turn

at the headlands is important. A headland turning maneuver can be performed as a series of the

following steps: 1) Determining the approach of the end of current row while navigating the

alleyway. 2) Precisely establishing the end of the row. 3) Turning into the headland. 4) Turning

into the next row. 5) Getting to a position to navigate the next row.

Determining the approach of the headland. The first step of determining the approach of

the headland while navigating the alleyway is performed using machine vision. Consider the

Figure 3-69 which shows the image obtained from the camera when the vehicle was in the

middle of the alleyway A) and the image obtained from the camera when the vehicle was close

to the end of the alleyway B) and C).


[Figure: camera images A) in the middle of the alleyway, B) and C) close to the end of the alleyway]

Figure 3-69. Image obtained from the camera when the vehicle is navigating the alleyway









The segmentation algorithm originally used for path determination, when applied for headland navigation, segments the entire image into trees and pathway. When the segmentation algorithm

is run on the above images, the resulting images are as shown below.


























[Figure: segmentation results for images A), B) and C)]

Figure 3-70. Segmented images

It is observed from the segmented images that when the vehicle is close to the end of the

path, the upper half of the segmented image is predominantly either white or black depending on

whether trees are absent or present in the headland. For the case when the vehicle is in the

middle of the alleyway, the upper half of the image is both black and white without any color

being predominant. This difference in the upper half of the image for close to headland cases and

middle of alleyway cases is used in the vision algorithm.

The headland determination vision algorithm uses two fixed boxes, large/green and

small/red as shown in the Figure 3-71.






























[Figure: segmentation boxes overlaid on camera images A), B) and C)]

Figure 3-71. Vision algorithm example for headland detection

The two angular lines starting from the lower corners of the image and extending to the

corners of the lower box indicate the path boundaries as obtained from path segmentation

algorithm. For the case in the middle of the alleyway, the smaller box falls on the pathway

whereas the bigger box encompasses random objects including the headland (A of Figure 3-71).

When the vehicle is close to the headland, the two boxes fall on headland objects (B and C of

Figure 3-71). As observed in the images, the headland often consists of either trees or open

space. The use of the boxes is shown in the segmented images below.























[Figure: segmented images with the two detection boxes overlaid]

Figure 3-72. Use of boxes in the segmented images

Close to headland, the pixels in the two boxes are both classified as either trees or pathway

whereas in the middle of the alleyway case, the two boxes are classified differently. Based on

this difference in the boxes, the algorithm determines the approach of the headland.

By trial and error, it was found that using this vision based method of detecting the

approach to the headland, the headland is always detected when the vehicle is at a distance of approximately 29m to 31m from the headland. As observed, this distance is not

accurate and the variation is from row to row. This is however a good estimate of the distance to

the headland. Once the approach of the headland has been detected by the vision algorithm, this

distance is used to dead reckon the vehicle through the rest of the alleyway until the headland has been

reached. A flag is sent to the PC104 computer indicating that the headland has been detected. In

the PC104 computer, dead reckoning is performed using the information from the encoder. The

equation for the dead reckoning process is given as

Dc = Dc + de (Eq. 3-40)

where Dc is the current distance and de is the change in the distance measured by the

encoder.

Also, when the headland is detected using the machine vision algorithm, the priority of

ladar in the sensor fusion process is increased and the priority for vision is reduced to zero. This









is because, close to the headland, the image from the camera primarily consists of headland

objects. This is also due to the long field of view of the camera. The vision algorithm

development was based on segmenting trees and path. Close to headlands, this segmentation is

not feasible as the alleyway trees are out of the camera's field of view. On the other hand, ladar

with the short field of view is able to detect the last few trees in the alleyway and perform the

navigation. Therefore, navigation is switched entirely to ladar.

Precisely establishing the end of the row. As mentioned earlier, the vision based

headland detection gives an estimate of where the headland is located and the dead reckoning

allows the vehicle to get close to the headland. A midsize citrus tree with some leafy canopy has

a width of a minimum of 2m. The estimated distance using vision followed by dead reckoning

allows the vehicle to get to a position such that the last tree is to the side of the vehicle. Once the

dead reckoning is completed and the vehicle has reached the estimated distance near the last tree,

the vehicle is moved further forward while the ladar data is monitored. This progress is

continued until the ladar algorithm does not detect any trees on either side. When this happens,

the vehicle is assumed to have crossed the last tree. This ladar monitoring ensures that the

vehicle crosses the last tree of the alleyway.

Drawback of the vision based headland detection. In the vision based headland

approach detection algorithm described earlier, the assumption was that the headland consists

either of uniform headland trees or open space. The caveat with the vision based headland

detection is that the headland is not always composed entirely of trees or open space. A sample

image is shown in the Figure 3-73.





















[Figure: camera image of a headland containing a single tree and an artificial object]

Figure 3-73. Image where the vision based headland detection can fail

In the above image, a single tree is present in the headland along with an artificial object.

In such a case, the vision based algorithm fails to precisely detect the headland. Presence of

random objects makes the vision algorithm less accurate at determining the distance to the

headland. In the previously described method, the vision based headland detection with dead

reckoning helped the vehicle reach the last tree of the alleyway. With the failure modes of the

vision based detection, the estimate of the headland may not be enough to reach the last tree or

may take the vehicle beyond the last tree. To overcome this problem, a ladar based algorithm is

described as follows.

Precisely establishing end of the row using sweeping ladar. The currently used mount

angle of the ladar is such that it is able to detect only the closest tree to the vehicle. It was

proposed that if the ladar is rotated through a range of angles, many more trees could be detected

in front of the vehicle. The precise distance measuring capability of the ladar would help in

establishing the end of the row. In this method, a digital servo motor (HiTec HSR-5995TG) was

custom-fitted to the ladar. A custom designed power supply regulator was built to power

the motor. A PIC microcontroller based serial communication interface was also custom built to

interface with the motor. The motor-ladar mount is shown in the Figure 3-74.





















[Figure: two views of the ladar and servo motor assembly]

Figure 3-74. Ladar and motor assembly for sweeping. A) view 1. B) view 2.

The motor was attached to the ladar body such that when the motor shaft rotated, the ladar

was oscillated up and down about the horizontal plane. The ladar was swept from an angle of

15deg. above the horizontal to 30deg. below the horizontal at a speed of approx. 15deg/s (Figure

3-75). The range of angles was chosen such that the ladar sweep covered only the tree locations.

Angles beyond 15 degrees above the horizontal resulted in the ladar sweeping over the trees and

the angles beyond 30 degrees below the horizontal resulted in the ladar sweeping the ground.

The angles were also limited by the range of the mechanism linking the motor to the ladar. The

speed of rotation was a limit of the servo motor.








[Figure: ladar swept from 15 degrees above to 30 degrees below the horizontal]










Figure 3-75. Illustration of the ladar sweep









As mentioned earlier, the ladar refresh rate is approximately 38Hz. From the motor sweep

speed and the ladar refresh rate, the data available per degree is approximately

38 Hz / 15 deg/s ≈ 2.53 ≈ 2 full scans per degree

Therefore, a ladar scan is obtained every half a degree at this sweep rate of the motor. In

this sweeping ladar algorithm, the motor is allowed to execute one sweep of the ladar from up to

down. The entire data obtained from the ladar during this period is stored in an array. Using the

calculations mentioned above, the total number of scans of the ladar is then matched with 0.5

degree positions of the sweep angle. One sweep of the ladar by the motor at 15deg/s takes

approximately 5s. Synchronization of the ladar and sweep angle consumes a longer time. That is,

if the motor was stepped in 0.5 degree increments and the system then allowed to acquire a full

ladar scan at each angle, the time consumed to execute one full sweep would be much higher.

The overall allocation of scans to sweep angles eliminates the need to synchronize the sweep

angle positions with the ladar scan and thereby saves time. Also, it was experimentally observed

that the error introduced due to 0.5 degree allocation of scans is not significant to warrant

synchronization.
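A possible C++ sketch of this post-hoc allocation is shown below; the proportional mapping of scan indices onto the 0.5 degree grid is an assumption, since the original allocation scheme is not spelled out beyond the counts above.

    #include <cstddef>
    #include <vector>

    // Assign sweep angles to the buffered scans of one sweep: the sweep runs
    // from +15 degrees to -30 degrees and each stored scan is mapped onto the
    // nearest 0.5 degree position, avoiding motor/ladar synchronization.
    std::vector<double> assignSweepAngles(std::size_t scanCount)
    {
        const double start = 15.0;
        const double step  = -0.5;
        const std::size_t slots = 91;          // (15 - (-30)) / 0.5 + 1 positions
        std::vector<double> angles(scanCount);
        for (std::size_t i = 0; i < scanCount; ++i) {
            std::size_t slot = (scanCount > 1)
                ? i * (slots - 1) / (scanCount - 1) : 0;
            angles[i] = start + step * static_cast<double>(slot);
        }
        return angles;
    }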

As mentioned earlier, the data obtained from the ladar is in polar coordinates with the ladar

being the origin. For this algorithm, it is intuitive to use data in Cartesian coordinates. Consider

the coordinate system shown in the Figure 3-76.

Let P be a data point at a radius R from the origin subtending angles m and n with the z

and y axis respectively. To convert this to the Cartesian coordinate system, the following

equations were used.


Px = −R Cos(n) Cos(m) (Eq. 3-41)

Py = R Sin(n) Cos(m) (Eq. 3-42)

Pz = R Sin(n) Sin(m) (Eq. 3-43)


[Figure: point P at radius R with angles m and n to the z and y axes]

Figure 3-76. Coordinate systems

(Px, Py, Pz) is the data point in the Cartesian coordinate system. The data obtained by the

scan is then filtered to keep the relevant data. It is known that the lateral distance of trees in the

row, from the ladar, can range from 0.5m to a maximum of 5m. The distance from the ladar to the vehicle's edge is 0.5m, so a tree limb is not expected to be closer than 0.5m during normal operation. Distances longer than 5m correspond to trees located in adjacent rows. Therefore, the

filter was implemented such that any data beyond 5m and data less than 0.5m were removed.
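A compact C++ sketch of the conversion and filtering steps is given below; filtering on the magnitude of the lateral (x) coordinate is how the 0.5m to 5m limits are interpreted here, which is an assumption of the sketch.

    #include <cmath>
    #include <vector>

    struct Point3 { double x, y, z; };

    // Convert one sweeping-ladar reading to Cartesian coordinates using
    // Eq. 3-41 to Eq. 3-43: R is the radial distance, m the sweep (tilt)
    // angle and n the in-plane scan angle, both in radians.
    Point3 toCartesian(double R, double m, double n)
    {
        Point3 p;
        p.x = -R * std::cos(n) * std::cos(m);
        p.y =  R * std::sin(n) * std::cos(m);
        p.z =  R * std::sin(n) * std::sin(m);
        return p;
    }

    // Keep only points whose lateral distance can belong to trees in the
    // current row: closer than 0.5m or farther than 5m is discarded.
    std::vector<Point3> filterSweep(const std::vector<double>& radii,
                                    const std::vector<double>& sweepAngles,
                                    const std::vector<double>& scanAngles)
    {
        std::vector<Point3> kept;
        for (std::size_t i = 0; i < radii.size(); ++i) {
            Point3 p = toCartesian(radii[i], sweepAngles[i], scanAngles[i]);
            double lateral = std::fabs(p.x);
            if (lateral < 0.5 || lateral > 5.0) continue;
            kept.push_back(p);
        }
        return kept;
    }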

After filtering, a sample plot of the data in the Cartesian coordinates is shown in the Figure 3-77.
































[Figure: 3-dimensional plot of the sweep ladar data in Cartesian coordinates]

Figure 3-77. Sweep ladar data in Cartesian coordinates

It is observed from the figure that the data corresponding to different trees are clustered; some trees contribute very few data points and cannot be clearly detected.






An algorithm was developed to determine the tree locations from the data. The Figure 3-77

shows the data in 3-dimensional space. The search space for the algorithm, when analyzing data

in 3-dimensions is large. This results in the consumption of a large amount of computer

processing time. For real-time precise headland detection, this is significant. The aim is to locate

the trees with respect to the vehicle in the x-y plane. The location of the data in the vertical plane

is not important. Without loss of useful information, the 3-dimensional data is projected on to the horizontal plane. The projected data is shown in the Figure 3-78.











[Figure: 2-dimensional scatter plot of the projected ladar data in the horizontal plane]

Figure 3-78. 3-dimensional ladar data projected on the horizontal plane

It should be noted that multiple points at a location with different z values are overlapped

in the Figure 3-78. The data space consists of 1cm x 1cm bins. To find the location of the trees,

a clustering algorithm is used. The clustering algorithm is as follows:

* Scan the data space using a 50cm x 50cm window in 50cm increments

* If the window in a location consists of more than 5 data points, note the location of the
window

* The location indicates the presence of a tree or a part of a tree

If two consecutive tree containing locations differ by more than 250cm, the difference is

due to the presence of headland. It is then determined that the headland starts just after the last

tree location prior to this gap. Based on the location of the trees in the alleyway, the exact

location of the headland is determined. The algorithm and clustering ensure that the headland is














found precisely and the determination is accurate to less than 50cm beyond the last tree in the

alleyway. The result of clustering and headland determination is shown in the Figure 3-79.














[Figure: clustered tree locations along the row with the determined headland position]

Figure 3-79. Tree clustering and headland determination
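The clustering and gap test can be expressed as a short C++ sketch; the grid bookkeeping with a std::map and the scan along the forward (y) direction are assumptions of the sketch, standing in for the actual implementation.

    #include <algorithm>
    #include <cmath>
    #include <map>
    #include <utility>
    #include <vector>

    const double CELL_CM         = 50.0;   // 50cm x 50cm scan window
    const int    MIN_POINTS      = 5;      // more than 5 points marks a tree
    const double HEADLAND_GAP_CM = 250.0;  // larger gap marks the headland

    struct PointXY { double x, y; };       // centimeters, vehicle at the origin

    // Returns the forward (y) position just past the last tree before the gap,
    // or -1 if no headland gap is found in the data.
    double findHeadlandStart(const std::vector<PointXY>& pts)
    {
        // Count points per 50cm cell, keyed by (column, row) indices.
        std::map<std::pair<int, int>, int> cellCount;
        for (const PointXY& p : pts)
            ++cellCount[{ static_cast<int>(std::floor(p.x / CELL_CM)),
                          static_cast<int>(std::floor(p.y / CELL_CM)) }];

        // Forward positions of the cells that contain a tree or part of a tree.
        std::vector<double> treeY;
        for (const auto& cell : cellCount)
            if (cell.second > MIN_POINTS)
                treeY.push_back(cell.first.second * CELL_CM);
        std::sort(treeY.begin(), treeY.end());

        for (std::size_t i = 1; i < treeY.size(); ++i)
            if (treeY[i] - treeY[i - 1] > HEADLAND_GAP_CM)
                return treeY[i - 1] + CELL_CM;  // headland starts past the last tree
        return -1.0;
    }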

Vision and ladar based headland determination. The vision based headland detection

and sweeping ladar based establishment of headland were described above. As mentioned, the

sweeping ladar process takes approximately 5s to execute a sweep. It is not practical to execute

the sweep while navigating the entire row. Therefore, sweep is started only when the vision

based algorithm detects the headland. Then the sweeping ladar algorithm takes over. The dead

reckoning is no longer required. On further tests conducted in the grove using the above

algorithms, it was also found that a stationary ladar mounted at an angle of 10 degrees to the horizontal is able to detect four trees, similar to the sweeping ladar. This mounting angle

increases the operating range of the ladar. This angle however, may not be suitable for close

range obstacle detection. The stationary mount of the ladar also obtains less data compared to the

sweeping ladar. This however does not prevent the precise determination of the headland and is

as accurate as the sweeping ladar. The sweeping ladar is still a useful development for future

operations in this project. Once the headland location is established, the vehicle speed is reduced

to headland navigation speed.










Navigating the headland and entering next row. From the location of the headland

determined by the vision algorithm and the exact location determined by the sweeping ladar, the

turning maneuvers are started when the last tree in the alleyway has been passed. Two turning

maneuvers, namely the u-turn and the switch-back turn, were developed. The determination of which maneuver is to be executed is made using the sweeping ladar algorithm. If an open space of more than 3m is available in the headland, the vehicle proceeds to perform a u-turn, and a switch-back turn otherwise. For the turning maneuvers, algorithms for steering similar to human operations

were developed. For the development of the turning algorithms, the distance between the left

and right tree stems in a row is assumed to be approximately 6m. The U-turn maneuver is

implemented as follows:

1. When the vehicle has crossed the last tree, note the heading of the vehicle using the yaw
angle from the IMU.

2. Turn the steering completely to one end such that the vehicle moves opposite to the direction of the required turn. This step is to make a wide u-turn.

3. Move forward for approximately 0.5m.

4. Turn the steering completely to the other end such that the vehicle turns in the required direction. Allow the vehicle to move forward.

5. Monitor the heading angle. Monitor for obstacles in front of the vehicle using the ladar. If an
obstacle is detected, stop the vehicle.

6. When the vehicle has turned by more than 150 degrees, switch to alleyway navigation mode
using vision and ladar.

7. The vehicle has now just entered the new row. Any error in entering the new row is corrected
using the alleyway navigation algorithm.

The switch-back turning maneuver is implemented as follows:

1. When the vehicle has crossed the last alleyway tree, note the heading of the vehicle using the
yaw angle from the IMU.

2. Move forward a distance of 0.5m. This is done to avoid colliding with any long branches of
the last alleyway tree.










3. Turn the steering wheel to one end such that the vehicle turns in the required direction.

4. Monitor the vehicle's heading and monitor for obstacles in front of the vehicle using the ladar.

5. When the vehicle has turned by an angle of 90 degrees with respect to the initial heading angle before turning, stop the forward motion of the vehicle.

6. Turn the steering to the center position and reverse the vehicle a distance of 0.5m.

7. Turn the steering to the end such that the vehicle turns in the required direction.

8. When the vehicle has turned by more than 150 degrees with respect to the initial heading
angle before turning, switch the algorithm to alleyway navigation mode using vision and
ladar.

9. Any lateral error in entering the new row is corrected using the alleyway navigation
algorithm.

It is to be noted that the turning maneuver algorithms are tuned for a grove with certain

dimensions of alleyway width. It is operable in groves with dimensions that are different by

approximately 1m, but the algorithm is not suitable for groves that differ more. It is difficult to navigate headlands using vision and ladar, as headlands can consist of random objects and the vision and ladar algorithms could not be modified for all possibilities. During tests in a grove in Immokalee, Florida, it was found that the grove includes a large drainage pit on the ground at the end of each row. In such a case, the turning maneuvers failed as the vehicle would end in the pit. Therefore, more

complex ladar based algorithms may be necessary to adapt the vehicle to such possibilities.

Open Field Navigation and Obstacle Detection

In the previous sections, methods were described that allowed the autonomous vehicle to

accurately navigate the alleyways of a citrus grove, detect headlands and turn at the headlands

using u-turn and switch-back turn maneuvers and navigate subsequent alleyways. With these

capabilities, the autonomous vehicle possesses the ability to navigate an entire grove

autonomously. In large citrus producing farms, the groves can be arranged in the form of blocks

separated by open fields. With the previously described capabilities, after the vehicle completes










navigating a grove block, the vehicle has to be manually driven to another grove block. The

vehicle would then autonomously navigate the new grove block. If the vehicle could

autonomously navigate the area between grove blocks, the entire farm could be covered without

human intervention.

When the vehicle navigates the citrus grove alleyways or open fields, obstacles such as

animals, humans or broken branches can be expected. When obstacles are present, the vehicle

must be capable of detecting them and stopping. If possible, the vehicle should also

navigate around the obstacle. Within the grove alleyway, there is not much space to navigate

around obstacles, whereas it might be possible in the open fields. The development of obstacle

avoidance capability is beyond the scope of this project. This section discusses the development

of open field navigation capability to navigate the open field between grove blocks and the

development of obstacle detection capability.

Open field navigation

Open field refers to the open area available between grove blocks. The open field is

expected to contain uneven terrain, random placement of trees and fences, and presence of

objects such as cows. The vision based navigation and ladar based navigation methods described

earlier were developed such that they could segment trees from the ground and navigate between

trees. In open fields, there is no fixed placement of objects. The random nature of objects in open

fields or lack of objects makes it difficult to develop vision or ladar based algorithms for

navigating open fields and reaching a different grove block. A global solution is required to solve

this problem. GPS is an excellent global method for navigating open fields. Therefore, GPS

based navigation for traversing open fields is proposed.

A differential GPS (DGPS) receiver, described earlier, was available in the laboratory from previous research on determining the tractor dynamics. GPS based navigation









has been widely used by several researchers for guiding autonomous vehicles through crop rows. The DGPS receiver available has a manufacturer specified accuracy of approximately 12 inches. The receiver is also capable of operating at 5Hz with a DGPS fix, i.e., when enough satellite signals and the local station correction signal are available for high accuracy operation. For navigation based on GPS, the path that the vehicle must take should be specified as way points a priori. Using

these points, the vehicle, with the aid of the DGPS receiver, can follow the points to reach the

next grove block. The way points for this purpose would be the latitude and longitude

coordinates of way points obtained using the DGPS receiver. When operating autonomously, the

guidance system must be capable of using the latitude and longitude information to steer the

vehicle to the required way point.

The important component of the algorithm consists of determining the steering angle of

the vehicle to get the vehicle from the current location to a required location. The capability to

steer to consecutive points can be implemented in the software. Consider the vehicle located at a

point A with latitude 'latA' and longitude 'lonA'. Similarly, consider a way point B with

coordinates 'latB' and 'lonB'. Suppose the vehicle is required to travel to B from A, it is

necessary to determine the distance between the locations and the direction of location B with

respect to location A. The direction can be specified with the geographic north-south direction as

reference. To determine these, a navigation formula called the Haversine formula was used. The

Haversine formula is an equation to determine great-circle distance between two points on a

sphere. It is an adaptation of the more general spherical trigonometric formulas. To use the

Haversine formula, considering the earth as a sphere, assumptions are important to obtain results

with good accuracy. For example, when the points are situated at a large distance compared to

the radius of curvature of the earth, the formula considering the earth as a sphere gives good










accuracy; when the distance between the points is significantly small compared to the radius of

the earth, the assumption of a flat earth can give accurate results and reduce computation; when

the points are located on either side of hills or mountains, modifications have to be made to

obtain accurate results. For this project, it is assumed that the way points would be located at a much smaller distance compared to the radius of the earth and no significant elevation difference is present between the points. Therefore, the Haversine formula, modified for short distances, was

used. The Figure 3-80 shows the vehicle at A and a way point B where the vehicle must reach.


[Figure: vehicle at point A heading toward a point C, with way point B at distance d and bearing theta]

Figure 3-80. Illustration of vehicle navigating way points

The vehicle is heading in a direction towards a random point C with a heading angle m

with respect to the geographic north. The straight line distance required to reach B is 'd' and a

turn through an angle 'theta+m' is required to reach B. The Haversine formula gives d and theta


d = R p (Eq. 3-44)

where R = radius of curvature of earth = 6378140m

p = 2 atan2( √a, √(1 − a) ) (Eq. 3-45)

a = Sin²(dlat/2) + Cos(latA) Cos(latB) Sin²(dlon/2) (Eq. 3-46)

dlat = latB − latA (Eq. 3-47)

dlon = lonB − lonA (Eq. 3-48)

theta = mod[ atan2( Sin(dlon) Cos(latB), Cos(latA) Sin(latB) − Sin(latA) Cos(latB) Cos(dlon) ), 2π ] (Eq. 3-49)

theta is positive for directions east of the geographic north and increases anti-clockwise

from 0 to 2π in radians. The direction 'theta' is the heading from location A to location B. To

steer the vehicle, it is also necessary to determine the current heading of the vehicle with respect

to the geographic north. This information was obtained from the IMU mounted on the vehicle for

other algorithms. The IMU gives the heading of the vehicle with respect to the geographic north

as 'm'.

The vehicle should be steered such that the angle 'm' is the same as the angle 'theta'. It is

to be noted that the DGPS receiver is operating at a rate of 5Hz and the IMU is operating at a

rate of approximately 40Hz. An assumption in developing the algorithm is that the vehicle is

operating at a speed of less than 4m/s. This speed limitation is based on the 5Hz refresh rate of

the DGPS receiver and was determined experimentally by trial and error. As the vehicle is

steered in the proper direction, the distance between A and B also reduces until the vehicle

reaches the location B.
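The distance and heading computation of Eq. 3-44 to Eq. 3-49 can be written directly in C++ as below; taking the inputs in degrees and converting to radians internally is an assumption of the sketch.

    #include <cmath>

    const double EARTH_RADIUS_M = 6378140.0;
    const double PI = 3.14159265358979323846;

    static double deg2rad(double d) { return d * PI / 180.0; }

    // Haversine distance (Eq. 3-44 to Eq. 3-48) and heading (Eq. 3-49) from
    // point A to way point B; latitudes and longitudes given in degrees.
    void haversine(double latA, double lonA, double latB, double lonB,
                   double& distanceM, double& thetaRad)
    {
        double dlat = deg2rad(latB - latA);            // Eq. 3-47
        double dlon = deg2rad(lonB - lonA);            // Eq. 3-48
        double la = deg2rad(latA), lb = deg2rad(latB);

        double a = std::sin(dlat / 2.0) * std::sin(dlat / 2.0)
                 + std::cos(la) * std::cos(lb)
                 * std::sin(dlon / 2.0) * std::sin(dlon / 2.0);       // Eq. 3-46
        double p = 2.0 * std::atan2(std::sqrt(a), std::sqrt(1.0 - a)); // Eq. 3-45
        distanceM = EARTH_RADIUS_M * p;                                // Eq. 3-44

        // Eq. 3-49: heading of B from A relative to geographic north,
        // folded into the range 0 to 2*pi.
        double theta = std::atan2(std::sin(dlon) * std::cos(lb),
                                  std::cos(la) * std::sin(lb)
                                - std::sin(la) * std::cos(lb) * std::cos(dlon));
        thetaRad = std::fmod(theta + 2.0 * PI, 2.0 * PI);
    }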









The steering angle obtained by the above formulas is in radians. In the development of the

pure pursuit steering described earlier, a protocol was used such that the PC sent the error of the

vehicle in pixel coordinates to the PC104 computer through serial communication and the error

was converted to steering angle by the pure pursuit algorithm in the PC104 computer. Therefore,

to use the same protocol, the steering angle obtained above has to be converted to pixel

coordinates before sending to the Pure Pursuit steering controller. The physical range of the

steering angles of the vehicle is approximately -30 degrees to +30 degrees from left to right

respectively with 0 degree as the center.

From the above equations, the steering angle 'theta+m' has a range of 0 to π. It should be noted that the mathematical calculations can provide a range of 0 to 2π. Considering that a turn of 2π is the same as 0 and that, for example, a turn of 1.5π is the same as a turn of 0.5π in the opposite direction, the effective range can be reduced to 0 to π for steering. In

degrees, this is the same as 0 to 180 degrees. This can also be expressed as -90 degrees to +90 degrees.

Comparing the vehicle's physical steering angle range and the required steering angle

range, the conversion is from 0 to 180 degrees with 90 as center to 0 to 60 degrees with 30 as

center (Figure 3-81).

A linear relationship is formed between the ranges such that -30 to +30 degrees of the

required vehicle turn is mapped to -30 degrees to +30 degrees of the physical steering angle of

the vehicle. If this one to one relationship is made, the ranges from -31 degrees to -90 degrees of the vehicle's turn and +31 degrees to +90 degrees of the vehicle's turn should also be mapped.

The range of -90 degrees to -31 degrees of the vehicle turn is mapped to a single value of -30










degrees of steering angle and the range of +31 degrees to +90 degrees of the vehicle turn is mapped to a

single value of +30 degrees of steering angle (Figure 3-81).









[Figure: mapping of the vehicle turn range (-90 to +90 degrees) onto the steering angle range (-30 to +30 degrees)]

Figure 3-81. Angle ranges of steering and vehicle turn

From the previous chapters, it is known that the image width used for vision is 400

pixels. To maintain the communication protocol with the PC104 computer algorithm, the range

of -30 degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates.

Pixel value = 400 (steering angle/60) (Eq. 3-50)

This pixel value of the required steering is sent to the Pure Pursuit steering control.
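A sketch of the complete mapping, including the saturation of turns beyond ±30 degrees and the pixel conversion of Eq. 3-50, is given below; adding the 200-pixel image center so that -30 to +30 degrees spans 0 to 400 pixels is an assumption, since the offset is implicit in the text.

    #include <algorithm>

    // Map the required vehicle turn (degrees, -90..+90 after folding the 0 to
    // 2*pi bearing) onto the physical steering range and then onto the 0..400
    // pixel protocol used with the PC104 computer (cf. Eq. 3-50).
    int turnToPixel(double turnDeg)
    {
        double steer = std::clamp(turnDeg, -30.0, 30.0);  // saturate beyond +/-30
        return static_cast<int>(400.0 * (steer / 60.0) + 200.0); // -30 -> 0, +30 -> 400
    }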

As described in the algorithm above, when suitable steering is executed, the distance

between the vehicle and the way point reduces and the vehicle reaches the way point. It is known

that the accuracy of the DGPS receiver is 12 inches. During initial developments of the

algorithm, the vehicle was determined to have reached the way point, when the distance between

the vehicle and the way point was zero. It was observed that the uncertainties in the accuracy of

the receiver, the terrain variation and small inaccuracies in steering often result in the vehicle

reaching a point very close to the way point, of the order of centimeters away.










In such cases, if the algorithm requires that the distance be zero, to translate the vehicle by a few

centimeters to the way point, the vehicle has to execute circles or spirals to get to the way point

(Figure 3-82).



[Figure: spiral path the vehicle would execute to translate exactly onto the way point]

Figure 3-82. Path taken for translation of the vehicle to the way point

An accuracy of exactly reaching the way point is not practical in the physical system.

Therefore, by trial and error in tests, it was chosen that if the vehicle reaches a point within a

distance of 50cm from the way point, the way point is determined to have been reached by the

vehicle.

Obstacle detection

The obstacle detection capability should allow the vehicle to detect obstacles such as

humans, animals and location-variable objects such as vehicles, while navigating open fields. For

the scope of this dissertation, a simple obstacle detection method using ladar was developed. It is

assumed that the way points for GPS based navigation were chosen such that the vehicle would

not be expected to encounter tree limbs or fixed objects such as thin poles in its path.

For the development of the algorithm for detecting obstacles, the first step was to analyze

the data from the ladar while navigating the open field. As mentioned earlier, the ladar scans a










plane from 0 degrees to 180 degrees and obtains the radial distance to objects in front of the ladar.

The radial data is converted to Cartesian coordinates using the geometrical formulas

Frontal distance = (radial distance)(Sin (scan angle)) (Eq. 3-51)

Lateral distance = (radial distance)(Cos (scan angle)) (Eq. 3-52)

The same formula was also used in the algorithm for ladar based grove alleyway

navigation. The Figure 3-83 shows the plot of the frontal distance obtained from the ladar against

the scan angle when there is no obstacle in front of the vehicle.






[Figure: plot of frontal distance against scan angle with no obstacle; the ground return is at approximately 6m]

Figure 3-83. Ladar scan data with no obstacle in front of the vehicle

The Figure 3-84 shows the scan data from the ladar when an obstacle is present directly in front of the vehicle. From the data, it is observed that the distance for most of the angles in the ladar data is identical to the no-obstacle scan and corresponds to open field. However, at scan angles of 85 degrees to 95 degrees,







































the frontal distance drops down to approximately 450cm from the ground data which was


approximately 6m in the previous scan shown above. This large change in data corresponds to


the presence of an obstacle in front of the vehicle. This can also be interpreted as the obstacle


cutting the laser to the ground, thereby showing a smaller distance (Figure 3-85).


[Figure: plot of frontal distance against scan angle (0 to 180 degrees) with the drop caused by the obstacle near 90 degrees]

Figure 3-84. Ladar scan data with obstacle present in front of the vehicle


[Figure: obstacle intercepting the laser beam before it reaches the ground, shortening the 6m ground return]


Figure 3-85. Illustration of the obstacle detection using ladar


The obstacle detection algorithm is based on the analysis of the data described above.


The algorithm is as follows:


* Prior to operation in autonomous mode, when the vehicle is stationary, obtain the ladar
scan radial distance corresponding to 90 degrees. Record this distance as the ground
distance from the ladar.










* Operate the vehicle in autonomous open field navigation mode.

* Monitor the ladar data continuously and note sudden drops of more than 50cm from the
ground distance.

* If a drop is encountered at an angle, slide a data window of length 5 data points through the
5 subsequent data points.

* If all of these data points also drop by more than 50cm from the ground distance, determine the
presence of an obstacle.

* If an obstacle is determined, stop the vehicle.

* Continue monitoring the scan data for the presence of obstacle.

* If no obstacle is found, continue navigating using GPS based navigation.
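These steps translate into a short per-scan check, sketched below in C++; using a single ground distance for every scan angle is a simplification of the sketch, whereas the steps above record the ground distance at the 90 degree angle.

    #include <vector>

    const double DROP_CM = 50.0;  // drop larger than this suggests an obstacle
    const int    WINDOW  = 5;     // 5 consecutive points, about 3 degrees

    // Obstacle check over one ladar scan. 'scan' holds the frontal distance
    // (cm) per scan angle and 'ground' the ground distance recorded at
    // start-up; a drop of more than DROP_CM sustained over WINDOW consecutive
    // readings is treated as an obstacle.
    bool obstaclePresent(const std::vector<double>& scan, double ground)
    {
        int run = 0;
        for (double d : scan) {
            run = (ground - d > DROP_CM) ? run + 1 : 0;
            if (run >= WINDOW) return true;   // sustained drop: obstacle found
        }
        return false;
    }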

The stop signal is sent to the PC104 computer as a special stop command to reduce the

speed to zero. No change is made to the steering control. The algorithm uses a data window of 5

data points to scan through the ladar data set to detect obstacles. This corresponds to an angle of

3 degrees. It was determined experimentally by trial and error that an angle of 3 degrees

translates to sufficient width to determine the presence of an obstacle such as a human. Wider

obstacles such as cows are also expected to be detected. The threshold distance of 50cm was

chosen so that small terrain elevation changes are not detected as obstacles. The angular width of

3 degrees may not be sufficient to detect objects such as poles. This however is useful in

preventing the detection of tall grass as an obstacle. The flowchart of the algorithm is shown in the

Figure 3-86.


















































[Figure: flowchart of the ladar based obstacle detection algorithm]

Figure 3-86. Flowchart of the obstacle detection algorithm









CHAPTER 4
SOFTWARE DEVELOPMENT

Introduction

This chapter discusses the software development part of the research. The architecture of

the software and the interaction between the computers is discussed. The high level programs

implementing the guidance algorithms were written in the Microsoft Visual Studio.Net 2003

environment using the Microsoft Foundation Class (MFC) for the Graphical User Interface

(GUI). The environment of the low level software will be discussed in the respective sections.

Sections of the code relevant to the discussion are presented in the appendices and are referenced

in this chapter.

Software Architecture for Tractor Guidance

The hardware architecture and the implementation of the guidance system developed for

the tractor was discussed in Chapter 3. The high level algorithms, primarily machine vision

based path segmentation and ladar based path segmentation were implemented in C++ in the

Windows based PC (shoebox). The low level control algorithm, that is, the PID based steering

control was implemented in the microcontroller. The software architecture of the tractor based

guidance system is shown in the Figure 4-1. In addition to the vision and ladar based algorithms

in the shoebox computer, other software for serial communication, camera control capability and

GUI were also implemented. The high level programs from the shoebox communicated with the

low level control in the microcontroller using RS232 serial communication. The vehicle's lateral

error in the path was communicated to the low level controller for steering. The low level

controller did not have an exclusive display monitor. Therefore the data was communicated to

the shoebox using RS-232 for display in the monitor and for the operator to monitor the

operation of the steering controller.











[Figure: shoebox PC connected to the microcontroller over RS-232]


Figure 4-1. Software architecture of the tractor guidance system

The main application software was started up using dialog box based control in the

shoebox computer. When the software is started, a dialog box window opens as shown in Figure

4-2.


[Dialog box: File menu; GPS, LADAR, Accelerometer and Speedometer menus; video display window]

Figure 4-2. Dialog box for high level controls

The dialog box consists of a large window for displaying the video seen using the camera.

There is a menu bar on the top left of the window for accessing the various sensors. The menu

for each sensor includes an additional level of menu for calling each function controlling the

individual sensor. Using the dialog box, each required sensor is started up. The guidance










algorithm starts sending the path error information to the microcontroller as soon as either vision

or ladar algorithm is started up in the dialog box menu. The low level control program in the

microcontroller has to be started using the custom C++ compiler provided with the

microcontroller. As soon as the low level controller is turned on, it starts to poll for error from

the PC and send steer commands to the electro-hydraulic valve. The voltage line between the

microcontroller and the electro-hydraulic valve also has a hardware switch. It should be noted

that the tractor is switched to autonomous mode only when all the required algorithms have been

started up. Stopping the programs is also performed using the dialog box.

PID Based Steering Control

The development of the PID based steering control system was discussed in Chapter 3. The

steering control was implemented in the Tern Microcontroller. The software used was the custom

C++ development environment provided with the microcontroller. The microcontroller was

interfaced with the PC through RS-232 serial communication. The input received by the PID

controller was the lateral position error of the vehicle in the path. The output of the controller is

the voltage sent from the digital to analog converter in the microcontroller. This voltage in turn

is sent to the electro-hydraulic valve through the amplifier. The important sections of this code

are presented in Appendix A. The flow of the program is as follows:

1. The microcontroller is first initialized. Then, initialization is performed for the data, the
decoder, the serial communication from the PC, the analog output from the digital to analog
card, and the PID control variables. The steering is also centered initially.

2. An infinite while loop is started. The exit from the while loop is based on receiving serial
command from the PC.

3. The initial decoder value is obtained corresponding to the center of the steering.

4. Serial data is obtained from the high level vision or ladar algorithms running in the PC. The
data is obtained in the form of four consecutive characters along with a start character.









5. Once the correct start character is obtained, the other 4 characters are analyzed. The first
character provides information about the turn being left or right.

6. The three subsequent characters specify the error information: they are the digits in the
hundreds place, the tens place and the ones place. If the characters obtained are a, b and c,
then the error is 100a+10b+c. Once this reconstruction is performed, the PID control is
started.

7. In the PID control, the error obtained is used to calculate the derivative of the error and the
integral of the error.

8. From these calculations, the gain is calculated as

PID gain G = Kp(error) + Kd(change in error) + Ki(integral of the error) (Eq. 4-1)
where Kp, Kd and Ki are the individual gains mentioned in Chapter 3.

9. This overall gain G is used to obtain the required steering angle change using an
experimentally determined formula to convert the gain to required change in encoder pulses


Change in pulses = 0.5 * Gain * 7 (Eq. 4-2)
where the factor of 7 corresponds to the encoder calibration that 7 pulses correspond to 1
degree turn of the steering.

10. This required turn of the steering in the form of encoder pulses is used to turn the steering.
For the steering, the required turn angle is converted to the required voltage to execute the
turn at a reasonable speed.

11. The voltage value is sent to the digital to analog converter of the microcontroller.

12. Apart from these calculations, saturation limits are used to limit the voltages and steering to
physical limitations of the hardware.

13. When the navigation is complete, the steering is centered and the execution is terminated.
The manual control of steering is again available.
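The control loop above can be summarized in a short sketch. The following minimal C++ illustration covers steps 6 through 12 (Eqs. 4-1 and 4-2); the gain constants and the saturation limit are placeholders, not the tuned values of the actual microcontroller code in Appendix A:

    #include <algorithm>

    // Minimal sketch of the PID steering update (Eqs. 4-1 and 4-2).
    // Kp, Kd, Ki and maxPulses are placeholder values.
    struct PidState {
        double prevError = 0.0;   // error from the previous time step
        double integral  = 0.0;   // running integral of the error
    };

    int steeringPulses(PidState& s, double error)
    {
        const double Kp = 1.0, Kd = 0.1, Ki = 0.01;  // placeholder gains
        const int maxPulses = 280;                   // placeholder hardware limit

        double dError = error - s.prevError;         // derivative of the error
        s.integral   += error;                       // integral of the error
        s.prevError   = error;

        // Eq. 4-1: overall PID gain
        double G = Kp * error + Kd * dError + Ki * s.integral;

        // Eq. 4-2: convert gain to encoder pulses (7 pulses per degree)
        int pulses = static_cast<int>(0.5 * G * 7.0);

        // Step 12: clamp to the physical limits of the steering hardware
        return std::max(-maxPulses, std::min(maxPulses, pulses));
    }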

Vision and Ladar Based Path Segmentation

As mentioned earlier, the vision and ladar based path segmentation algorithms are

implemented in the shoebox computer. The important code segments for these are presented in

Appendix B and Appendix C. In the initial stages of the project, when the vision and ladar

sensors were used independently for navigation, only one of these codes was used for navigation

at a given time. The human operator made this choice by selecting which function was called.









The vision algorithm is implemented in the function Imagproc(). The function is called from the

OnPaint function of Windows. The OnPaint is a screen refresh function. The vision algorithm is

repeatedly called as often as Windows tries to refresh the screen. Therefore, the refresh rate of

the image in the dialog box is dependent on the speed of the vision algorithm. The input to the

vision algorithm is the pointer to the memory location where the image is stored. The output is

the lateral error of the vehicle in the path. The vision program is implemented as follows:

1. The imagproc() function is called. The required variables are initialized.

2. Thresholding is performed for each pixel of the image using the specified threshold values.
Binary image is formed.

3. Regions of the image are scanned using a data window and a determination is made as to
whether the image contains many shadows. If it is a non-shadow image, thresholding is
performed again using new threshold values.

4. Morphological operations are performed to segment the image further and remove salt and
pepper noise left from thresholding.

5. The image is scanned row by row to determine the location of the boundary of path and non-
path regions. The pixel locations are stored in an array. These locations correspond to the
path boundaries.

6. This array is sent to the line fitting function GetBoundary(left) and GetBoundary(right) to fit
lines for the path boundaries.

7. The lines are used to determine the center of the path GetBoundary(middle).

8. The required path to be taken is painted in the image.

9. Using the center of the path and using the center of the image as the vehicle center, the lateral
error of the vehicle in the path is determined. This error is in pixel coordinates.

10. The error is sent to the PID based steering controller.
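As a rough illustration of steps 2, 5 and 9, the sketch below computes the lateral pixel error from a binarized image. The image layout, the per-row midpoint averaging and all names are illustrative stand-ins; the actual imagproc() in Appendix B fits least-squares lines to the boundaries instead:

    #include <cstdint>
    #include <vector>

    // Sketch of the pixel-error computation. 'img' is a binarized image
    // (path = 255, non-path = 0) of size rows x cols.
    int lateralErrorPixels(const std::vector<uint8_t>& img, int rows, int cols)
    {
        long sumCenter = 0;
        int  count = 0;
        for (int r = 0; r < rows; ++r) {
            int left = -1, right = -1;
            for (int c = 0; c < cols; ++c)       // scan row for path pixels
                if (img[r * cols + c] == 255) {
                    if (left < 0) left = c;      // first path pixel: left boundary
                    right = c;                   // last path pixel: right boundary
                }
            if (left >= 0) { sumCenter += (left + right) / 2; ++count; }
        }
        if (count == 0) return 0;                // no path found: report zero error
        int pathCenter = static_cast<int>(sumCenter / count);
        return cols / 2 - pathCenter;            // image center = vehicle center
    }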

The ladar based path segmentation is implemented using the function LadarGuidance().

The input to the function is the pointer to the memory location where the ladar data is stored. The

output is the vehicle lateral error in pixel coordinates. The function is called from the function

used to obtain the ladar data. Therefore, the Ladar Guidance function is called as often as the









ladar data is obtained and the ladar algorithm is run in a sequence. The implementation is as

follows:

1. The ladar data in polar coordinates is converted to Cartesian coordinates to determine lateral
and frontal distance to different objects. Only the frontal distance is used to determine the
path. The data is an array of 360 numbers, i.e., 0 to 180 degrees at 0.5 degree increments.

2. The vehicle is initially stationary. The distance value at the 180th data point, or 90 degrees, is
saved as the distance from the ladar to the ground.

3. The frontal distance data is scanned from 0 to 180 degrees. If a data point is less than 50cm
from the ground distance, it is determined to be an object with some height (~50cm) above
the ground. In the grove alleyway, an object of that height is determined to be part of a tree.

4. Five successive points are then confirmed to possess an equal or greater height to ensure that
the originally determined point was not due to noise or stray grass.

5. The location of the closest tree point is determined on the left and right side of the vehicle's
center. The lateral distance corresponding to these locations is determined.

6. If the lateral distance is over 3m, it is determined to be a tree from a subsequent alleyway.

7. From the lateral distances of these tree points on the left and right side of the vehicle, the
required center point of the path is determined. The lateral distance to this center point is the
error of the vehicle in the path.

8. The error is sent to the PID based steering controller. As the steering controller takes pixel
coordinate error as input, the error determined from the ladar is converted to pixel
coordinates using a formula

Lateral error in pixels = 200 - 0.74 * lateral_error (Eq. 4-3)

This formula is experimentally determined from trial and error to correspond to an
equivalent pixel error.

9. If trees are present only on one side of the vehicle, path center is determined at a fixed
distance from the tree on one side. If trees are not present on either side, zero error is reported
till a tree is encountered.
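The steps above can be condensed into the following C++ sketch. The 50cm and 3m thresholds and the Eq. 4-3 conversion are from the text; the geometry, the 3.5 m assumed path width for the one-sided case, and all names are illustrative, and the actual LadarGuidance() in Appendix C also confirms five successive points before accepting a tree:

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Sketch of the ladar path-center computation. 'scan' holds 360 radial
    // distances in cm (0 to 180 degrees at 0.5 degree steps); 'ground' is
    // the ground distance saved while the vehicle was stationary.
    double ladarErrorPixels(const std::vector<double>& scan, double ground)
    {
        const double kPi = 3.14159265358979;
        double left = -1e9, right = 1e9;   // nearest tree on each side, signed cm
        for (int i = 0; i < 360; ++i) {
            if (scan[i] > ground - 50.0) continue;     // not ~50 cm above ground
            double ang = i * 0.5 * kPi / 180.0;
            double lateral = scan[i] * std::cos(ang);  // + right, - left of center
            if (std::fabs(lateral) > 300.0) continue;  // > 3 m: next alleyway's tree
            if (lateral >= 0.0) right = std::min(right, lateral);
            else                left  = std::max(left,  lateral);
        }
        if (left < -1e8 && right > 1e8) return 200.0;  // no trees: zero lateral error
        if (left < -1e8)  left  = right - 350.0;       // one-sided: assumed 3.5 m path
        if (right > 1e8)  right = left  + 350.0;
        double latError = (left + right) / 2.0;        // path center offset, cm
        return 200.0 - 0.74 * latError;                // Eq. 4-3: cm to pixel error
    }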

Sensor Fusion Based Navigation

The development of the Kalman filter based sensor fusion method was described in the

Chapter 3. The code segment for the Kalman filter implementation is presented in Appendix D.

The sensor fusion is implemented in the function Fusion(). The inputs are the measurement










values from vision, ladar, yaw from the IMU and the speed from the speed sensor. The output of

the function is the fused error in pixel coordinates. The variables are intuitively named to be

similar to the variable names used in the algorithm development. The function is called from the

function OnPaint() when both vision and ladar, or at least one of them, is operational. The fuzzy
logic supervisor and divergence detector are implemented as a set of if-then rules identical to the
explanation in Chapter 3. The function is implemented as follows:

1. The variables are initialized. The measurement values for each time step are taken as input to
the measurement vector. The matrices are implemented as two dimensional vectors. The
vectors are implemented as arrays. If this function is called for the first time, the yaw angle is
taken as the initial heading angle of the vehicle. The process and measurement noise
covariance matrices are initialized.

2. The vision and ladar data are checked for range. If the range is unreasonable, the
measurement noise covariance value is increased. This is performed using fuzzy logic if-then
rules similar to the description in Chapter 3.

3. The time update equations 3-17 and 3-18 are first executed to compute the state estimate and
the state estimate error covariance matrix.

4. The measurement update equation 3-20 is executed to compute the gain. The equations 3-21
and 3-22 are then executed to compute the posterior estimate of the state and the posterior
estimate of the state estimate error covariance matrix.

5. The fused error is obtained from the posterior estimate of the state. This error in pixel
coordinates is sent to the low level controller for steering.

6. The innovation vector is computed using equation 3-31. This is used as the input for
detecting divergence.

7. The process noise covariance matrix is updated if divergence is detected.

A large amount of memory is used for the matrices in this implementation. Therefore, the
memory allocation is made before starting the fusion routine and deleted before exiting the
program.
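A heavily compressed sketch of the predict/update cycle in Fusion() is given below, reduced to a single fused lateral-error state. The actual filter uses the state vectors and matrices of Eqs. 3-17 to 3-22; the noise values and the simple range test standing in for the fuzzy logic supervisor are placeholders:

    #include <cmath>

    // One-dimensional sketch of the Kalman filter fusion cycle: x is a
    // single fused lateral-error state and P its variance. Q, Rvision,
    // Rladar and the 200-pixel range test are placeholder values.
    struct Fused { double x; double P; };

    double fuse(Fused& f, double zVision, double zLadar)
    {
        const double Q = 0.01;               // process noise (placeholder)
        double Rvision = 4.0, Rladar = 1.0;  // measurement noise (placeholder)
        if (std::fabs(zVision) > 200.0) Rvision *= 100.0;  // distrust out-of-range
        if (std::fabs(zLadar)  > 200.0) Rladar  *= 100.0;  // readings ("fuzzy" rule)

        f.P += Q;                            // time update (predict)

        // Sequential scalar measurement updates, one per sensor
        auto update = [&f](double z, double R) {
            double K = f.P / (f.P + R);      // Kalman gain
            f.x += K * (z - f.x);            // innovation-weighted correction
            f.P *= (1.0 - K);                // posterior variance
        };
        update(zVision, Rvision);
        update(zLadar, Rladar);
        return f.x;                          // fused error in pixel coordinates
    }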

Software Architecture for eGator Guidance

The hardware architecture of the guidance system for the eGator was presented in Chapter

3. The high level algorithms are implemented in a PC running Windows XP operating system.










The low level algorithms are implemented in a PC104 based computer using the Linux (SUSE 10)

operating system.


[Diagram: Windows PC (ladar, vision, Kalman filter sensor fusion, headland navigation, speed) connected over RS 232 to the Linux PC104 (Pure Pursuit steering via CAN, speed control via analog output)]

Figure 4-3. Software architecture of the eGator guidance system

The software architecture of the guidance system used with the eGator is shown in the

Figure 4-3. The lateral error of the vehicle and the required speed of the vehicle is sent from the

high level computer to the low level computer using RS232 serial communication. The low level

controller outputs the steering angle as CAN commands and the required command for the

digital to analog card to output the voltage to control speed. The low level controller also

monitors a serial port for information from the vehicle's wheel encoder to perform dead

reckoning and speed control. The high level computer and the software is the same as the one

used with the tractor. Therefore, the high level software is identical in operation using dialog box

based control. The low level software is implemented in C++ using the GNU C++ compiler. The

low level software execution is started from the command prompt. The low level software is then

turned off using the commands from the high level controller. For monitoring the activities of the

low level software, informational text strings are printed in the terminal window. A hardware

switch is included between the low level computer and the drive motor controller. As soon as the

low level software is started, it polls the serial port for error information from the high level










computer. Therefore, the hardware switch should be turned to autonomous mode only when both

the high level and low level software are operational.

Pure Pursuit Steering

The algorithm for the Pure Pursuit steering was presented in Chapter 3. The code segment

for the software implementation is presented in Appendix E. The algorithm is operational as

soon as the low level software is started. The input to the algorithm is the lateral error in pixel

coordinates from the high level controller. The output of the algorithm is the required steering

angle. The software implementation is as follows:

1. The lateral error of the vehicle is obtained in pixel coordinates.

2. The error is then determined to correspond to either a left turn or a right turn.

3. The error in pixel coordinates is converted to distance coordinates. The look-ahead distance
is chosen as 6m. At 6m, a calibration was made that 180 pixels in the image corresponds to
2.43m of lateral distance. This calibration is used to convert error in pixels to meters.

4. The error is used to compute the required radius of curvature of the vehicle using the
equation 3-35.

5. The steering angle is computed from the radius of curvature using equation 3-38 and 3-39.

6. The steering angle is sent to the CAN controller using CAN commands.
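A minimal sketch of steps 1 through 5 follows. The 180 pixels = 2.43 m calibration at the 6 m look-ahead is from the text; the curvature and steering relations below are the standard pure pursuit and bicycle-model forms, used only as stand-ins for Eqs. 3-35, 3-38 and 3-39, and the 1.6 m wheelbase is an assumed value:

    #include <cmath>

    // Sketch of the Pure Pursuit steering computation. All constants other
    // than the pixel calibration and look-ahead distance are assumptions.
    double steeringAngleDeg(double errorPixels)
    {
        const double kPi = 3.14159265358979;
        const double lookAhead = 6.0;            // m
        const double mPerPixel = 2.43 / 180.0;   // calibration at 6 m
        const double wheelbase = 1.6;            // assumed, m

        double x = errorPixels * mPerPixel;      // lateral offset of goal, m
        if (std::fabs(x) < 1e-6) return 0.0;     // already on the path

        // Pure pursuit: radius of the arc through the look-ahead point
        double radius = (lookAhead * lookAhead) / (2.0 * x);
        // Bicycle model: steering angle for that turning radius
        return std::atan(wheelbase / radius) * 180.0 / kPi;  // + right, - left
    }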

Headland Turning

The development of headland turning algorithms is discussed in Chapter 3. The algorithms

are implemented in the PC in the Visual C++ environment. The important code segments are

presented in Appendix F. This software is implemented as functions called from the vision based

alleyway navigation function, when the headland approach is determined using vision. When

approaching the headland, first the sweeping ladar algorithm is called to determine the exact

location of the headland. The sweeping ladar algorithm is implemented in the function

treedladarprocess(). The steps implemented in the function are as follows:










1. The ladar data is converted from 3d polar coordinates to 3d rectangular coordinates using
equations 3-41, 3-42 and 3-43.

2. The relevant data is cropped to eliminate unreasonable distances in the x,y, and z directions.

3. To reduce computation time, the data in the 3d Cartesian space is projected on the 2d
Cartesian plane.

4. The data in the 2d is treated similar to an image to determine the locations of the trees.

5. From the location of the trees, the headland location is precisely determined.

When the vehicle is at the end of the alleyway, the turning functions are called. These are

implemented in two functions HeadlandU() for the u-turn and Headland3pt() for the switch-back

turn. The choice of whether u-turn or switch-back turn is performed is specified in the program

by the user. The input to the two functions is the yaw angle from the IMU and the output is the

steering command to the low level controller specified as error in pixel coordinates. The

implementation of the turns is based on the algorithm development presented in Chapter 3. The

u-turn is implemented as follows:

1. The machine vision algorithm is turned off during the headland turn. Initialization of
variables is performed. The angle at which the vehicle exits the alleyway is saved. The turn
will be performed such that the vehicle turn is referenced to this saved angle.

2. Corrections are then calculated. This can be explained using an example. It is to be noted that
the IMU outputs yaw angles with a range of 0 to 360 degrees in the clockwise direction. If
the vehicle exits the alleyway at an angle of, say, 30 degrees and an anti-clockwise u-turn is
to be performed, the final output angle from the IMU after the turn is performed would be
210 degrees, whereas the mathematical calculation in software would be 30-180 or -150
degrees. To avoid this difference due to the cyclic nature of the data, the appropriate
corrections are calculated.

3. If a left u-turn is required, as soon as the vehicle exits the alleyway, a right turn is made until
the vehicle turns by a specified angle. Note that the vehicle also moves forward while
turning. This turn in the opposite direction is done to execute a wide u-turn.

4. Then, the left turn command is given. The yaw angle is continuously monitored. When the
yaw angle is close to 180 degrees with reference to the original saved angle, the control is
switched to the vision and ladar based alleyway navigation functions. The alleyway
navigation functions make the vehicle enter the alleyway and continue navigating the next
alleyway.









The switch-back turn is implemented similar to the u-turn function. The difference is that

the initial turn in the opposite direction is not made and after the vehicle has turned by 90

degrees with reference to the initial saved angle, the vehicle is reversed for a specified distance

before the turn is continued. The 'Gator.Write' command is for sending the data to the low level

controller using serial communication.
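The cyclic yaw correction described in step 2 of the u-turn can be expressed compactly. The following sketch, assuming the 0-360 degree clockwise IMU convention stated above, wraps the signed turn completed so far into (-180, 180]:

    #include <cmath>

    // Sketch of the heading-difference correction: e.g., exiting at 30
    // degrees and later reading 240 degrees yields -150 (150 degrees of
    // anti-clockwise turn completed).
    double turnSoFarDeg(double exitYawDeg, double currentYawDeg)
    {
        double diff = std::fmod(currentYawDeg - exitYawDeg, 360.0);
        if (diff > 180.0)   diff -= 360.0;   // wrap into (-180, 180]
        if (diff <= -180.0) diff += 360.0;
        return diff;                         // + clockwise, - anti-clockwise
    }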

Open Field Navigation and Obstacle Detection

The development of the open field navigation and obstacle detection algorithms were

discussed in the Chapter 3. The algorithms are implemented in the PC in the Visual C++

environment. The open field navigation uses a GPS based navigation algorithm for steering the

vehicle whereas the obstacle detection algorithm is based on the ladar and is used to stop the

vehicle when an obstacle is present. The important sections of the code are presented in

Appendix G.

Open Field Navigation

In the GPS based navigation, the algorithm determines how much the vehicle is to be

steered to reach a way point. The Pure Pursuit steering discussed earlier uses error information in

pixel coordinates to compute the steering angle of the vehicle. Therefore, the steering determined

by the GPS based navigation algorithm converts the required steering to error information and

passes on to the Pure Pursuit steering algorithm in the PC104 computer. The software is

implemented as follows:

1. The way point file is opened and read to acquire the latitude and longitude coordinates of the
way points. The values are stored in an array. The file is then closed. This is implemented in
the function ReadGpsCoord().

2. To navigate the way points, the function GPSnavigate() is called. The function includes a
loop which runs till all the way points have been reached.

3. If all the points have not been reached, the distance to the way point, 'GPSdistance', and the
required heading of the vehicle 'GPSbearing' are first calculated.










4. The IMU is mounted such that the device points towards the geographic south. As mentioned
earlier, the GPS based navigation algorithm provides the heading with reference to the
geographic north. Therefore to use the same reference for both, the required heading
calculated using the GPS coordinates is shifted by 180 degrees.

5. The heading required of the vehicle, calculated using the GPS coordinates, and the current
heading of the vehicle are compared to determine the direction of steering and the amount of
steering. This is implemented in a series of if-then conditions.

6. The calculated steering angle is then converted to error in pixel coordinates. The data is
filtered using median filter to eliminate noise.

7. The pixel error is sent to the PC104 computer running the Pure Pursuit steering algorithm.
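A sketch of the geometry in steps 3 and 4 is given below. It uses a flat-earth (equirectangular) approximation, which is adequate over grove-scale distances but is an assumption; the actual formulas behind 'GPSdistance' and 'GPSbearing' are in Appendix G:

    #include <cmath>

    // Sketch of the way point geometry. The 180 degree shift of step 4
    // aligns the north-referenced bearing with the south-pointing IMU.
    void waypointGeometry(double latDeg, double lonDeg,      // vehicle position
                          double wpLatDeg, double wpLonDeg,  // way point
                          double& distM, double& headingDeg)
    {
        const double kPi = 3.14159265358979;
        const double R = 6371000.0;                  // mean earth radius, m
        double dLat = (wpLatDeg - latDeg) * kPi / 180.0;
        double dLon = (wpLonDeg - lonDeg) * kPi / 180.0
                      * std::cos(latDeg * kPi / 180.0);
        distM = R * std::sqrt(dLat * dLat + dLon * dLon);       // 'GPSdistance'
        double bearing = std::atan2(dLon, dLat) * 180.0 / kPi;  // from north, deg
        headingDeg = std::fmod(bearing + 180.0 + 360.0, 360.0); // IMU reference
    }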

Obstacle Detection

In the obstacle detection algorithm, whenever an obstacle is detected, a unique command is

sent to the PC104 computer to stop the vehicle. The obstacle detection algorithm is implemented

as follows:

1. The obstacle detection is implemented in the function LadarObstacleDetection().
Initializations are first performed.

2. The ladar data is obtained in the form of scan angles and radial distances 'SIC.MVA[i]'. The
radial distance is then converted to frontal distance.

3. When the vehicle is stationary initially, the frontal distance corresponding to the 90 degree
scan angle is saved as the distance to the ground in front of the vehicle.

4. The data is scanned from 70 degrees to 110 degrees to look for data points less than 50cm
from the ground distance. If such a data point is found and if five consecutive points are also
less than 50cm from the ground distance, then the presence of an obstacle is determined. The
range from 70 degrees to 110 degrees spans 40 degrees and covers the area just in front of the
vehicle.

5. If an obstacle is found, a stop command is sent to the PC104 computer to reduce the speed to
zero.

6. The function is called repeatedly as long as the GPS based navigation is operational.
Therefore, if the obstacle is removed, the stop signal is not sent and the vehicle continues to
move forward.
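The test in steps 3 and 4 can be sketched as follows, assuming 0.5 degree angular steps (so index = 2 x degrees); the array name and structure are illustrative stand-ins for the code in Appendix G:

    #include <vector>

    // Sketch of the obstacle test: within the 70 to 110 degree window, a
    // reading more than 50 cm below the saved ground distance, confirmed
    // by the next five readings, flags an obstacle.
    bool obstaclePresent(const std::vector<double>& frontal, double ground)
    {
        for (int i = 140; i <= 220; ++i) {              // 70 to 110 degrees
            if (ground - frontal[i] < 50.0) continue;   // no 50 cm drop here
            bool confirmed = true;                      // check 5 next points
            for (int j = i + 1; j <= i + 5 && j < (int)frontal.size(); ++j)
                if (ground - frontal[j] < 50.0) { confirmed = false; break; }
            if (confirmed) return true;                 // stop command is sent
        }
        return false;                                   // clear: keep navigating
    }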










CHAPTER 5
DEVELOPMENT OF MACHINE VISION AND LASER RADAR BASED AUTONOMOUS
VEHICLE GUIDANCE SYSTEMS FOR CITRUS GROVE NAVIGATION

Introduction

Florida supplies over 80 percent of the United States' supply of citrus. Over the years, the

labor supply for citrus harvesting has declined. Immigration restrictions have limited the

movement of migrant workers. The citrus industry is also facing increased competition from

overseas markets. As a result, the need for automation is being felt in the citrus industry. An

important part of the automation process is vehicle guidance. Sample autonomous vehicle

applications may include harvesting, disease or nutrient deficiency monitoring, mowing,

spraying and other tasks.

There are numerous autonomous and tele-operated vehicle systems described in the

literature. Tele-operation has been used for guiding a HST drive vehicle by Murakami et al.

(2004). The major challenges encountered in tele-operation are the time delay in communication

and full time attention of a human. Yekutieli and Pegna (2002) used a sensing arm, to sense

plants in the path, for guidance in a vineyard. However, using an arm would require citrus groves

to be even with continuous canopy. There are additional concerns about damaging the tree

branches.

Ultrasonic sensors have been used for guidance in greenhouses, but they require that the

target be perpendicular to the sensor for the ultrasonic waves to be reflected back properly

(Burks et al., 2004). Dead reckoning is widely used in combination with other sensors for

autonomous vehicles. Nagasaka et al. (2002) and Kodagoda et al. (2002) have used rotary

encoders for detecting vehicle position. However wheel slip can cause significant error in

distance measurements.










GPS, in combination with inertial navigation systems, are used in other areas of

agricultural vehicle guidance and precision agriculture. Both Real Time Kinematic (RTK) GPS

and Differential GPS (DGPS) can be satisfactory and allow accurate real time measurement.

Nagasaka et al. (2002), Benson et al. (2001) and Noguchi et al. (2002) have found that RTK GPS

receivers give very accurate results. Ehsani et al. (2003) evaluated the dynamic accuracy of some

low cost GPS receivers with the position information from RTK GPS as reference. They found

that these receivers had an average absolute cross track error around 1 m, when traveling in a

straight line. GPS cannot be effectively used for positioning in citrus applications, since the

vehicle frequently moves under tree canopy, which could block the satellite signals to the GPS

receiver. Moreover, a system using GPS for guidance requires that a predetermined path be given

for the vehicle to follow. Consequently, significant time has to be spent in mapping its path.

Laser radar (ladar) has been used for ranging and obstacle avoidance. Carmer and Peterson

(1996) discussed the use of laser radar (ladar) for various applications in robotics. Autonomous

vehicle navigation was discussed as a promising application for ladar, due to its ability to

accurately measure position. Gordon and Holmes (1998) developed a custom built ladar system

to continuously monitor a moving vehicle's position. The testing was done indoors. Ahamed et

al. (2004) used a ladar for developing a positioning method using reflectors for infield road

navigation. They tested differently shaped reflectors to determine the accuracy in positioning.

Ladar has been used for navigating a small vehicle through an orchard (Tsubota et al., 2004). A

guidance system using ladar was found to be more stable than using a GPS in a citrus orchard

setting. In the research mentioned above, ladar was tested outdoors in the absence of high dust

conditions. It is expected that the operation of ladar in dust, fog and similar conditions which

block light, would result in reduced performance.









Machine vision has been applied in mobile robots by numerous researchers. Its low cost

and good performance have made it a good candidate for the main guidance sensor. Machine

vision and ladar guidance system performance can be comparable (Subramanian et al., 2004).

Misao (2001) used machine vision for an automatic steering system on a field sprayer. A video

camera was used to acquire the image of the travel path with red targets. Image processing

algorithms determined the distance from the current vehicle position to the target, and then the

actual position to the desired vehicle path was compared and corrected by automatic steering

control. Han et al. (2002) developed a row segmentation algorithm based on k-means clustering

to segment crop rows. This information was then used to steer a tractor. The guided tractor was

able to perform field cultivation in both straight and curved rows. Okamoto et al. (2002)

developed an automatic guidance system for a weeding cultivator. A color CCD camera acquired

the crop row images, and by processing the images in the computer, determined the offset

between the machine and the target crop row. The weeding machine was then steered through the

crop row using an electro-hydraulic steering controller.

A good control system is necessary irrespective of the sensor used for guidance.

Feedforward + PID control worked well for guiding a tractor through crop rows (Kodagoda et

al., 2002). PID control has been used for guiding a grain harvesting tractor (Benson et al., 2003).

In that research, PID was used to calculate the actuator command signal based on the heading

offset. The performance of the controller was comparable to that of manual steering. Cho and Ki

(1999) used a fuzzy logic controller and machine vision for guiding an autonomous sprayer

vehicle through orchards. The input information to the fuzzy logic controller was given by both

machine vision and ultrasonic sensors.










To date, there has been minimal success in developing commercial autonomous navigation

systems for citrus grove applications. However, other similar applications have provided

valuable insights for this research. The vehicle used in this research was a John Deere 6410

tractor. Machine vision and laser radar were used as the primary guidance sensors. PID control

was selected for controlling the steering of the autonomous vehicle, because it has performed

very well in previous mobile robotics research.

The overall objective is to design a guidance system which will make the tractor capable of

navigating through an alleyway of a citrus grove. Before testing in a citrus grove, it was decided

to test the vehicle in a portable path constructed of hay bales. The following specific objectives

were selected for this research:

* Develop electro-hydraulic steering circuit for the vehicle.

* Develop a mathematical model for the vehicle's dynamics and design a PID control for
steering control.

* Develop two algorithms for path finding, one using machine vision and the other using
laser radar.

* Test the guidance system's performance in a test track and confirm guidance in a citrus
grove.

Materials and Methods

A John Deere 6410 tractor was selected for this study. It is well suited for such

environments and might be commonly used in spray applications. The width of the tractor was

about 2.5 m. To accommodate electronic steering, the tractor was retrofitted with a Sauer

Danfoss PVG 32 proportional servo valve. The valve was plumbed in parallel with the existing

hydrostatic steering system, as shown in Figure 5-1. The valve required an analog voltage signal

between 0 and 10 V as input.

























Figure 5-1. Electro-hydraulic retrofit of the tractor for automatic guidance

A Stegmann incremental rotary encoder was mounted on the front steering cylinder for

wheel angle feedback. A common 2.4 GHz P4 processor running Windows 2000 operating

system was used for processing the vision and ladar algorithms. A "Tern 586 engine"

microcontroller was used for executing real-time control of the servo valve and encoder feedback

loop. An amplifier circuit was used to scale the control voltage from the microcontroller to the

requirements of the valve. The guidance system architecture is shown in Figure 5-2.

The encoder was calibrated by turning the wheels to different angles and recording the

number of pulses given by the encoder. Calibration was determined as 7 pulses/ degree of turn.

Power was taken from the tractor's 12V DC battery and inverted to 120V AC for running

the PC and the ladar. The 12V DC tractor supply was used to power the valve and the

microcontroller.





















Figure 5-2. Guidance system architecture of the vehicle

The camera used for machine vision was a Sony FCB-EX780S "block" single CCD analog

color video camera with a standard lens and resolution of 640x480. A frame-grabber board was

used to convert the analog signals to digital images. The ladar used was the SICK LMS 200,

which has a maximum sweep angle of 180 degrees and a maximum range of 80 m. RS 232 serial

communication at 115200 Baud was used to communicate with the ladar. A Starfire Differential

DGPS receiver with 30 cm accuracy was used for determining the vehicle dynamics under open

field conditions. The camera, ladar and DGPS receiver were mounted on top of the cab. The

camera and the ladar were mounted at a 45 degree angle looking 1.5 m in front of the tractor as

shown in Figure 5-3.















Figure 5-3. Camera and ladar mounted on top of the tractor cab









For machine vision, color was used as the discriminator for segmenting the path. To

account for the varying weather conditions, images were collected over a period of 6 days in two

months from morning to evening at half hour intervals. Based on this database of images, a

segmentation algorithm was developed as shown in Figure 5-4. From the set of images, three

types of conditions were observed that require the threshold value to be changed. These are

cloudy day when the trees are darker than the path; bright sunny day when the trees are darker

than the path but all pixel intensity values are elevated; and early morning and evening, when the

sunlight causes the trees on one side of the row to be brighter than the path and the trees on the

other side to be darker than the path. Two R, G, B threshold values are chosen from the image

database corresponding to these conditions. To find these R, G, B threshold values, first the set

of images corresponding to the early morning and evening conditions is taken. The highest of the

R, G, B pixel values in the trees on the side which is darker than the path is chosen as the

threshold value. Next the sunny day condition images are taken and the highest of the R, G, B

pixel values in the trees on both sides of the path is taken as the threshold value. To

accommodate these conditions automatically, an adaptive thresholding method was used as

shown in Figure 5-4.



















































[Flowchart box: send error to the microcontroller via serial communication]


Figure 5-4. Machine vision algorithm flowchart

Then, a series of morphological operations were performed to clearly segment the path and

a line was fit for the boundaries, using the least squares method. The error was calculated as the

distance from the path center. Camera calibration was done to convert pixel distance to true

distance. This was done by marking known distances on the ground in the camera's field of view














and the number of pixels corresponding to these distances was counted. To eliminate fluctuation

in path tracking due to natural light and camera noise, a time based outlier elimination routine

was added to the error calculation. This routine assumes that the error data in a short interval of

time, of the order of 1 sec, is normally distributed. When a new error value is found to lie outside

two standard deviations of the data from the previous time interval, it is determined as noise and

is not used to change the steering for the present time step. If however successive error data are

in the same range, then they are taken into account for correcting the steering. An example of

grove path estimation using the machine vision algorithm is shown in Figure 5-5. Note the path

boundary depicted by the superimposed straight lines along the tree boundary. For navigation

experiments on the test path, the color threshold for the machine vision algorithm was adapted
for detecting hay bales instead of trees.
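The time-based outlier elimination routine described above can be sketched as follows. The window length and the persistence count of three successive readings are illustrative assumptions; only the two-standard-deviation test is from the text:

    #include <cmath>
    #include <deque>

    // Sketch of the outlier elimination: a new error is ignored if it
    // falls outside two standard deviations of the errors from roughly
    // the previous second, unless successive readings persist there.
    class OutlierFilter {
        std::deque<double> win;   // recent accepted errors (~1 s of data)
        int pending = 0;          // consecutive rejected-but-consistent readings
    public:
        bool accept(double e)
        {
            if (win.size() >= 20) win.pop_front();
            if (win.size() < 5) { win.push_back(e); return true; }  // warm-up
            double mean = 0.0, var = 0.0;
            for (double w : win) mean += w;
            mean /= win.size();
            for (double w : win) var += (w - mean) * (w - mean);
            double sd = std::sqrt(var / win.size());
            if (std::fabs(e - mean) <= 2.0 * sd || ++pending >= 3) {
                pending = 0;
                win.push_back(e);
                return true;      // use this error for steering
            }
            return false;         // treat as noise for this time step
        }
    };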










Figure 5-5. Machine vision results for citrus grove alleyway. A) Raw image. B) Tree canopy
segmentation. C) Path boundary.

The laser radar navigation algorithm employed a threshold distance based detection of hay

bales. The flowchart of the algorithm is shown in Figure 5-6. The plot of the radial distance

measured by the ladar for different angles, when driving through the test path, is shown in Figure

5-7. The discontinuities in the plot indicate the location of the hay bales. While scanning the field

of view using laser radar, if such discontinuities are found in the data, they are marked as hay

bales. The path center was determined as the center of the path, between the hay bales on either

side.





























[Flowchart boxes: continue moving window and look for discontinuity; find path center as midpoint between hay bales]


Figure 5-6. Ladar algorithm flowchart













[Figure: radial distance vs. angle plot]

Figure 5-7. Radial distance measured by the laser radar in the hay bale path

For the determination of the vehicle's dynamics, open loop frequency response tests were

performed. The tests were conducted in an open field without any trees or buildings blocking the

signal to the DGPS receiver. The receiver communicated with the PC via RS 232 serial

communication. The steering frequency response test was performed from 0.07 Hz to 0.3 Hz.

Frequencies below 0.07 Hz caused the tractor to veer out of the test area and frequencies above

0.3 Hz were thought to be detrimental to the tractor. It was observed that the vehicle's position

information obtained using the DGPS receiver








showed a sinusoidal response at the same frequency as the steering angle. The lateral

displacement of the vehicle for a given change in steering angle was recorded as the gain, where

the lateral displacement is the displacement of the vehicle in a direction perpendicular to the

direction of initial motion. The lateral displacement of the vehicle was calculated from the















latitude and longitude data obtained from the DGPS receiver using ArcView 3.3 (ESRI Inc.,


Redlands, CA). The accuracy of ArcView 3.3 was verified with actual manually measured


distance for a few test runs. The gain was plotted against frequency in a bode plot as shown in


Figure 5-8. The response was observed to be linear. Visually fit lines were used to model this


response.



Figure 5-8. Open loop theoretical and actual frequency response of the sinusoidal vehicle
dynamics test (Phase = -180 deg)


The sinusoidal vehicle dynamics data was used to calculate the transfer function between


the steering angle and lateral displacement as


G(s) = Lateral displacement(s) / Steering angle change(s) = k / s^2 (Eq. 5-1)



The gain k was speed sensitive and was determined as 0.02 (1.8 m/s), 0.07 (3.1 m/s) and

0.13 (4.4 m/s). This model agreed with the model reported by Stombaugh et al. (1999).


Based on the determined theoretical model, a PID control system was designed (Figure 5-


9). The PID gains were tuned by simulation in Simulink, using the Ziegler Nichols method of










tuning for a closed loop system. Tuning was optimized for zero steady state error and low

overshoot. Figure 5-10 shows the 1 m step response when simulated for the 3.3 m/s speed.


[Block diagram: desired displacement -> error -> PID controller -> desired angle -> vehicle -> lateral displacement]

Figure 5-9. Simulated vehicle control system block diagram


[Figure: step response, displacement vs. time (sec)]

Figure 5-10. Simulated 1 m step response of the vehicle at 3.3 m/s
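The closed loop of Figure 5-9 can be checked numerically with a simple Euler-discretized sketch of Eq. 5-1 under PID control. The PID gains below are placeholders for illustration only, not the Ziegler-Nichols values used in the Simulink study:

    #include <cstdio>

    // Numerical sketch: a discrete PID controller driving the
    // double-integrator plant of Eq. 5-1 toward a 1 m lateral step,
    // with k = 0.07 (the 3.1 m/s gain). Kp, Ki, Kd are placeholders.
    int main()
    {
        const double k = 0.07, dt = 0.01, setpoint = 1.0;
        const double Kp = 8.0, Ki = 0.5, Kd = 12.0;   // placeholder gains
        double y = 0.0, v = 0.0;                      // position, velocity
        double integ = 0.0, prevErr = setpoint;

        for (int i = 0; i <= 6000; ++i) {             // 60 s of simulated time
            double err = setpoint - y;
            integ += err * dt;
            double u = Kp * err + Ki * integ + Kd * (err - prevErr) / dt;
            prevErr = err;
            v += k * u * dt;                          // plant: y'' = k * u
            y += v * dt;
            if (i % 1000 == 0) std::printf("t = %4.1f s  y = %.3f m\n", i * dt, y);
        }
        return 0;
    }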

Experimental Procedure

Typical citrus grove alleyways are practically straight rows. It was felt that testing the

vehicle in straight and curved paths at different speeds would be a more rigorous test of the

guidance system's performance than testing in straight grove alleyways. For this purpose, a

flexible wall track was constructed using hay bales to form a test path. The hay bales provide a










physically measurable barrier, which aids the operation of the ladar guidance system, and color
contrast with the grass, which is useful for vision based guidance. In addition, when testing a large

tractor, minor losses would occur if the tractor runs over a barrier of hay bales. The control

algorithms used with machine vision or ladar based guidance are practically identical for either

test track. The only modification in the vision algorithm was the threshold value for image

segmentation. These threshold values were determined for both paths using techniques described

earlier. The PID controller and the encoder performance are independent of the physical

environment around the vehicle. The hay bale width was 45 cm and the boundary height was 90

cm after stacking hay bales two deep. Two types of path were used, a straight path and a curved

path. The length of the straight path was 22 m. A 10m extension was added to the straight path

to form the curved path. The radius of curvature was 17 m at the curve. The path width was 3.5

m throughout the length. The paths used in this study to evaluate the guidance system

performance are shown in Figures 5-11A and 5-11B.













Figure 5-11. Guidance system test path. A) Straight. B) Curved.

Experiments were conducted to check the robustness and accuracy of the vision and laser

radar guidance systems in navigating the different test paths at different speeds. Experiments

were conducted for three different speeds, 1.8 m/s, 3.1 m/s and 4.4 m/s. These speeds









corresponded to the perceived slow, medium and fast speeds for a tractor operating in a grove.

The speeds were measured using an ultrasonic ground speed sensor.

The vehicle was manually driven to the starting position, resulting in different initial offset

and heading angles for each repetition. Once positioned, the vehicle was started under

autonomous control and allowed to navigate down the path. Three repetitions were performed for

each experiment. A rotating blade was attached to the draw bar, which marked a line on the

ground as the vehicle moved. This line represented the path center traveled by the tractor. The

errors were manually measured after each run using the distance from the marked line to the hay

bale boundary. The error measurements were accurate to 0.1 cm. The measurements were taken

at regular intervals of 30 cm. Due to the unpredictable nature of the starting and ending data, the

first 5% and last 5% of the data record was discarded for statistical analysis. However, the plots

show the full data record, which illustrates the random nature of the vehicle starting position.

Data points were taken from each run as described above in order to calculate the path root

mean square error, standard deviation, maximum error and average error. The RMS error

equation is shown in Equation 2


RMS error = sqrt((sum of e_i^2) / N) (Eq. 5-2)


where e_i is the path error and N is the total number of data points. These measures were

used to compare the performance of the two guidance sensors, the two different path

configurations and the three speeds.

After achieving satisfactory performance in the test path, trials were conducted in an

alleyway of the citrus grove. For initial grove trials, only machine vision was used because it
adapts easily to grove conditions, whereas ladar requires that additional data










filters be designed to compensate for noise due to random orientation of the tree leaves. Future

development is planned to test ladar in the grove. The machine vision guidance system was used

to guide the tractor through an alleyway of a citrus grove on the University of Florida campus as

shown in Figure 5-12. The average path width was 4.6 m. The tractor was driven to the grove

and started in the automatic mode at the beginning of an alleyway. The guidance system was

allowed to guide the tractor through the entire length of the alleyway. The performance was

visually observed.
















Figure 5-12. Vehicle in the citrus grove alleyway

Results and Discussion

Tests were conducted on each of the sensor and track conditions to observe the behavior of

the vehicle under various speeds. During each trial run, the path position and error were recorded

manually using a tape measure. The RMS error, average error, standard deviation of the error

and the maximum error were calculated for each experiment, by averaging the data over the three

repetitions of each condition. The results were then compared for the different conditions. The

path error, which is the deviation from the centerline of the aisle way, was plotted over the length

of the test path for the different conditions. Path errors were adjusted for local variation in hay

bales. The performance measures are shown in Table 5-1 for straight path and Table 5-2 for











curved path. It should be noted that Table 5-2 data is broken down to the straight, curved and


overall path errors.


Table 5-1. Performance measures of the vehicle's guidance system obtained from the
experiments conducted in the straight test path

Path      Sensor  Speed (m/s)  Average Error (cm)  Maximum Error (cm)  Standard Deviation (cm)  RMS Error (cm)
Straight  Vision  1.8          2.0                 3.9                 0.8                      2.2
Straight  Vision  3.1          2.8                 5.9                 1.2                      3.1
Straight  Vision  4.4          2.8                 6.1                 1.5                      3.2
Straight  Ladar   1.8          1.6                 3.3                 0.7                      1.8
Straight  Ladar   3.1          1.6                 3.9                 0.8                      1.8
Straight  Ladar   4.4          3.1                 6.1                 1.6                      3.5



Figure 5-13 shows the performance of the machine vision guidance system at the three


different test speeds. The average error was less than 3 cm for all speeds as noted in Table 5-1.


The performance was better at 1.8 m/s than at the two higher speeds. As shown in Figure 5-13,


the PID control is attempting to drive the error to zero. The RMS error is 3.2 cm at the fastest


speed and demonstrates a good control performance visually similar to human driving. The


machine vision guidance performed well in the straight path. The performance was better at


lower speeds.


[Figure: path error vs. travel distance plots at the three speeds]

Figure 5-13. Performance of the machine vision guidance in the straight path. A) 1.8 m/s. B) 3.1 m/s. C) 4.4 m/s.

Figure 5-14 shows the performance of the laser radar guidance system at the three different

speeds. Once again, the PID control is driving the error around zero. The average error at the

slowest speed (1.6 cm) is almost half the error at the highest speed (3.1 cm). Laser radar is a more

accurate measuring device than machine vision, resulting in lower tracking errors at the lower

speeds. The machine vision algorithm was run at 20 Hz, whereas the laser radar refresh rate was

only 6 Hz; at the highest speed the laser radar guidance performance therefore degrades, while the






vision guidance system is not severely affected. By using a high speed serial communication

card (500 KBaud), the laser radar refresh rate can be increased and therefore its performance at

speeds 4.4 m/s and higher, can be improved. None the less, the RMS error was only 3.5 cm at the

fastest speed.



Figure 5-14. Performance of the laser radar guidance in the straight path. A) 1.8 m/s. B) 3.1 m/s.
C) 4.4 m/s.

Figure 5-15 shows the performance of the vision and laser radar guidance systems at a

speed of 3.1 m/s in the curved path. The performance of the two systems was visually as good as

a human driving the tractor. The vision guidance had an average error over the entire track of 2.8

cm whereas the laser radar guidance system had an average error of 2.5 cm as shown in Table 5-

2. The maximum error, rms error and standard deviation of machine vision based guidance,

when run at 3.1 m/s in curved path seems to have done better than in the straight path. It is to be

noted that the curved path consists of a straight section followed by a curved section. The

camera's field of view is smaller than that of the ladar, and therefore when the vehicle gets

to the curved section, only one side of the path boundary is visible in the field of view. The

machine vision then tries to guide the vehicle at a constant distance from the visible boundary. This

can be observed in Figure 5-15A (vision in curved path at 3.1 m/s). Since the curve starts

somewhere around 2100 cm travel distance, Figure 5-15A shows that the vision-based PID control

does not attempt to drive the error to the center of the path once the tractor enters the curve.

However, the ladar-based PID control in Figure 5-15B does continue to cross the zero error path

center in the curve section. The maximum error, rms error and standard deviation of error in this

curved section are lower than that for the straight section. When averaged over the entire path,

the error parameters appear lower for the curved path than for the straight path. The laser radar

guidance system's overall accuracy was slightly better than that of the vision guidance when










considering average error. As before, it performed better in the straight path section of the path,

but was less accurate in the curve than the vision system. It also seems to exhibit larger swings

about zero error than the vision system as evident by the larger maximum error. This may be due

to some natural dampening that occurs in the less accurate vision measurement algorithm. This

can be attributed to the fact that the laser radar is more accurate at detecting the path center than

the vision guidance.











Figure 5-15. Performance in the curved path at 3.1 m/s. A) machine vision guidance. B) laser
radar guidance.

Table 5-2. Performance measures of the vehicle's guidance system obtained from the
experiments conducted in the curved test path at 3.1 m/s

Sensor  Path                     Average Error (cm)  Maximum Error (cm)  Standard Deviation (cm)  RMS Error (cm)
Vision  Straight section (22 m)  2.8                 5.1                 1.2                      3.1
Vision  Curved section (10 m)    2.7                 3.9                 0.4                      2.7
Vision  Overall (32 m)           2.8                 5.1                 0.9                      2.9
Ladar   Straight section (22 m)  1.6                 3.9                 0.8                      1.8
Ladar   Curved section (10 m)    4.1                 6.1                 2.9                      4.9
Ladar   Overall (32 m)           2.5                 6.1                 1.6                      2.9

The good performance in the hay bale track encouraged trials in the citrus grove alleyway.

The guidance system successfully guided the tractor through the alleyway. Initial trials using

machine vision for guidance showed a lot of promise. The machine vision algorithm clearly

segmented the path to be traversed. However, the non-uniformity of the canopy surface caused

the controller to over-compensate for the undulations in the canopy surface, resulting in a more zigzag










transit of the alleyway. Control enhancements are planned to further improve the guidance

system performance in the grove.

Conclusions

Machine vision and laser radar based guidance systems were developed to navigate a

tractor through the alleyway of a citrus grove. A PID controller was developed and tested to

control the tractor using the information from the machine vision system and laser radar. A

flexible curvature track was built using hay bales as boundaries and the vehicle was tested under

straight path and curved path configurations. Tests were conducted at three different speeds of

1.8 m/s, 3.1 m/s and 4.4 m/s. Path tracking comparisons were made between the vision-based

control and a ladar-based control. The traversed path was marked on the ground and measured

manually.

Error comparisons were made using RMS error and mean instantaneous error. It was found

that the ladar-based guidance was the better guidance sensor for straight and curved paths at

speeds of up to 3.1 m/s. Communication speed between the laser radar and the computer was a

limiting factor for higher speeds. This problem can be solved by a faster communication speed

using a high speed serial communication card in the PC. This card was not available during this

research. Machine vision based guidance showed acceptable performance at all speeds and

conditions. The average errors were below 3 cm in most cases. The maximum error was not

more than 6 cm in any test run. These experiments demonstrated the accuracy of the guidance

system under test path conditions and successfully guided the tractor in a citrus grove alleyway.

However, additional testing is needed to improve the performance in the citrus grove. It was

proposed that a control scheme, which used both machine vision and laser radar, may provide a

more robust guidance, as well as provide obstacle detection capability.









CHAPTER 6
SENSOR FUSION USING FUZZY LOGIC ENHANCED KALMAN FILTER FOR
AUTONOMOUS VEHICLE GUIDANCE

Introduction

There is a current need in the Florida citrus industry to automate citrus grove operations.

This need is due to reduction in the availability of labor, rising labor cost and potential

immigration challenges. Autonomous vehicles would be an important part of an automated citrus

grove. Autonomous vehicles can operate accurately for a longer duration of operation than when

using a human driver. In a scenario where an automated citrus harvester is in operation, the

operator can monitor both the harvester and the vehicle instead of being solely the driver. An

autonomous vehicle with the ability to navigate in the middle of the citrus alleyways with an

accuracy of 15 cm deviation from the center would be beneficial. Numerous autonomous vehicles

for agricultural applications have been described in the literature. The guidance system

developed by Noguchi et al. (2002) was for operation in paddy fields for transplanting rice

whereas the guidance system developed by Nagasaka et al. (2004) was tested in soybean fields.

The development of an autonomous vehicle guidance system for citrus grove navigation using

machine vision and laser radar (ladar) based guidance systems is discussed in Subramanian et al.

(2006). Both machine vision and ladar based guidance performed with a maximum error of

6.1cm in test paths. The machine vision based guidance system had also guided a tractor through

an alleyway of a citrus grove, keeping the vehicle visually in the center of the path with no

collisions with the trees. It was found that ladar based guidance was more accurate than vision

based guidance in well defined paths at speeds up to 3.1 m/s, whereas the vision based guidance

was able to keep the vehicle in the middle of the path in several types of paths, but was less accurate.

The uneven tree canopy in citrus grove alleyways poses problems for guidance when using vision

and ladar separately. With the vision based guidance configuration used in this research, for










example, with a forward facing camera, branches closer to the vehicle than the camera's field of

view are not accounted for in the vision based guidance; navigating locations near the end of the

row is not feasible with the vision system configuration as the last trees go out of the field of

view of the camera. The ladar mount configuration used in this research, with a wider field of view

and close range is useful for guidance in these situations. However, in a given alleyway, it is not

uncommon to find a few trees absent. In such cases, ladar based guidance gives erroneous paths.

These two guidance sensors provide complementary information for guidance. This paper

presents the research further undertaken to fuse the information from these sensors to make the

composite information from them more robust and make the overall guidance system more

reliable. The development of the Kalman filter for fusion is first discussed followed by a

description of the fuzzy logic system to augment the filter. Finally the experiments conducted to

test the performance of the fused guidance system are described.

Machine vision is a useful sensing methodology for guidance. One particular

implementation has been used in this research (Subramanian et al., 2006). The method is easily

adaptable from a custom designed test track to the citrus grove environment. Its long range aids

in faster decision making before a hard turn is to be performed or for faster obstacle detection

and is quite robust to minor variations in the path boundaries. However, it lacks the centimeter

level accuracy of laser radar in detecting the distance to close range path boundaries or obstacles.

This accuracy is useful in keeping the vehicle accurately positioned in the path. However, this

accuracy makes the information noisy if there are minor variations in the path boundaries. This

was a drawback in adapting the ladar based guidance to the citrus grove environment where the

foliage caused large variations in the data about the path boundary, obtained from the ladar.









Fusing the complementary information from these sensors, namely the long range information

from vision and short range accuracy of the ladar, would be beneficial.

Several methods for fusion have been reported in the literature. These include, but are not

limited to, neural networks (Davis and Stentz, 1995; Rasmussen, 2002), variations of Kalman

filter (Paul and Wan, 2005), statistical methods (Wu et al., 2002), behavior based methods (Gage

and Murphy, 2000), voting, fuzzy logic (Runkler et al., 1998) and combinations of these (Mobus

and Kolbe, 2004). The process of fusion might be performed at various levels ranging from

sensor level to decision level (Klein, 1999).

In our research, the tree canopy causes the sensor measurements to be fairly noisy. So the

choice of method was narrowed down to one that could help in reducing the noise and also

aid in fusing the information from the sensors. Methods such as neural networks, behavior based

methods, fuzzy logic and voting are not primarily used for reducing noise. Kalman

filtering is a widely used method for eliminating noisy measurements from sensor data and also

for sensor fusion. Kalman filter can be considered as a subset of the statistical methods because

of the use of statistical models for noise. Paul and Wan (2005) used two Kalman filters for

accurate state estimation and for terrain mapping for navigating a vehicle through unknown

environments. The state estimation process fused the information from three onboard sensors to

estimate the vehicle location. Simulated results showed feasibility of the method. Han et al.

(2002) used a Kalman filter to filter DGPS (Differential Global Positioning System) data for

improving positioning accuracy for parallel tracking applications. The Kalman filter smoothed

the data and reduced the cross tracking error. Based on the good results obtained in the previous

research for fusion and noise reduction, a Kalman filter was selected as the method to perform

the fusion and filter the noise in sensor measurements.









The use of a Kalman filter with fixed parameters has drawbacks. Divergence of the

estimates is a problem which is sometimes encountered with a Kalman filter, wherein the filter

continually tries to fit a wrong process. Divergence may be attributed to system modeling errors,

noise variances, ignored bias and computational round off errors (Fitzgerald, 1971). Another

problem in using a simple Kalman filter in our research is that the reliability of the information

from the sensors depends on the type of path in the grove. For example, while navigating in the

grove, if at a given position a tree is missing on either side of the path, the information

from the ladar is no longer useful and guidance has to rely on the information from vision alone.

Therefore, if a white noise model is assumed for the process and measurements, the reliability

factor of a sensor in the Kalman filter has to be constantly updated. Abdelnour et al. (1993) used

fuzzy logic in detecting and correcting the divergence. Sasladek and Wang (1999) used fuzzy

logic with an extended Kalman filter to tackle the problem of divergence for an autonomous

ground vehicle. The extended Kalman filter reduced the position and velocity error when the

filter diverged. The use of fuzzy logic also allowed a lower order state model to be used. For our

research, fuzzy logic is used in addition to Kalman filtering to overcome divergence and also to

update the reliability parameter in the filter. Sasladek and Wang (1999) used fuzzy logic to

reduce divergence in the Kalman filter. In this research, apart from using fuzzy logic to reduce

divergence, fuzzy logic is also used to update the usability of sensors in different environmental

conditions, as discussed in the sensor supervisor sub-section. Sasladek and Wang (1999) presented

simulation results of their method, while this research implements similar methods on a real

world system for citrus grove navigation application. Performance results are presented.

The main objective of this research is to fuse the information from machine vision and

laser radar using fuzzy logic enhanced Kalman filter to make the autonomous vehicle guidance










system more reliable in a citrus grove than when using the sensors individually. The following

specific objectives were chosen:

* Develop fuzzy logic enhanced Kalman filter fusion method.

* Confirm operation of the method by simulation.

* Implement the method on a tractor fitted with the required sensors.

* Test the navigation performance on custom designed test track.

* Test the navigation performance in citrus grove alleyways.

Materials and Methods

The vehicle developed by Subramanian et al. (2006) consisted of machine vision and laser

radar as the main guidance sensors. A common PC with a 2.4GHz P4 processor using Windows

XP operating system processed the information from the sensors and sent the error to a

microcontroller (Tern, CA, USA). The microcontroller controlled the steering using a PID

control system. Details of this controller are given in Subramanian et al. (2006). An encoder fed

back steering angle measurements to the microcontroller. A tri-axial inertial measurement unit

(IMU) (3DM, Microstrain, VT, USA) and an ultrasonic speed sensor (Trak-Star, Micro-Trak

Systems, Inc., MN, USA) have been added to the existing system to aid the guidance and sensor

fusion. The IMU provides the vehicle heading and the speed sensor provides the vehicle speed.

These help in estimating the error using the Kalman filter. Communication with both of these

sensors is through RS232 serial communication. The IMU was mounted inside the tractor cabin

on the roof just below the ladar and camera. The speed sensor was mounted below the tractor. A

special serial RS422 communication port was added to the PC to enable a ladar refresh rate of

30Hz. The architecture of the major components involved in the fusion based guidance system is

shown in Figure 6-1.



















s" Kilter


Steering


vision


Figure 6-1. Fusion based guidance system architecture

The vision and ladar algorithms are described in detail in Subramanian et al. (2006). The

vision algorithm uses color based segmentation of trees and ground followed by morphological

operations to clean the segmented image. The path center is determined as the center between the

tree boundaries. The error is calculated as the difference between the center of the image and the

path center identified in the image. The error is converted to real world distance using prior pixel

to distance calibration. To calculate the required heading of the vehicle, a line is fit for the path

center in the image representing the entire alleyway. The angle between this line and the image

center line is determined as the required heading for the vehicle to navigate the alleyway with

low lateral error. The ladar algorithm uses a distance threshold to differentiate objects from the

ground. An object of a minimum width is identified as a tree. The midpoint between the trees

identified on either side is determined as the path center. The errors measured using vision and

ladar are adjusted for tilt using the tilt information from the IMU. The information from the

sensors is used in the fusion algorithm and the resulting error in lateral position is passed to a

PID control system which controls the steering angle to reduce the error to zero.
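
As an illustration of this processing chain, the following minimal sketch (Python with NumPy; the helper names, the object threshold and the pixel-to-distance factor are illustrative assumptions, not the calibrated values used in this work) computes the lateral errors that are handed to the fusion algorithm:

    import numpy as np

    PIXELS_PER_CM = 0.5  # assumed pixel-to-distance calibration factor

    def vision_lateral_error(segmented_row):
        """Lateral error (cm) from one row of a segmented image (1 = tree, 0 = path)."""
        path = np.flatnonzero(segmented_row == 0)
        path_center = (path[0] + path[-1]) / 2.0       # center between tree boundaries
        image_center = (len(segmented_row) - 1) / 2.0
        return (path_center - image_center) / PIXELS_PER_CM

    def ladar_lateral_error(angles_deg, ranges_cm, object_thresh_cm=500.0):
        """Lateral error (cm) from one ladar scan: midpoint between the nearest
        left-side and right-side objects relative to the vehicle center line."""
        lateral = ranges_cm * np.sin(np.radians(angles_deg))  # lateral offsets
        hits = lateral[ranges_cm < object_thresh_cm]          # returns on objects
        left, right = hits[hits < 0].max(), hits[hits > 0].min()
        return (left + right) / 2.0

    # usage: err_cam = vision_lateral_error(row); err_ladar = ladar_lateral_error(a, r)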









Kalman Filter

A Kalman filter is used to fuse the information from vision, ladar, IMU and speed sensor.

The models and implementation equations for the Kalman filter (Zarchan, 2005) are:


















Figure 6-2. Vehicle in the path with the state vector variables

State transition model

The state transition model predicts the coordinates of the state vector X at time k+1 based

on information available at time k.

x(k+1) = A x(k) + w(k) (Eq. 6-1)

where k denotes a time instant, x(k) ∈ R^4 denotes a state vector composed of the lateral

position error of the vehicle in the path, which is the difference between the desired lateral

position and the actual lateral position, denoted by d(k) ∈ R, the current vehicle heading, denoted

by θ(k) ∈ R, the required vehicle heading, denoted by θR(k) ∈ R, and the vehicle linear velocity,

denoted by v(k) ∈ R. It is to be noted that only the change in heading is used for estimating the

position error, which affects the overall guidance. The absolute heading is included in the filter

to remove noise from the IMU measurements. Figure 6-2 shows the vehicle in the path with the

state vector variables.










x(k) = [d(k) θ(k) θR(k) v(k)]^T

The state transition matrix, A(k) ∈ R^4x4, is defined as

    | 1  0  0  t*sin(Δθ) |
A = | 0  1  0  0         |
    | 0  0  1  0         |
    | 0  0  0  1         |

where t denotes a time step and w denotes the process noise which is assumed to be a

Gaussian with zero mean. Δθ denotes the change in heading and this value is calculated from the

filtered IMU measurements for each time step k. The use of a variable sensor input such as Δθ

in the state transition matrix is consistent with the method described in Kobayashi et al. (1998).

The state estimate error covariance matrix P ∈ R^4x4 gives a measure of the estimated

accuracy of the state estimate x(k+1).

P(k+1) = A P(k) A^T + Q(k) (Eq. 6-2)

where Q ∈ R^4x4 denotes the covariance matrix for the process noise w.

The estimation process is difficult to directly observe, hence the process noise covariance

is uncertain. Therefore the initial variance values, which are the diagonal elements in the matrix,

were assumed as follows: variance for process position noise 2cm, variance for process current

heading noise 0.01 degree, variance for required heading noise 0.01 degree and variance for

process velocity noise 10^-4 m/s. These values were intuitively assumed to be the amount of noise

variance that the process could add. As discussed further in this paper, the process noise

covariance matrix Q is updated at each step of the operation of the Kalman filter. Therefore, the

initial assumptions of the matrix values do not significantly affect the overall guidance of the

vehicle. These sensor measurements were assumed to be independent of the other sensors. This

is done for simplicity and later it is observed from simulations and experiments that the guidance









of the vehicle, with this assumption, is accurate enough to keep the vehicle close to the alleyway

center. Hence the covariance values, which are the off-diagonal elements, are zero.

    | qx  0   0    0  |   | 2  0     0     0     |
Q = | 0   qθ  0    0  | = | 0  0.01  0     0     |
    | 0   0   qθR  0  |   | 0  0     0.01  0     |
    | 0   0   0    qv |   | 0  0     0     10^-4 |

Measurement model

This model defines the relationship between the state vector, X and the measurements

processed by the filter, z.

z(k) = H x(k) + u(k) (Eq. 6-3)

where z(k) ∈ R^5 denotes a measurement vector composed of measured values of the lateral position

error of the vehicle in the path from the vision algorithm, denoted by xC(k) ∈ R, the lateral position

error of the vehicle in the path from the ladar algorithm, denoted by xL(k) ∈ R, the required heading

of the vehicle determined from the vision algorithm, denoted by θC(k) ∈ R, the current heading of

the vehicle measured using the IMU, denoted by θIMU(k) ∈ R, and the vehicle linear velocity

measured using the speed sensor, denoted by v(k) ∈ R.

z(k) = [xC(k) xL(k) θC(k) θIMU(k) v(k)]^T

The observation matrix, H(k), is defined as

    | 1  0  0  0 |
    | 1  0  0  0 |
H = | 0  1  0  0 |
    | 0  0  1  0 |
    | 0  0  0  1 |

The observation matrix relates the state vector x with the measurement vector z. The first

two rows of H relate the lateral position errors from vision and ladar, the third and fourth rows

relate the angles from vision and IMU and the last row relates the velocity. The measurement









noise, denoted by u(k), is assumed to be Gaussian with zero mean, with standard deviations

estimated experimentally with the different sensors.

Filter gain

The filter gain G is the factor used to minimize the a posteriori error covariance P.

G(k) = P(k+1) H^T (H P(k+1) H^T + R)^-1 (Eq. 6-4)

where R = measurement noise covariance matrix

To determine R, the vehicle was set stationary in the middle of the path, each sensor

(except the speed sensor) mounted on the vehicle was turned on independently and the

information from the sensors was collected for 30 seconds. For the camera, the vision algorithm

(Subramanian et al., 2006) was run for 30 seconds and the path error and the heading angle

determined was recorded. For the ladar, the ladar path segmenting algorithm (Subramanian et al.,

2006) was run for 30 seconds and the path error measurements were recorded. For the IMU, the

sensor was run for 30 seconds and the heading angle measured by the sensor was recorded. For

the speed sensor, the vehicle was run at 3 different constant speeds and the speed sensor

measurement was collected for 30 seconds. The measurement noise covariance matrix, denoted

by R, was determined from these recorded measurements and was found to be

    | σxC²  0     0     0       0   |   | 1.07  0     0       0       0 |
    | 0     σxL²  0     0       0   |   | 0     0.15  0       0       0 |
R = | 0     0     σθC²  0       0   | = | 0     0     0.0017  0       0 |
    | 0     0     0     σθIMU²  0   |   | 0     0     0       0.0001  0 |
    | 0     0     0     0       σv² |   | 0     0     0       0       0 |

where the variance of the error in lateral position of the vehicle, denoted by σxC², was

determined from the vision algorithm measurements, the variance of the error in lateral position

of the vehicle, denoted by σxL², was determined from the ladar algorithm measurements, the variance






































of the error in heading, denoted by σθC², was determined from the vision algorithm, the variance of

the error in heading, denoted by σθIMU², was determined from the IMU measurements, and the

variance of the error in velocity, denoted by σv², was determined from the speed sensor

measurements. xC and θC are measurements from the same camera. However, the vision

algorithm uses different regions of the image to estimate them. xC, xL, θIMU and v are

measurements from different sensors operating independently of each other. These measurements

were assumed to be statistically independent of each other. Hence the covariance values are zero. The

variance of the velocity measurements was zero. This can be attributed to the low resolution

(1mph) of the speed sensor.

Time update (predict):
1) x(k+1) = A x(k)
2) P(k+1) = A P(k) A^T + Q

Measurement update (correct):
1) G(k) = P(k+1) H^T (H P(k+1) H^T + R)^-1
2) x(k+1) = x(k+1) + G(k) (z(k) - H x(k+1))
3) P(k+1) = (I - G(k) H) P(k+1)

The filter is initialized with x(k), P(k) and Q.

Figure 6-3. Kalman filter operation

Figure 6-3 shows the operation of the Kalman filter. The filter estimates the process state x

at some time k+1 and estimates the covariance P of the error in the estimate as shown in the time

update block. The filter then obtains the feedback from the measurement z. Using the filter gain

G and the measurement z, it updates the state x and the error covariance P as shown in the

measurement update block. This process is repeated as new measurements come in and the error

in estimation is continuously reduced.
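
To make the cycle concrete, the following minimal sketch (Python with NumPy) implements one iteration of this filter. The matrix values follow the text above, but the loop structure, the variable names and the small value substituted for the zero speed variance are assumptions of the sketch:

    import numpy as np

    dt = 1.0 / 30.0                       # assumed time step (30Hz ladar refresh)
    Q = np.diag([2.0, 0.01, 0.01, 1e-4])  # process noise, updated online by fuzzy logic
    R = np.diag([1.07, 0.15, 0.0017, 0.0001, 1e-6])  # measurement noise; a small
                                          # value stands in for the zero speed variance
    H = np.array([[1, 0, 0, 0],
                  [1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)

    def kalman_step(x, P, z, d_theta):
        """One time/measurement update; x = [d, theta, thetaR, v], z has 5 entries."""
        A = np.eye(4)
        A[0, 3] = dt * np.sin(d_theta)    # lateral error changes by v*t*sin(dTheta)
        x = A @ x                         # time update (predict)
        P = A @ P @ A.T + Q
        G = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # measurement update
        x = x + G @ (z - H @ x)
        P = (np.eye(4) - G @ H) @ P
        return x, P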










Reliability Factor of Primary Guidance Sensors in the Kalman Filter and Fuzzy Logic
Sensor Supervisor

As discussed previously, ladar is accurate at short range for the given mounting

arrangement and machine vision is good at providing overall heading of the vehicle. By

experimentation in a citrus grove, the following observations were made: tree foliage is highly

variable, trees can be absent on either or both sides of the path and some trees can be small

enough for ladar to not recognize them as trees. In such a variable path condition, it is not

feasible to have constant noise values for the ladar and vision. For example, when the ladar does

not recognize trees on either side, guidance is more reliable when the process noise for the ladar

is increased to a high value such that vision takes over as the only guidance sensor. Similarly, when

the vehicle gets to the end of a tree row, vision is not useful anymore and so ladar

can be made the sole guidance sensor to cross the last tree. A fuzzy logic system was used to

decide the reliability of the sensor. The reliability is changed in the Kalman filter by changing

the measurement noise covariance matrix R. This is discussed in the following section.

A fuzzy logic based supervisor is used to decide which sensor is more reliable at different

locations in the grove alleyway. The input to the fuzzy logic supervisor is the horizontal distance

of the vehicle center line from the trees on either side of the vehicle. Both vision and ladar input

their corresponding distance values. Altogether, there are four input values: vision left tree

distance, vision right tree distance, ladar left tree distance and ladar right tree distance. These

inputs are divided into three linguistic variables: reasonable, unreasonable and zero (Figure 6-4).



Figure 6-4. Fuzzy logic for sensor supervisor. A) Fuzzy logic architecture. B) Membership
functions.

The input membership function relates the input x to the linguistic variables reasonable,

unreasonable and zero (Figure 6-4 B). The meaning of the linguistic variables is literally whether

the distance from the tree row is reasonable or not. A zero membership for the ladar indicates the

presence of an obstacle in the path and, for vision, the end of the row. x takes the value of zero for 0

to 1m, 1m to 3m for reasonable and any value greater than 2m for unreasonable. The ranges were

intuitively chosen to be the possible values. Normalization of the linguistic variables was

performed at a scale from 0 to 1. During the presence of an obstacle, the ladar takes precedence

till the obstacle is avoided. The rule set for the supervisor is shown in Table 6-1. Each rule

received a corresponding normalized membership value from the input membership function x.

Based on the linguistic variables chosen from each x, the corresponding rules are fired. From the

rules fired for a given instant, the output membership function is chosen. The membership value

was used to find the crisp value for the output. The output membership function decision is

shown in Figure 6-4 B, which relates the rules fired to the linguistic variables vision, ladar and

both. These variables represent the decision made as to whether vision or ladar is given higher

priority or if both should be given equal priority. Defuzzification was done using the center of

gravity method. The crisp value of the decision is taken as the measurement noise covariance

value for that sensor. For example, if a decision of ladar is obtained with a crisp value of 1, σxC²

is updated to 1 and σxL² is chosen as 0. If a decision of both is made with a crisp value of -0.5,

σxC² is chosen as 0.5 and σxL² is chosen as (-0.5/-1)*0.2 = 0.1. The positive or negative values

only indicate whether the decision is vision or ladar. Only the absolute values are used to update

R. The range of crisp values for the decision was chosen by performing simulations and

determining what values provide good results.

Table 6-1. Fuzzy logic rule set for sensor supervisor
Vision Left/Right  Ladar Left/Right         Decision
Reasonable         Reasonable               Both
Reasonable         Reasonable/Zero          Ladar higher
Reasonable         Zero                     Stop
Reasonable         Unreasonable             Vision
Reasonable         Unreasonable/Zero        Ladar
Reasonable         Reasonable/Unreasonable  Vision higher
Unreasonable       Unreasonable             Ladar
Unreasonable       Reasonable/Zero          Ladar higher
Zero               Reasonable               Ladar
Zero               Unreasonable/Zero        Ladar
Zero               Reasonable/Unreasonable  Both
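
The following sketch (Python) illustrates how such a supervisor can map the four tree distances to updated measurement noise variances. A crisp rule lookup stands in for the full fuzzification/defuzzification pipeline, so the membership evaluation and the exact output values are simplifying assumptions:

    def classify(dist_m):
        """Linguistic variable for one tree distance (crisp simplification)."""
        if dist_m < 1.0:
            return "zero"            # obstacle (ladar) or end of row (vision)
        if dist_m <= 3.0:
            return "reasonable"
        return "unreasonable"

    def supervise(vision_l, vision_r, ladar_l, ladar_r):
        """Return (sigma2_vision, sigma2_ladar) for R, in the spirit of Table 6-1."""
        vis = {classify(vision_l), classify(vision_r)}
        lad = {classify(ladar_l), classify(ladar_r)}
        if lad == {"zero"} and vis == {"reasonable"}:
            return None                        # obstacle in the path: stop
        if vis <= {"unreasonable", "zero"}:
            return 1.0, 0.0                    # rely on ladar alone
        if lad == {"unreasonable"}:
            return 0.0, 1.0                    # rely on vision alone
        if vis == {"reasonable"} and lad == {"reasonable"}:
            return 0.5, 0.1                    # both usable; ladar weighted higher
        return 0.5, 0.5                        # default: equal priority

    # usage: sigmas = supervise(2.1, 1.9, 1.8, 2.2)   # -> (0.5, 0.1)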


Divergence Detection and Fuzzy Logic Correction

The innovation vector z'(k) (Klein, 1999), is defined as the difference between the

measured vector and the estimation of the measurement vector given by equation 6-5.

z'(k) = z(k) - H x(k) (Eq. 6-5)

For an optimum Kalman filter, the innovation vector is a zero mean Gaussian white noise

(Abdelnour et al., 1993). Therefore, the performance of the Kalman filter can be monitored

using the value of the innovation vector. Deviation of z' from zero by more than 5% was taken

as the indication of reduction in performance of the filter leading towards divergence.

Divergence could be corrected by updating the process noise covariance matrix Q,

depending on the innovation sequence. Fuzzy logic (Passino and Yurkovich, 1998; Subramanian










et al., 2005) was used to perform this for two parameters, qx and qθR. The fuzzy logic

architecture and membership functions are shown in Figure 6-5.





Figure 6-5. Fuzzy logic for correcting divergence. A) Fuzzy logic architecture. B) Membership
functions for updating Q

The xC, xL and θC parameters of the innovation vector z' are the input to the fuzzy logic

system. The ranges of the input are from -15% to 15%. These values were chosen by trial and

error by performing simulation. Normalization of the linguistic variables was performed at a

scale from 0 to 1. Based on the above argument for correction, a set of if-then rules were

developed by experimentation. Each rule received a corresponding normalized membership

value from the input membership function z'. Based on the linguistic variables chosen from each

z', the corresponding rules are fired. The rule set is shown in Table 6-2. For example, consider

the second row in the table, where z'x is negative and z'θC is positive; this indicates that the error

in position has reduced compared to the previous time step but the vehicle is heading in the

wrong direction, then the process noise for position qx is zero and the process noise for heading

qθR is positive. In the rule set, a single measurement value for position, z'x, is given instead of










the two different values, z'xC and z'xL. This is because the sensor supervisor described earlier,

picks one of the sensors for use in the filter when the other is unreasonable. In the cases where

one sensor is given higher priority, the position measurement of that sensor alone is used in the

rule set, and in the cases where both sensors are selected by the supervisor, the position

measurement of ladar alone is used as it has lower measurement noise. This was done to reduce

the complexity of the rule set.

Table 6-2. Fuzzy logic rule set for divergence correction
z'x       z'θC      qx        qθR
Negative  Negative  Zero      Zero
Negative  Positive  Zero      Positive
Positive  Negative  Positive  Zero
Positive  Positive  Positive  Positive

From the rules fired for a given instant, the output membership function is chosen. The

membership value was used to find the crisp value for the output. Defuzzification was done

using the center of gravity method (Passino and Yurkovich, 1998). The crisp values of the

process noise covariance are then updated in the process noise covariance matrix Q. The range of

crisp values was chosen by trial and error by performing simulations and determining what

values provide good results.
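
A minimal sketch of this correction step is shown below (Python). The 5% innovation test and the sign logic follow the text and Table 6-2, while the normalization scheme and the crisp output magnitude are assumptions:

    def correct_divergence(innov_x, innov_theta, q, threshold=0.05, q_high=4.0):
        """Update process noise entries q = {'x': .., 'thetaR': ..} from the
        innovations, given as fractions of their measurement ranges."""
        if abs(innov_x) <= threshold and abs(innov_theta) <= threshold:
            return q                   # filter behaving optimally; leave Q alone
        # Table 6-2: raise the noise on whichever quantity the filter misfits
        q['x'] = q_high if innov_x > 0 else 0.0
        q['thetaR'] = q_high if innov_theta > 0 else 0.0
        return q

    # usage: q = correct_divergence(0.08, -0.02, {'x': 2.0, 'thetaR': 0.01})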

For the parameters qθC and qv, a linear relationship was followed such that qθC and qv

were increased in proportion to the innovations z' of θC and v, respectively. The speed of the vehicle obtained

from the speed sensor was used to scale the output to the steering control. Higher speed requires

more turn and vice versa. The transfer function between the required lateral displacement of the

vehicle, to reduce the error to zero as determined from the fusion algorithm, and the steering

angle is given by equation 6-6 and the gains were determined experimentally for speeds 1.8m/s,

3.1m/s and 4.4m/s (Subramanian et al., 2006).










G(s) (Eq. 6-6)


To determine the steering angle at different speeds, the steering angle at 4.4m/s was scaled

using equation 6-7. The scaling of the gain, G(s), for 4.4m/s produced the least error at different

speeds compared to the gains for 1.8m/s and 3.1m/s.

Steering angle = Gain * Steering angle at 4.4m/s (Eq. 6-7)

where the gain is a function of the measured vehicle speed.

Simulation

The developed fuzzy-Kalman filter algorithm was simulated in MATLAB to check the

functionality and accuracy. The data required for simulation was obtained by manually driving

the vehicle through a citrus grove alleyway and collecting all the sensor information. Figure 6-6

shows the simulated result of fusing the error obtained from vision and ladar algorithms. The

error is the distance between the center of the vehicle in the path and the path center. The result

is shown for a section of the path from 16.2m to 16.8m of travel for clarity. It is observed that

from 16.2m to 16.35m, only vision provides reasonable error and ladar does not. This may be

attributed to the fact that ladar sweep at that location might have been in a gap between two

consecutive trees. Hence, for the first 15cm distance, the fused data seems to have relied mainly

on vision. For the 15.35m to 15.55m section, error from both vision and ladar algorithms seem

reasonable. Hence, fusion takes both data into account. However, the fused data relies more on

ladar information as the process noise is less for ladar. For the last 25cm of the section from

16.55m to 16.8m, both vision and ladar determine the same amount of error.





















Figure 6-6. Simulation result of fusion of error obtained from vision and ladar algorithms

Experimental Procedure

Typical citrus grove alleyways are practically straight rows. It was felt that testing the

vehicle on curved paths at different speeds would be a more rigorous test of the guidance

system's performance than directly testing in straight grove alleyways. This test represents the

ability of the guidance system to navigate more difficult conditions than the ones likely to be

encountered in the grove. For this purpose, a flexible wall track was constructed using hay bales

to form an S-shaped test path. This shape provides both straight and curved conditions. The path

width varied from 3m to 4.5m and the total path length was approximately 53m. A gap of

approximately 1m was left between successive hay bales. The gaps mimic missing trees in a

citrus grove alleyway. The profile of the hay bale track is shown in Figure 6-7. The hay bales

provide a physically measurable barrier, which aids in operating the ladar guidance system and

color contrast with grass which is useful for vision based guidance. In addition, when testing a

large tractor, minor losses would occur if the tractor runs over a barrier of hay bales.


















Figure 6-7. Hay bale track profile. A) Dimensions. B) Photo.

Experiments were conducted to check the robustness and accuracy of the fused guidance

system in the test path at 1.8m/s and 3.1m/s. The vehicle was manually navigated to the starting

position at one end of the test path. Once positioned, the vehicle was started in autonomous

control and allowed to navigate down the path to the other end thereby navigating a distance of

53m. Three repetitions were performed for each vehicle speed to ensure replication resulting in a

total of six trial runs. The path traveled was marked on the soil by a marking wheel attached to

the rear of the tractor. The error measurements, along the vehicle centerline, were taken

manually using a tape measure at regular intervals of 1m. This interval was chosen to get

enough samples to correctly estimate the path error. Measurements at smaller intervals were

redundant.

After achieving satisfactory performance in the test path, wherein the vehicle stayed close to the center of

the path without hitting the hay bales, tests were conducted in orange grove

alleyways at the Pine Acres experimental station of the University of Florida. The average

alleyway path width was approximately 3.5m and the length of the alleyway was approximately

110m with about 30 trees on either side. A typical alleyway is shown in Figure 6-8, where the

approximate tree height is 1.5m to 2m. Figure 6-9 shows the field of view of the camera and the

ladar when navigating the citrus grove alleyway.




















Figure 6-8. Citrus grove alleyways


Figure 6-9. Field of View (FOV) of camera and ladar. A) Side view. B) Top/Front view.

Experiments were conducted at speeds of 1.8m/s and 3.1m/s along an alleyway. Three

repetition runs were conducted for each speed. For measuring the error, markers were laid on

the ground at the path center at approximately 4m intervals along the alleyway. The test path

with harder turns required a sampling interval of 1m whereas the grove alleyway was a relatively

straight path and therefore a sampling interval of 4m was sufficient. A rear facing camera was

mounted on top of the tractor cabin. The camera's field of view included the area just behind the

rear of the tractor. Pixel to horizontal distance information was calibrated. While conducting the

experiments, the video of the ground behind the tractor was collected using this camera. The

error in traversing the alleyway was determined based on the position of the markers at different










instances in the video. The boundary of the path, for determining the path center in both the test

track and in the grove, was determined manually by visual approximation of the boundary.

Results and Discussion

From the experiments, RMS error, average error, maximum error and standard deviation of

error were calculated over the entire path by averaging the data over the three repetitions at each

speed. The path error, which is the deviation from the center line of the alleyway, was plotted

over the length of the test path for the different speeds. The positive and negative values of the

error in the plot are only an indication of the error being on the right side or left side of the path

center. For calculating the performance measures, the absolute values of the error were used.

Performance measures obtained from the experiments conducted in the test track are shown in

Table 6-3 and the corresponding path error is shown in Figure 6-10. Visually, the vehicle

navigated in the middle of the path as well as a human would. The average error in both cases is

less than 2cm from the path center. The measures are lower for the lower speed of 1.8 m/s than at

3.1m/s. The maximum error at both speeds is less than 4cm in the path width of at least 3m. This

is an error of only 1% of the path width. From Figure 6-10 A, it can be observed that most of the

time, the error is less than 2cm, and only in a few instances does the error go over 2cm.

The results of running vision and ladar based guidance separately in similar but not identical test

tracks are reported in Subramanian and Burks (2005). However, the path used in the present

research is more complex. At a speed of 3.1m/s, an average error of 2.5 cm was reported while

using ladar based guidance, whereas guidance based on fusion has an average error of 1.9 cm at

the same speed. The maximum error is also reduced from 5 cm reported for vision based

guidance at 3.1 m/s to 4 cm using fusion based guidance. Compared to those methods of

individual sensor guidance, navigation using sensor fusion resulted in better performance in

staying closer to the path center.









As directly observed from the simulations, the reduction of noise also significantly played

a role in the improvement of performance. This is in agreement with what is reported in Han et

al. (2002), that the use of a Kalman filter reduced the noise and improved tracking performance for

DGPS based parallel tracking. Kobayashi et al. (1998) report high accuracy by using a fuzzy

logic Kalman filter based fusion method using DGPS, in accordance with the results reported in

this paper. Sasladek and Wang (1999) used fuzzy logic for tuning the Kalman filter with a DGPS

receiver as the primary sensor and report that the simulation results showed prevention of

divergence and improved accuracy in position and velocity. Comparatively, this research used

vision and ladar as sensors and prevented divergence using fuzzy logic for tuning. In addition,

fuzzy logic was also used for picking the more accurate sensor and improved performance was

experimentally verified for a citrus grove application. Compared to much of the similar research

reported in the literature, this research has significant improvements and differences. Whereas many

studies use a DGPS receiver as the primary sensor for their application, this research uses

vision and ladar as primary sensors for guidance as well as for fusion. Secondly, this research

uses fuzzy logic for tuning or divergence correction as well as for adaptively choosing which

sensor is to be relied upon for fusion at different locations in the path, whereas most previous

research uses fuzzy logic exclusively for tuning or divergence detection. This method allows for

switching between important sensors useful in specific areas. Further, previous research in

agriculture has mainly been for crop applications, whereas this research is for citrus grove

operations.










Table 6-3. Performance measures obtained from the experiments conducted in the test track


Figure 6-10. Path navigation error of the vehicle in the test track. A) 1.8m/s. B) 3.1m/s.

Performance measures obtained from the experiments conducted in the citrus grove

alleyway are shown in Table 6-4 and the error plot is shown in Figure 6-11. The average error at

the two speeds is less than 10cm and the maximum error is less than 22cm for an average










alleyway width of 400cm. This is about 6% of the alleyway width. From the error plot, it can be

observed that the error is less than 15cm most of the time. Based on the performance measures,

the guidance seems less accurate in the relatively straight grove than in the complex test path.

However, it should be noted that the path boundary conditions in the grove are much more

variable than in the test path. In the test track, the path boundary is relatively even without much

variation in the location between successive hay bales. In the citrus grove, the tree canopy size

varies significantly from one tree to the next, the variation being as low as 5 cm to as high as

50cm. The path boundary which is the tree canopy edge, used for error measurement, was

decided by the person conducting the experiment and it is not practical to accurately state a tree

canopy boundary. The larger errors in the citrus grove can be largely attributed to this boundary

variability rather than the navigation performance of the vehicle. These high errors are still

acceptable for navigating sufficiently close to the middle of the alleyway without hitting any tree

branches. Visually, the navigation in the alleyway was as good as human driving and as good as

navigation in the test track. Overall, assuming a human driver as a reference of performance,

navigation in the citrus grove was as good as human driving with reasonable errors and confirms

the good performance of fusion based guidance.

Table 6-4. Performance measures obtained from the experiments conducted in the grove alleyway
Speed (m/s)  Average Error (cm)  Standard Deviation (cm)  Maximum Error (cm)  RMS Error (cm)
1.8          7.6                 4.1                      18                  8.6
3.1          9.4                 4.4                      22                  10.3














Figure 6-11. Path navigation error of the vehicle in the grove alleyway. A) 1.8m/s. B) 3.1m/s.


Conclusions


A sensor fusion system using fuzzy logic enhanced Kalman filter was designed to fuse the


information from machine vision, ladar, IMU and speedometer. Simulations were performed to


confirm the operation of the method. The method was then implemented on a commercial


tractor. The fusion based guidance system was first tested in custom designed test tracks. Tests


were conducted for two different speeds of 1.8m/s and 3.1m/s. An average error of 1.9cm at










3.1m/s and an average error of 1.5cm at 1.8m/s were observed in the tests. The maximum error

was not more than 4cm at both speeds. Tests were then conducted at the same speeds in a citrus

grove alleyway. The average error was less than 10cm for both speeds. The guidance system's

ability to navigate was verified in citrus grove alleyways and was found to perform well with the

vehicle remaining close to the center of the alleyway at all times. The developed fusion based

guidance system was found to be more reliable than individual sensor based guidance systems

for navigating citrus grove alleyways. The present system is able to reliably traverse through an

alleyway of a citrus grove. Additional work is necessary before the system can be ready for

production. Some of the future work would include adding the ability to navigate the headlands

and the ability to optimally navigate the entire grove.









CHAPTER 7
HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A
CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR

Introduction

There is a need to automate grove operations in the Florida citrus industry and autonomous

vehicles form an integral part of this automation process. An autonomous vehicle can be used for

applications such as scouting, mowing, spraying, disease detection and harvesting. The

development of autonomous vehicle guidance systems using machine vision and laser radar for

navigating citrus grove alleyways was discussed in Subramanian et al. (2006). This paper

discusses the headland navigation algorithms.

Complete navigation of a citrus grove requires the vehicle to turn at headlands in addition

to navigating the alleyways. Headland in citrus groves is the space available at the end of each

tree row. This space is generally used for turning the vehicles navigating the rows. Headlands are

characterized by a ground space of at least 4.25m after the last tree in each row. The ground is

usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of

the grove or there could be more open ground or some combination of both. An autonomous

vehicle operating in citrus groves needs to navigate the headlands to turn and navigate other

rows. Miller et al. (2004) have reported that the space required to turn at the headland and

the time spent in turning, significantly affect the field efficiency. Hague et al. used vision based

guidance in combination with dead reckoning to guide a vehicle through crop rows. In this paper,

turning maneuvers to navigate the headlands of a citrus grove are discussed.

Materials and Methods

The vehicle used for developing the headland turning algorithms was the John Deere

eGator. The sensors used for developing the headland navigation algorithms are the camera, the

ladar, the IMU and the wheel encoder. The details of these sensors and their mounting location









were described earlier. The camera is mounted looking down at an angle of 10 degrees with the

horizontal and the ladar is mounted looking down at an angle of 15 degrees with the horizontal.

The camera, ladar and the IMU are interfaced with the 2.4GHz CPU running Windows operating

system. This is the same computer that was used on the tractor. All high level algorithms for

processing guidance sensor information and guidance algorithms are implemented in this

shoebox computer. The speed control and the steering control are implemented in the PC104

computer described earlier. The steering control system used in the autonomous steering of the

tractor was the PID control, in which lateral position error of the vehicle was given as input to

the controller and steering was suitably controlled to minimize the error. In alleyway navigation,

the specification of the lateral position of the vehicle in the path can be described as the error of the

vehicle from the path center. For headland navigation, there is no specified path that the vehicle

must take. Therefore it is not practical to specify a path error in all conditions. Also, for the

development of classical controllers such as PID, a mathematical model of the system is useful

for tuning the controller parameters. For the tractor, this model was experimentally determined.

Development of a similar model for the eGator is time consuming. Therefore, a new type of

steering controller was explored which does not require error specification or a mathematical

model: Pure Pursuit steering.

Pure Pursuit Steering

Pure Pursuit steering mechanism has been used for many years in the Carnegie Mellon

NavLab autonomous vehicle projects and has been reported to be robust. The method is simple,

intuitive and easy to implement. It is analogous to human driving in that humans look a

certain distance ahead of the vehicle and steer such that the vehicle would reach a point at the

look-ahead distance. Pure

Pursuit is a method for geometrically calculating the arc necessary for getting a vehicle onto a










path (Coulter, 1992). Consider the illustration of the vehicle with the X-Y coordinate system

shown in the Figure 7-1.




Figure 7-1. Pure Pursuit steering method

Let the origin of the coordinate system be the center of the vehicle. Let (p, q) be a point

some distance ahead of the vehicle. Let an arc join the origin and the point (p, q). Let L be the

length of the chord of the arc connecting the origin to the point (p, q) and let R be the radius of

curvature of the arc. Let R be expressed as a sum of distance p and some distance D.

R = D + p => D = R - p (Eq. 7-1)

L, p and q form a right triangle. Using the Pythagorean theorem for a right triangle,

p^2 + q^2 = L^2 (Eq. 7-2)

q, R and D form a right triangle. Using the Pythagorean theorem,

q^2 + D^2 = R^2 (Eq. 7-3)

=> q^2 + (R - p)^2 = R^2

Simplifying this equation,

R = L^2 / 2p = p/2 + q^2 / 2p (Eq. 7-4)










As this method is analogous to human driving, we can choose a predetermined look-ahead

distance 'p'. For this project, the look-ahead distance was chosen to be 2.89m from the front of

the vehicle. For alleyway navigation, the error of the vehicle from the path center is 'q' at the

look-ahead distance 'p'. For non-alleyway navigation, 'q' can be the required lateral movement of

the vehicle at the look-ahead distance. This value of 'q' is obtained from the machine vision and

ladar algorithms for path segmentation. Using these values of 'p' and 'q', R can be calculated

from the above equation. R is the required radius of curvature of the vehicle to get the vehicle to

the required point. In the CAN based steering system of the e-Gator, steering has to be specified

as steering angles and not as radius. Therefore, R has to be converted to steering angle.

A calibration experiment was conducted to find a relationship between the radius of

curvature and the steering angle. In the experiment, the eGator was taken to an open area.

Steering angle was kept constant by sending CAN commands to the steering controller. The

vehicle was moved forward by manually stepping on the accelerator. For a constant steering

angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner

wheel radius at different locations. The diameter of the circle was manually measured using a

tape measure. The steering angle was varied to obtain different radii of curvature. The

illustration of the experiment is shown in the Figure 7-2 and the experimental results are shown

in the Table 7-1.











Figure 7-2. Illustration of experiment for steering angle calibration with radius of curvature










Table 7-1. Calibration of radius of curvature of vehicle with steering angle
                   Steering Angle (Degrees)
                   10     15     20     25     30
Radius (m)  Left   10.6   9.9    7.46   5.8    4.4
            Right  7.9    6.4    5.33   4.6    3.8




Experiments were conducted for steering angles from 10 degree to 30 degrees at 5 degree

intervals. The lowest angle chosen was 10 degrees because smaller angles had a much larger radius

that caused the vehicle to travel out of the experimental area. The maximum angle of 30 degrees

is the physical limit of the vehicle's steering system. Experiments were conducted for both left

and right turns. The experimental values were plotted in the chart shown below.




Figure 7-3. Calibration curve between radius of curvature of the vehicle and the steering angle










From the chart, linear equations were fit to calibrate steering angle with radius of

curvature. The equations are:

Left turn: Radius = -0.33 * Steering angle + 14.232 (Eq. 7-5)

Right turn: Radius = -0.2 * Steering angle + 9.606 (Eq. 7-6)

Rearranging the terms,

Left turn: Steering angle = -3.03 * Radius + 43.12 (Eq. 7-7)

Right turn: Steering angle = -5 * Radius + 48.03 (Eq. 7-8)

Using these calibration equations, the steering angle required for various radii of

curvature was determined. The steering angles were sent to the steering controller. It is observed

that the minimum radius of curvature of the vehicle was 4m. This minimum value limits the

smallest turn that the vehicle can execute in the headlands. The flowchart of the Pure Pursuit

steering algorithm is shown in the Figure 7-4.


Obtain required vehicle
offset from vision or ladar




Use look-ahead distance and
offset to calculate radius of
curvature



Use calibration equations to determine
steering angle from radius of curvature




Send steering angle commands
to steering controller



Figure 7-4. Flowchart of Pure Pursuit steering algorithm
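
The following sketch (Python) walks through this flowchart. The look-ahead distance and the calibration constants come from the text; the clamping to the calibrated 10-30 degree range and the use of the lateral offset in the denominator of the chord relation (the standard Coulter (1992) formulation) are choices of this sketch rather than a record of the implemented code:

    LOOK_AHEAD_M = 2.89    # look-ahead distance from the front of the vehicle

    def pure_pursuit_steering(offset_m):
        """Steering angle (deg, negative = left) that moves the vehicle onto a
        point LOOK_AHEAD_M ahead and offset_m to the side."""
        if abs(offset_m) < 1e-6:
            return 0.0                               # already heading at the point
        chord_sq = LOOK_AHEAD_M**2 + offset_m**2     # L^2, as in Eq. 7-2
        radius = chord_sq / (2.0 * abs(offset_m))    # radius of the joining arc
        if offset_m < 0:
            angle = -3.03 * radius + 43.12           # left turn (Eq. 7-7)
        else:
            angle = -5.0 * radius + 48.03            # right turn (Eq. 7-8)
        angle = max(0.0, min(angle, 30.0))           # calibrated range is 10-30 deg
        return -angle if offset_m < 0 else angle

    # usage: send pure_pursuit_steering(q) to the CAN steering controller each cycle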









Headland Turning Maneuvers

As mentioned before, the amount of time spent and the area required by the vehicle to turn

at the headlands are important. A headland turning maneuver can be performed as a series of the

following steps: 1) Determining the approach of the end of current row while navigating the

alleyway. 2) Precisely establishing the end of the row. 3) Turning into the headland. 4) Turning

into the next row. 5) Getting to a position to navigate the next row.

Determining the approach of the headland

The first step of determining the approach of the headland while navigating the alleyway is

performed using machine vision. Consider the Figure 7-5 which shows the image obtained from

the camera when the vehicle was in the middle of the alleyway (a) and the image obtained from

the camera when the vehicle was close to the end of the alleyway (b),(c).






















Figure 7-5. Images obtained from the camera when the vehicle is navigating the alleyway









The segmentation algorithm, used originally for path determination, is also applied for headland

navigation; it segments the entire image into trees and pathway. When the segmentation algorithm

is run on the above images, the resulting images are as shown below.












A B















Figure 7-6. Segmented images

It is observed from the segmented images that when the vehicle is close to the end of the

path, the upper half of the segmented image is predominantly either white or black depending on

whether trees are absent or present in the headland. For the case when the vehicle is in the

middle of the alleyway, the upper half of the image is both black and white without any color

being predominant. This difference in the upper half of the image for close to headland cases and

middle of alleyway cases is used in the vision algorithm.









The headland determination vision algorithm uses two fixed boxes, large/green and

small/red as shown in the Figure 7-7.


Figure 7-7. Vision algorithm example for headland detection

The two lines, joining the lower corners of the image to the lower corners of the lower box,

indicate the path boundaries as obtained from path segmentation algorithm. For the case in the

middle of the alleyway, the smaller box falls on the pathway whereas the bigger box

encompasses random objects including the headland (A of Figure 7-7). When the vehicle is close

to the headland, the two boxes fall on headland objects (B and C of Figure 7-8). As observed in

the images, the headland often consists of either trees or open space. The use of the boxes is

shown in the segmented images below.

































Figure 7-8. Use of boxes in the segmented images

Close to headland, the pixels in the two boxes are both classified as either trees or pathway

whereas in the middle of the alleyway case, the two boxes are classified differently. Based on

this difference in the boxes, the algorithm determines the approach of the headland.
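
A minimal sketch of this test is given below (Python with NumPy); the box coordinates and the "predominant class" threshold are illustrative assumptions:

    import numpy as np

    def box_class(seg, box, thresh=0.8):
        """Dominant class in a box of a binary segmented image (1 = tree, 0 = path).
        box = (row0, row1, col0, col1); returns 'tree', 'path' or 'mixed'."""
        r0, r1, c0, c1 = box
        frac_tree = seg[r0:r1, c0:c1].mean()
        if frac_tree > thresh:
            return "tree"
        if frac_tree < 1.0 - thresh:
            return "path"
        return "mixed"

    def headland_approaching(seg, upper_box=(0, 120, 40, 280),
                             lower_box=(130, 170, 130, 190)):
        """Flag the headland when both boxes land on the same class of object."""
        upper, lower = box_class(seg, upper_box), box_class(seg, lower_box)
        return upper == lower and upper != "mixed"

    # usage: if headland_approaching(segmented): start dead reckoning to the row end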

By trial and error, it was found that using this vision based method of detecting the

approach to the headland, detection of the headland is always determined when the vehicle is at a

distance of approximately 29m to 31m from the headland. As observed, this distance is not

exact and varies from row to row. It is however a good estimate of the distance to

the headland. Once the approach of the headland has been detected by the vision algorithm, this

distance is used to dead reckon the vehicle through the rest of the alleyway till the headland has been

reached. A flag is sent to the PC104 computer indicating that the headland has been detected. In

the PC104 computer, dead reckoning is performed using the information from the encoder. The

equation for the dead reckoning process is given as

Dc = Dc + de (Eq. 7-9)









where Dc is the current distance and de is the change in the distance measured by the

encoder.
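
A minimal sketch of this loop is shown below (Python; read_encoder_delta_m is a hypothetical callable standing in for the encoder interface on the PC104 computer):

    def dead_reckon_to(target_m, read_encoder_delta_m):
        """Accumulate encoder distance (Eq. 7-9) until target_m has been covered."""
        dc = 0.0                            # Dc: distance traveled so far
        while dc < target_m:
            dc += read_encoder_delta_m()    # de: distance change since last read
        return dc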

Also, when the headland is detected using the machine vision algorithm, the priority of

ladar in the sensor fusion process is increased and the priority for vision is reduced to zero. This

is because, close to the headland, the image from the camera primarily consists of headland

objects. This is also due to the long field of view of the camera. The vision algorithm

development was based on segmenting trees and path. Close to headlands, this segmentation is

not feasible as the alleyway trees are out of the camera's field of view. On the other hand, the ladar

with the short field of view is able to detect the last few trees in the alleyway and perform the

navigation. Therefore, navigation is switched entirely to ladar.

Precisely establishing the end of the row

As mentioned earlier, the vision based headland detection gives an estimate of where the

headland is located and the dead reckoning allows the vehicle to get close to the headland. A

midsize citrus tree with some leafy canopy has a width of a minimum of 2m. The estimated

distance using vision followed by dead reckoning allows the vehicle to get to a position such that

the last tree is to the side of the vehicle. Once the dead reckoning is completed and the vehicle

has reached the estimated distance near the last tree, the vehicle is moved further forward while

the ladar data is monitored. This process is continued until the ladar algorithm does not detect

any trees on either side. When this happens, the vehicle is assumed to have crossed the last tree.

This ladar monitoring ensures that the vehicle crosses the last tree of the alleyway.

Drawback of the vision based headland detection

In the vision based headland approach detection algorithm described earlier, the

assumption was that the headland consists either of uniform headland trees or open space. The









caveat with the vision based headland detection is that the headland is not always composed

entirely of trees or open space. A sample image is shown in the Figure 7-9.


















Figure 7-9. Sample image where the vision based headland detection can fail

In the above image, a single tree is present in the headland along with an artificial object.

In such a case, the vision based algorithm fails to precisely detect the headland. Presence of

random objects makes the vision algorithm less accurate at determining the distance to the

headland. In the previously described method, the vision based headland detection with dead

reckoning helped the vehicle reach the last tree of the alleyway. With the failure modes of the

vision based detection, the estimate of the headland may not be enough to reach the last tree or

may take the vehicle beyond the last tree. To overcome this problem, a ladar based algorithm is

described as follows.

Precisely establishing end of the row using sweeping ladar

The currently used mount angle of the ladar is such that it is able to detect only the closest

tree to the vehicle. It was proposed that if the ladar is rotated through a range of angles, many

more trees could be detected in front of the vehicle. The precise distance measuring capability of

the ladar would help in establishing the end of the row. In this method, a digital servo motor










(HiTec HSR-5995TG) was attached to the ladar with a custom mount. A custom designed power supply

regulator was built to power the motor. A PIC microcontroller based serial communication

interface was also custom built to interface with the motor. The motor-ladar mount is shown in

the Figure 7-10.



















A B
Figure 7-10. Ladar and motor assembly for sweeping. A) view 1. B) view 2.

The motor was attached to the ladar body such that when the motor shaft rotated, the ladar

was oscillated up and down about the horizontal plane. The ladar was swept from an angle of

15deg. above the horizontal to 30deg. below the horizontal at a speed of approx. 15deg/s (Figure

7-11). The range of angles was chosen such that the ladar sweep covered only the tree locations.

Angles beyond 15 degrees above the horizontal resulted in the ladar sweeping over the trees and

the angles beyond 30 degrees below the horizontal resulted in the ladar sweeping the ground.

The angles were also limited by the range of the mechanism linking the motor to the ladar. The

speed of rotation was a limit of the servo motor.



























Figure 7-11. Illustration of the ladar sweep

As mentioned earlier, the ladar refresh rate is approximately 38Hz. From the motor sweep

speed and the ladar refresh rate, the data available per degree is approximately

38 / 15 = 2.53 ≈ 2 full scans per degree

Therefore, a ladar scan is obtained approximately every half a degree at this sweep rate of the motor. In

this sweeping ladar algorithm, the motor is allowed to execute one sweep of the ladar from up to

down. The entire data obtained from the ladar during this period is stored in an array. Using the

calculations mentioned above, the total number of scans of the ladar is then matched with 0.5

degree positions of the sweep angle. One sweep of the ladar by the motor at 15deg/s takes

approximately 5s. Synchronization of the ladar and sweep angle consumes a longer time. That is,

if the motor were stepped through 0.5 degree increments and then allowed to acquire a full

ladar scan at each angle, the time consumed to execute one full sweep would be much higher.

The overall allocation of scans to sweep angles eliminates the need to synchronize the sweep

angle positions with the ladar scan and thereby saves time. Also, it was experimentally observed










that the error introduced due to the 0.5 degree allocation of scans is not significant enough to warrant

synchronization.

As mentioned in Chapter 3, the data obtained from the ladar is in polar coordinates with

the ladar being the origin. For this algorithm, it is intuitive to use data in Cartesian coordinates.

Consider the coordinate system shown in the Figure 7-12.

















Figure 7-12. Coordinate systems

Let P be a data point at a radius R from the origin subtending angles m and n with the z

and y axis respectively. To convert this to the Cartesian coordinate system, the following

equations were used.


Px = -R Cos(n) Cos(m) (Eq. 7-10)

Py = R Sin(n) Cos(m) (Eq. 7-11)

Pz = R Sin(n) Sin(m) (Eq. 7-12)


(Px, Py, Pz) is the data point in the Cartesian coordinate system. The data obtained by the

scan is then filtered to keep the relevant data. It is known that the lateral distance of trees in the










row, from the ladar, can range from 0.5m to a maximum of 5m. The distance from the ladar to the

vehicle's end is 0.5m, so a tree limb is not expected to be closer than 0.5m during normal

operation. Distances longer than 5m correspond to trees located in adjacent rows. Therefore, the

filter was implemented such that any data beyond 5m and data less than 0.5m were removed.

After filtering, a sample plot of the data in the Cartesian coordinates is shown in the Figure 7-13.


Figure 7-13. Swept ladar data in Cartesian coordinates

It is observed from the figure that the data corresponding to different trees are clustered

together. It is also observed that only four consecutive trees in front of the vehicle can be

detected with confidence. Although the range of the ladar is 80m, the trees further away

contribute very few data points to be clearly detected.
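
The conversion and range filtering described above can be sketched as follows (Python with NumPy); the sign conventions follow Eqs. 7-10 to 7-12 as reconstructed here and should be treated as assumptions:

    import numpy as np

    def sweep_to_cartesian(R, m_deg, n_deg):
        """Convert swept-ladar returns (radius R, angles m and n) to (Px, Py, Pz)."""
        m, n = np.radians(m_deg), np.radians(n_deg)
        px = -R * np.cos(n) * np.cos(m)
        py = R * np.sin(n) * np.cos(m)
        pz = R * np.sin(n) * np.sin(m)
        return px, py, pz

    def range_filter(R, m_deg, n_deg, near=0.5, far=5.0):
        """Keep only returns between 0.5m (vehicle body) and 5m (adjacent rows)."""
        keep = (R >= near) & (R <= far)
        return sweep_to_cartesian(R[keep], m_deg[keep], n_deg[keep])

    # usage: x, y, z = range_filter(radii, scan_angles, sweep_angles)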







An algorithm was developed to determine the tree locations from the data. Figure 7-13 shows the data in 3-dimensional space. The search space for the algorithm, when analyzing data in 3 dimensions, is large. This results in the consumption of a large amount of computer processing time, which is significant for real-time, precise headland detection. The aim is to locate the trees with respect to the vehicle in the x-y plane; the location of the data in the vertical plane is not important. Without loss of useful information, the 3-dimensional data is projected onto the horizontal plane. The projected data is shown in the Figure 7-14.










* If the window in a location consists of more than 5 data points, note the location of the
window

* The location indicates the presence of a tree or a part of a tree

If two consecutive tree-containing locations differ by more than 250cm, the difference is

due to the presence of headland. It is then determined that the headland starts just after the last

tree location prior to this gap. Based on the location of the trees in the alleyway, the exact

location of the headland is determined. The algorithm and clustering ensure that the headland is

found precisely and the determination is accurate to less than 50cm beyond the last tree in the

alleyway. The result of clustering and headland determination is shown in the Figure 7-15.
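The windowed clustering and gap test described above can be summarized by the following sketch. The 50cm window length, the forward-axis convention and all names are illustrative assumptions; only the point-count threshold of 5 and the 250cm gap come from the text.

    // Minimal sketch of the windowed tree clustering and headland-gap test.
    #include <vector>

    // xs: forward distances (cm) of the projected ladar points.
    // Returns the forward distance (cm) at which the headland is declared,
    // or a negative value if no gap larger than 250 cm is found.
    double findHeadland(const std::vector<double>& xs,
                        double windowLenCm = 50.0, double maxRangeCm = 500.0)
    {
        std::vector<double> treeLocs;
        for (double w = 0.0; w + windowLenCm <= maxRangeCm; w += windowLenCm) {
            int count = 0;
            for (double x : xs)
                if (x >= w && x < w + windowLenCm) ++count;
            if (count > 5)                    // window holds a tree or part of one
                treeLocs.push_back(w);
        }
        for (size_t i = 1; i < treeLocs.size(); ++i)
            if (treeLocs[i] - treeLocs[i - 1] > 250.0)   // gap => headland
                return treeLocs[i - 1] + windowLenCm;    // just past the last tree
        return -1.0;
    }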














Figure 7-15. Tree clustering and headland determination

Vision and ladar based headland determination

The vision based headland detection and sweeping ladar based establishment of headland

were described above. As mentioned, the sweeping ladar process takes approximately 5s to

execute a sweep. It is not practical to execute the sweep while navigating the entire row.

Therefore, the sweep is started only when the vision based algorithm detects the headland. Then the sweeping ladar algorithm takes over, and dead reckoning is no longer required. In further tests conducted in the grove using the above algorithms, it was also found that a stationary ladar mounted at an angle of 10 degrees to the horizontal is able to detect four trees, similar to the










sweeping ladar. This mounting angle increases the operating range of the ladar. This angle

however, may not be suitable for close range obstacle detection. The stationary mount of the

ladar also obtains less data compared to the sweeping ladar. This, however, does not prevent the

precise determination of the headland and is as accurate as the sweeping ladar. The sweeping

ladar is still a useful development for future operations in this project. Once the headland

location is established, the vehicle speed is reduced to headland navigation speed.

Navigating the headland and entering next row

From the location of the headland determined by the vision algorithm and the exact

location determined by the sweeping ladar, the turning maneuvers are started when the last tree

in the alleyway has been passed. Two turning maneuvers, namely the u-turn and the switch-back turn, were developed. The determination of which maneuver is to be executed is made using the

sweeping ladar algorithm. If an open space of more than 3m is available in the headland, the vehicle proceeds to execute a u-turn; otherwise, a switch-back turn is executed. For the turning maneuvers,

algorithms for steering similar to human operations were developed. For the development of the

turning algorithms, the distance between the left and right tree stems in a row is assumed to be

approximately 6m. The U-turn maneuver is implemented as follows:

1. When the vehicle has crossed the last tree, note the heading of the vehicle using the yaw
angle from the IMU.

2. Turn the steering completely to one end such that the vehicle moves opposite to the direction of the required turn. This step is to make a wide u-turn.

3. Move forward for approximately 0.5m.

4. Turn the steering completely to the other end such that the vehicle turns in the required direction. Allow the vehicle to move forward.

5. Monitor the heading angle. Monitor for obstacles in front of the vehicle using the ladar. If an
obstacle is detected, stop the vehicle.










6. When the vehicle has turned by more than 150 degrees, switch to alleyway navigation mode
using vision and ladar.

7. The vehicle has now just entered the new row. Any error in entering the new row is corrected
using the alleyway navigation algorithm.

The switch-back turning maneuver is implemented as follows:

1. When the vehicle has crossed the last alleyway tree, note the heading of the vehicle using the
yaw angle from the IMU.

2. Move forward a distance of 0.5m. This is done to avoid colliding with any long branches of
the last alleyway tree.

3. Turn the steering wheel to one end such that the vehicle turns in the required direction.

4. Monitor the vehicle's heading and monitor for obstacles in front of the vehicle using the ladar.

5. When the vehicle has turned by an angle of 90 degrees with respect to the initial heading angle before turning, stop the forward motion of the vehicle.

6. Turn the steering to the center position and reverse the vehicle a distance of 0.5m.

7. Turn the steering to the end such that the vehicle turns in the required direction.

8. When the vehicle has turned by more than 150 degrees with respect to the initial heading
angle before turning, switch the algorithm to alleyway navigation mode using vision and
ladar.

9. Any lateral error in entering the new row is corrected using the alleyway navigation algorithm (a condensed sketch of the maneuver logic follows this list).
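The following condensed sketch expresses the u-turn steps as a simple state machine driven by the IMU yaw angle. The steering and speed calls are hypothetical placeholders, and yaw wraparound handling is omitted; only the step logic mirrors the numbered lists above.

    // Condensed sketch of the u-turn maneuver as a yaw-driven state machine.
    #include <cmath>

    enum class UTurnState { COUNTER_STEER, TURN_IN, DONE };

    struct UTurnController {
        double heading0;                       // yaw recorded at the last tree (step 1)
        UTurnState state = UTurnState::COUNTER_STEER;

        // Called every control cycle with the current yaw (deg), the distance
        // travelled since the turn began (m), and the ladar obstacle flag.
        void step(double yaw, double travelled, bool obstacle)
        {
            if (obstacle) { /* stopVehicle(); */ return; }     // step 5
            switch (state) {
            case UTurnState::COUNTER_STEER:                    // steps 2-3
                /* steerFullyAwayFromTurn(); driveForward(); */
                if (travelled > 0.5) state = UTurnState::TURN_IN;
                break;
            case UTurnState::TURN_IN:                          // steps 4-6
                /* steerFullyIntoTurn(); driveForward(); */
                if (std::fabs(yaw - heading0) > 150.0)         // step 7: hand over
                    state = UTurnState::DONE;                  // to alleyway mode
                break;
            case UTurnState::DONE:
            default:
                break;
            }
        }
    };

The switch-back maneuver would add two intermediate states (stop at 90 degrees, reverse 0.5m) before the final turn-in phase.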

It is to be noted that the turning maneuver algorithms are tuned for a grove with a particular alleyway width. The algorithms remain operable in groves whose dimensions differ by approximately 1m, but they are not suitable for groves that differ more. It is difficult to navigate headlands using vision and ladar alone, as headlands can contain random objects and the vision and ladar algorithms cannot be adapted for all possibilities. During tests in a grove in Immokalee, Florida, it was found that the grove includes a large drainage pit in the ground at the end of each row. In such a case, the turning maneuvers failed, as the vehicle would end up in the pit. Therefore, more complex ladar based algorithms may be necessary to adapt the vehicle to such conditions.










Experimental Procedures


Experiments in the Test Track

All previous experiments with the autonomous vehicle guidance system were conducted using the tractor as the vehicle. In this stage of the project, the guidance system was adapted for use with a John Deere eGator as the vehicle. The performance of the guidance system on the eGator had to be tested first. Therefore, it was decided that guidance performance tests would be conducted in a test track before testing in an actual grove. A test track was constructed in an open area near Lake Alice on the University of Florida campus. The track consisted of boundaries

on either side of a path. The path boundaries were constructed using black flexible plastic

cylindrical containers. The test track is shown in the Figure 7-16, with the eGator positioned in

the path. Gaps were incorporated in the path boundaries to mimic gaps between citrus trees.














Figure 7-16. Vehicle in the test track

Although this test track does not ideally mimic the grove conditions, it is a way to ensure

reasonable path navigation ability of the guidance system on the vehicle prior to testing in the

grove. If testing were conducted directly in a citrus grove and the guidance system failed, damage to trees could occur. Testing in test tracks ensures reasonable operation of the guidance system and minimizes the probability of accidents in the grove. The boundaries of the test path are relatively inexpensive compared to trees. Black objects were chosen for the boundary to aid










the vision algorithm. The threshold value used in the vision algorithm had to be modified to

operate in the test track. The ladar algorithm also was modified to detect the test track

boundaries. The pure pursuit steering control algorithm, however, did not require any change as it does not depend on the path. These modifications were minimal and not complex. The

profile of the test track is shown in the Figure 7-17.






Figure 7-17. Profile of the test track for testing the guidance system

The length of the path was approximately 53m with an average path width of

approximately 2.5m. The profile of the path was chosen such that the vehicle would have to

make both a left turn and a right turn to completely navigate the path. This ensures that both left and right turning abilities are operational. For the experiments, the vehicle was started in autonomous

mode at one end of the path, positioned near the center of the path. The vehicle was manually

driven to the starting location, resulting in different initial offset and heading angle for each

repetition. Fusion based guidance was used to navigate the vehicle. Experiments were conducted

at a vehicle speed of 3.1m/s. The vehicle was allowed to navigate the entire length of the path

autonomously and after reaching the end of the path, the vehicle was stopped manually. Three

repetitions of the test runs were conducted. Lines were drawn across the path using lime powder. Every time the vehicle navigated the path, its wheel marks were recorded on these lines. The vehicle's location in the path was determined based on these marks. The path error of the vehicle

was measured using a tape measure based on the location of these marks. The measurements

were taken at regular intervals of about 3m. Measurement at more frequent intervals was

expected to be redundant. After ensuring good performance of the guidance system in navigating

the test track, the headland turning experiments were conducted in the citrus grove. The tests in

the test track ensured the proper adaptation of the guidance system to the eGator.

Experiments in the Citrus Grove

The turning maneuvers were tested in a citrus grove in the Pine Acres Experimental

Station of the University of Florida in Citra, Florida. Experiments were first conducted to

confirm the alleyway navigation performance of the eGator based guidance system. As with the

test track, the vehicle was run in autonomous mode in an alleyway of the citrus grove. Lines

were marked with lime and the tire marks of the eGator on the lines were used to measure the path

navigation error. Three repetitions were performed for the test run. The speed of navigation of

the vehicle was 3.1m/s. After ensuring good performance of the guidance system in navigating

the alleyway, experiments were conducted to test the performance of the headland turning

maneuvers. The path width of the alleyways of the citrus grove was about 3.5m and the alleyway

length was approximately 110m with about 30 trees on either side. The average tree height was

approximately 2m. Figure 7-18 shows one such grove alleyway.












Figure 7-18. Alleyway of the citrus grove where experiments were conducted










In the grove, alleyways were chosen for conducting the headland turning experiments. The

alleyways were chosen such that the last set of trees in the rows had good canopy which could be

detected using the ladar. In a chosen alleyway, the vehicle was started approximately in the

middle of the alleyway in autonomous mode. The type of turn, i.e. u-turn or switch back turn, to

be executed at the end of the alleyway and the direction of turn, i.e. right turn or left, were

specified in the program (Figure 7-19).


Figure 7-19. Headland turning maneuvers in the grove. A) u-turn. B) switch-back turn.

The vehicle was allowed to navigate the rest of the alleyway using sensor fusion based steering,

detect the headland using the ladar, execute the specified turning maneuver in the headland using

the algorithm described above, get into the next row and start navigating the next row. The

vehicle was manually stopped after navigating a distance of approximately 20m in the new row.

The path that the vehicle took in the headland was marked using manually placed markers. The markers were placed at equal angular intervals of 22.5 degrees with the tree stump as the center. This resulted in a total of 9 markers being placed, representing the path taken by the vehicle in the headland. The marker locations are numbered in the

Figure 7-19. Two additional markers were used for the switch-back turn to mark the extreme










forward and backward locations of the vehicle before and after reversing, respectively. The vehicle's average speed was 1.8m/s while executing the turns. This speed was chosen as a reasonable speed to safely navigate the headland and is visually similar to the speed at which a

human would drive the vehicle. Experiments were conducted for both left and right turns for

each maneuver. Therefore, four maneuvers were performed, namely the left u-turn, right u-turn,

left switch back turn and right switch back turn. Three repetitions were performed for each of the

runs. After the execution of each test run, the distance of the markers from the trunk of the tree,

about which the turn was executed, was measured. This measurement was taken manually using

a tape measure. From this measurement and the known vehicle width, the distance of the vehicle's inner wheel in the headland from the same tree stump was calculated.

Results and Discussion

For the path navigation experiments conducted in the test track and the grove alleyway, the

average error, the rms error, standard deviation of the error and the maximum error were

calculated and are presented in the Table 7-2.

Table 7-2. Performance measures of the eGator based guidance system for path navigation
Path            Avg. Error (cm)   Std. Dev (cm)   Max Error (cm)   RMS Error (cm)
Test Track      1.8               1.2             3.5              2
Grove Alleyway  5                 3.5             9                8


Detailed discussion of the fusion based guidance system was presented in Chapter 3. As

the path navigation experiments presented here were a confirmation of the adaptation of the

guidance system to the new vehicle, only the results are presented. It is observed that the

performance of the guidance system using the eGator in navigating the paths is similar to the

performance when using the tractor.










From the experiments conducted for the headland turning maneuver, the turns were

executed around the last tree in the alleyway. The size of the headland is variable and depends on

the boundary of the grove. Therefore, it is not practical to specify error measure for headland

turning maneuvers. Miller et al. (2004) have reported that the space required to turn at the headland and the time spent in turning significantly affect the field efficiency. If more time is spent in turning in the headlands, the overall time for navigating the grove is increased and this

reduces the efficiency of operation. The amount of space required for turning in the headland is

dependent on the amount of headland space available. Requirement of more space for turning

implies wastage of useful space where more trees could have been planted. However, if the

configuration of the grove is such that more space is available, then this space could be used for

a faster turning maneuver. The amount of headland space for turning can be related to the

distance of the vehicle from the last tree in the alleyway while navigating the headland.

Therefore, the average, maximum and minimum distance of the vehicle from the closest

alleyway tree and the average time required to execute the turn are reported. The minimum and

maximum distances are illustrated in the Figure 7-20.





Figure 7-20. Headland turning maneuver performance measures










The distances were measured and averaged over the three repetitions for each maneuver. The results are given in the tables 7-3 and 7-4.

Table 7-3. U-turn performance measures
Turn   Avg. Dist. (m)   Max Dist. (m)   Min Dist. (m)   Avg. Time (s)
Left   3.6              5.2             1.9
Right  3.8              5.3             1.8

Table 7-4. Switch back turn performance measures
Turn   Avg. Dist. (m)   Max Dist. (m)   Min Dist. (m)   Avg. Time (s)
Left   2.7              3.2             2
Right  2.7              3.6             1.9



From the results it can be observed that the average distance of the vehicle's inner wheel from the tree is around 3.7m for the u-turn and 2.7m for the switch-back turn. It can be inferred that less radial distance is required from the tree stump to execute a switch-back turn than a u-turn. This result is as expected intuitively. The switch-back turn maneuver helps to execute the

headland navigation within a tighter area than the u-turn. The performance of the vehicle in

executing the turn is dependent on several factors such as the minimum radius of curvature of the

vehicle steering, the position of the vehicle when exiting the previous row and when entering

next row, the operating speed of the vehicle, the canopy size of the trees closest to the headland

and the surface gradient of the headland. This difference in radial distance can be attributed to

the minimum radius of curvature of the vehicle. As a result of the larger radius of curvature

compared to the radius of the turn, the vehicle has to take a wider u-turn to successfully complete

the maneuver. This cause can also be attributed for the maximum radial distance of the vehicle,

which is higher for the u-turn, which is around 5.3m for the u-turn compared to 3.6m for the

switch-back turn. It is also observed that the minimum distance for both turning maneuvers is










similar, at around 2m. This is because the minimum distance occurred at the location where the vehicle just exits the alleyway and begins to execute the headland maneuver. For both maneuvers, the vehicle exited the alleyway at approximately 2m from the last tree stump. The average time for executing the u-turn is 14s and for the switch-back turn is 17s at an average speed of 1.8m/s. For practical purposes, the times are similar for both maneuvers. The time taken to cover a longer distance by the wide u-turn is similar to the time taken in reversing by the switch-back turn. The slightly longer time for the switch-back turn may be attributed to the time taken by the vehicle in stopping and then reversing from rest. Visually, the

headland turning maneuvers were similar to human driving.

Conclusions

Headland turning maneuver algorithms were developed for an autonomous vehicle navigating a citrus grove. Algorithms using machine vision and

sweeping ladar were developed to precisely detect the headland while the vehicle navigates the

alleyways. Once the headlands were detected and reached by the autonomous vehicle, the u-turn and switch-back turning maneuvers developed in this work were used to navigate the grove headlands. The

performance of the headland detection algorithm and the execution of the turning maneuvers

were tested in a citrus grove. Experiments were conducted to test the performance of the turning

maneuvers. The average distance from the last alleyway tree, while executing the u-turn was

3.8m and while executing the switch-back turn was 2.7m. The switch-back turning maneuver was able to execute the turn within a smaller distance compared to the u-turn. However, the average time required to execute the u-turn was 14s, compared with 17s for the switch-back turn. With

the addition of the headland detection algorithms and the turning maneuvers to an autonomous

vehicle already capable of navigating the alleyways, the autonomous vehicle now possesses the

ability to navigate an entire grove.









CHAPTER 8
OPEN FIELD NAVIGATION AND OBSTACLE DETECTION

Introduction

In the previous chapters, methods were described that allowed the autonomous vehicle to

accurately navigate the alleyways of a citrus grove, detect headlands and turn at the headlands

using u-turn and switch-back turn maneuvers and navigate subsequent alleyways. With these

capabilities, the autonomous vehicle possesses the ability to navigate an entire grove

autonomously. In large citrus producing farms, the groves can be arranged in the form of blocks

separated by open fields or roadways. With the previously described capabilities, after the

vehicle completes navigating a grove block, the vehicle has to be manually driven to another

grove block. The vehicle would then autonomously navigate the new grove block. If the vehicle

could autonomously navigate the area between grove blocks, the entire farm could be covered

without human intervention.

When the vehicle navigates the citrus grove alleyways or open fields, obstacles such as

animals, humans or broken branches can be expected. When obstacles are present, the vehicle

must be capable of detecting them and stopping the vehicle. If possible, the vehicle should also

navigate around the obstacle. Within the grove alleyway, there is not much space to navigate

around obstacles, whereas it might be possible in the open fields. The development of obstacle

avoidance capability is beyond the scope of this project. This chapter discusses the development

of open field navigation capability to navigate the open field between grove blocks and the

development of obstacle detection capability.












Open Field Navigation

Open field refers to the open area available between grove blocks. The open field is

expected to contain uneven terrain, random placement of trees and fences, and the presence of objects such as cows. The vision based navigation and ladar based navigation methods described earlier were developed such that they could segment trees from the ground and navigate between trees. In open fields, there is no fixed placement of objects. The random nature of objects in open fields, or the lack of objects, makes it difficult to develop vision or ladar based algorithms for navigating open fields and reaching a different grove block. A global solution is required to solve

this problem. GPS is an excellent global method for navigating open fields. Therefore, GPS

based navigation for traversing open fields is proposed.

A differential GPS (DGPS) receiver was available in the laboratory from previous research on determining the tractor dynamics and is described in Chapter 3. GPS based navigation has been widely used by several researchers for guiding autonomous vehicles through crop rows. The available DGPS receiver has a manufacturer specified accuracy of approximately 12 inches. The receiver is also capable of operating at 5Hz with a DGPS fix, i.e., when enough satellite signals and the local station correction signal are available for high accuracy operation. For

navigation based on GPS, the path that the vehicle must take should be specified a priori as way points. Using these points, the vehicle, with the aid of the GPS, can follow the points to reach

the next grove block. The way points for this purpose would be the latitude and longitude

coordinates of way points obtained using the DGPS receiver. When operating autonomously, the

guidance system must be capable of using the latitude and longitude information to steer the

vehicle to the required way point.


Materials and Methods









The important component of the algorithm is the determination of the steering angle required to take the vehicle from its current location to a target location. The capability to

steer to consecutive points can be implemented in the software. Consider the vehicle located at a

point A with latitude 'latA' and longitude 'lonA'. Similarly, consider a way point B with

coordinates 'latB' and 'lonB'. If the vehicle is required to travel from A to B, it is necessary to determine the distance between the locations and the direction of location B with

respect to location A. The direction can be specified with the geographic north-south direction as

reference. To determine these, a navigation formula called the Haversine formula was used. The Haversine formula is an equation to determine the great-circle distance between two points on a

sphere. It is an adaptation of the more general spherical trigonometric formulas. When using the Haversine formula, which treats the earth as a sphere, the underlying assumptions are important for obtaining results

with good accuracy. For example, when the points are situated at a large distance compared to

the radius of curvature of the earth, the formula considering the earth as a sphere gives good

accuracy; when the distance between the points is significantly small compared to the radius of

the earth, the assumption of a flat earth can give accurate results and reduce computation; when

the points are located on either side of hills or mountains, modifications have to be made to

obtain accurate results. For this project, it is assumed that the way points are located at a much smaller distance compared to the radius of the earth and that no significant elevation difference is present between the points. Therefore, the Haversine formula, modified for short distances, was

used. Figure 8-1 shows the vehicle at A and a way point B that the vehicle must reach.











Figure 8-1. Illustration of vehicle navigating way points

The vehicle is heading in a direction towards a random point C with a heading angle m

with respect to the geographic north. The straight line distance required to reach B is 'd' and a

turn through an angle 'theta+m' is required to reach B. The Haversine formula gives d and theta



d = R p (Eq. 8-1)

where R = radius of curvature of earth = 6378140m

p = 2 Tan^-1 ( sqrt(a) / sqrt(1 - a) ) (Eq. 8-2)

a = Sin^2(dlat/2) + Cos(latA) Cos(latB) Sin^2(dlon/2) (Eq. 8-3)

dlat = latB - latA (Eq. 8-4)

dlon = lonB - lonA (Eq. 8-5)

theta = mod [ atan2 { Sin(dlon) Cos(latB), Cos(latA) Sin(latB) - Sin(latA) Cos(latB) Cos(dlon) }, 2 pi ] (Eq. 8-6)

theta is positive for directions east of the geographic north and increases anti-clockwise from 0 to 2 pi in radians. The direction 'theta' is the heading from location A to location B. To

steer the vehicle, it is also necessary to determine the current heading of the vehicle with respect










to the geographic north. This information was obtained from the IMU mounted on the vehicle for

other algorithms. The IMU gives the heading of the vehicle with respect to the geographic north

as 'm'.
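A minimal sketch of Eqs. 8-1 to 8-6 is shown below: it returns the great-circle distance d (in meters) and the heading theta (in radians from geographic north) from the vehicle at A to way point B. The function name and output-parameter style are illustrative, not from the original software.

    // Minimal sketch of the Haversine distance and heading (Eqs. 8-1 to 8-6).
    #include <cmath>

    const double R_EARTH = 6378140.0;            // m, as in Eq. 8-1
    const double PI = 3.14159265358979;

    void haversine(double latA, double lonA, double latB, double lonB,
                   double& d, double& theta)      // inputs in degrees
    {
        const double deg = PI / 180.0;
        double la = latA * deg, lb = latB * deg;
        double dlat = (latB - latA) * deg;                          // Eq. 8-4
        double dlon = (lonB - lonA) * deg;                          // Eq. 8-5
        double a = std::sin(dlat / 2) * std::sin(dlat / 2)
                 + std::cos(la) * std::cos(lb)
                 * std::sin(dlon / 2) * std::sin(dlon / 2);         // Eq. 8-3
        double p = 2.0 * std::atan2(std::sqrt(a), std::sqrt(1.0 - a)); // Eq. 8-2
        d = R_EARTH * p;                                            // Eq. 8-1
        theta = std::atan2(std::sin(dlon) * std::cos(lb),
                           std::cos(la) * std::sin(lb)
                         - std::sin(la) * std::cos(lb) * std::cos(dlon));
        if (theta < 0.0) theta += 2.0 * PI;                         // Eq. 8-6 (mod 2 pi)
    }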

The vehicle should be steered such that the angle 'm' is the same as the angle 'theta'. It is

to be noted that the DGPS receiver is operating at a rate of 5Hz and the IMU is operating at a

rate of approximately 40Hz. An assumption in developing the algorithm is that the vehicle is

operating at a speed of less than 4m/s. This speed limitation is based on the 5Hz refresh rate of

the DGPS receiver and was determined experimentally by trial and error. As the vehicle is

steered in the proper direction, the distance between A and B also reduces until the vehicle

reaches the location B.

The steering angle obtained by the above formulas is in radians. In the development of the

pure pursuit steering described in the Chapter 3, a protocol was used such that the PC sent the

error of the vehicle in pixel coordinates to the PC104 computer through serial communication

and the error was converted to steering angle by the pure pursuit algorithm in the PC104

computer. Therefore, to use the same protocol, the steering angle obtained above has to be

converted to pixel coordinates before sending to the Pure Pursuit steering controller. The

physical range of the steering angles of the vehicle is approximately -30 degrees to +30 degrees from left to right, with 0 degrees as the center.

From the above equations, the steering angle 'theta+m' has a range of 0 to pi. It should be noted that the mathematical calculations can provide a range of 0 to 2 pi. Considering that a turn of 2 pi is the same as a turn of 0, and that a turn of 1.5 pi is the same as a turn of 0.5 pi in the opposite direction, the effective range can be reduced to 0 to pi for steering. In










degrees, this is the same as 0 to 180 degrees. This can also be expressed as -90 degrees to +90 degrees.

Comparing the vehicle's physical steering angle range and the required steering angle

range, the conversion is from 0 to 180 degrees with 90 as center to 0 to 60 degrees with 30 as

center. This is illustrated in the Figure 8-2.
















Figure 8-2. Angle ranges of steering and vehicle turn

A linear relationship is formed between the ranges such that -30 to +30 degrees of the required vehicle turn is mapped to -30 degrees to +30 degrees of the physical steering angle of the vehicle. With this one to one relationship, the ranges from -31 degrees to -90 degrees and from +31 degrees to +90 degrees of the vehicle's turn must also be mapped. The range of -90 degrees to -31 degrees of the vehicle turn is mapped to the single value of -30 degrees of steering angle, and the range of +31 degrees to +90 degrees of the vehicle turn is mapped to the single value of +30 degrees of steering angle. This can be observed in the Figure 8-2.










From the previous sections, it is known that the image width used for vision is 400 pixels.

To maintain the communication protocol with the PC104 computer algorithm, the range of -30

degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates.

Pixel value = 400 (steering angle + 30) / 60 (Eq. 8-7)

This pixel value of the required steering is sent to the Pure Pursuit steering control.
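A minimal sketch of this saturation and mapping is shown below; the function name is illustrative. Required turns beyond +/-30 degrees saturate at the physical steering limit, and the remaining [-30, +30] degree range is mapped linearly onto [0, 400] pixels, i.e., Eq. 8-7 with the +30 degree center offset written explicitly.

    // Minimal sketch: saturate the required turn and map it to image pixels.
    int steeringToPixel(double requiredTurnDeg)
    {
        if (requiredTurnDeg < -30.0) requiredTurnDeg = -30.0;   // -90..-31 -> -30
        if (requiredTurnDeg >  30.0) requiredTurnDeg =  30.0;   // +31..+90 -> +30
        // linear map: -30 deg -> 0 px, 0 deg -> 200 px, +30 deg -> 400 px
        return static_cast<int>(400.0 * (requiredTurnDeg + 30.0) / 60.0);
    }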

As described in the algorithm above, when suitable steering is executed, the distance

between the vehicle and the way point reduces and the vehicle reaches the way point. It is known

that the accuracy of the DGPS receiver is 12 inches. During initial developments of the

algorithm, the vehicle was determined to have reached the way point, when the distance between

the vehicle and the way point was zero. It was observed that the uncertainties in the accuracy of

the receiver, the terrain variation and small inaccuracies in steering often result in the vehicle

reaching a point very close to the way point, on the order of centimeters away from it.

In such cases, if the algorithm requires that the distance be exactly zero, the vehicle has to execute circles or spirals to translate itself the remaining few centimeters to the way point.

This is illustrated in the Figure 8-3.










Figure 8-3. Path taken for translation of the vehicle to the way point









An accuracy of exactly reaching the way point is not practical in the physical system. Therefore, a threshold was chosen by trial and error: if the vehicle reaches a point within a distance of 50cm from the way point, the way point is determined to have been reached by the vehicle.
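Assuming the haversine() sketch given earlier, the way point test then reduces to a distance comparison against this 50cm threshold; the function name is illustrative.

    // Hypothetical usage of the haversine() sketch above: a way point is
    // considered reached once the remaining distance falls below 50 cm.
    bool wayPointReached(double latA, double lonA, double latB, double lonB)
    {
        double d, theta;
        haversine(latA, lonA, latB, lonB, d, theta);
        return d < 0.5;   // metres
    }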

Obstacle Detection

The obstacle detection capability should allow the vehicle to detect obstacles such as humans, animals and objects of variable location such as vehicles, while navigating open fields. For the scope of this dissertation, a simple obstacle detection method using ladar was developed. It is assumed that the way points for GPS based navigation were chosen such that the vehicle would not be expected to encounter tree limbs or fixed objects such as thin poles in its path.

For the development of the algorithm for detecting obstacles, the first step was to analyze

the data from the ladar while navigating the open field. As mentioned earlier, the ladar scans a

plane from 0 degrees to 180 degrees and obtains radial distances to objects in front of the ladar.

The radial data is converted to Cartesian coordinates using the geometrical formula

Frontal distance = (radial distance)(Sin (scan angle)) (Eq. 8-8)

Lateral distance = (radial distance)(Cos (scan angle)) (Eq. 8-9)
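A direct transcription of Eqs. 8-8 and 8-9 is shown below; the function name and output-parameter style are illustrative.

    // Direct transcription of Eqs. 8-8 and 8-9 (scan angle in degrees).
    #include <cmath>

    void toFrontalLateral(double radialCm, double scanAngleDeg,
                          double& frontalCm, double& lateralCm)
    {
        const double rad = scanAngleDeg * 3.14159265358979 / 180.0;
        frontalCm = radialCm * std::sin(rad);   // Eq. 8-8
        lateralCm = radialCm * std::cos(rad);   // Eq. 8-9
    }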

The same formula was also used in the algorithm for ladar based grove alleyway

navigation. The Figure 8-4 shows the plot of the frontal distance obtained from the ladar against

the scan angle when there is no obstacle in front of the vehicle.

From the chart, it is seen that the closest distance in front of the vehicle at 90 degree angle

corresponds to the ground directly in front of the vehicle. The ground corresponds to a distance

of approximately 6m frontal distance from the ladar. For scan angles away from the central angle

of 90 degrees, the distance corresponds to ground locations away from the center of the vehicle.








































































































Figure 8-4. Ladar data when obstacle is not present in front of the vehicle


Figure 8-5 shows the scan data from the ladar when an obstacle is present directly in front of the vehicle. From the data, it is observed that for most of the angles the ladar data is identical to the previous scan and corresponds to the open field. However, at scan angles of 85 degrees to 95 degrees, the frontal distance drops to approximately 450cm, compared with the ground reading of approximately 6m in the previous scan shown above. This large change in the data corresponds to the presence of an obstacle in front of the vehicle. It can also be interpreted as the obstacle cutting the laser beam before it reaches the ground, thereby showing a smaller distance (Figure 8-6).




Figure 8-5. Ladar scan data with obstacle present in front of the vehicle












Figure 8-6. Illustration of the obstacle detection using ladar

The obstacle detection algorithm is based on the analysis of the ladar data described above.

The algorithm is as follows:

* Prior to operation in autonomous mode, when the vehicle is stationary, obtain the ladar scan radial distance corresponding to 90 degrees. Take this distance as the ground distance from the ladar.

* Operate the vehicle in autonomous open field navigation mode.

* Monitor the ladar data continuously and note sudden drops of more than 50cm from the
ground distance.

* If a drop is encountered at an angle, move a data window of length 5 through the 5 subsequent data points.

* If all of these data points are also more than 50cm below the ground distance, determine the presence of an obstacle.

* If an obstacle is determined, stop the vehicle.

* Continue monitoring the scan data for the presence of obstacle.

* If no obstacle is found, continue navigating using GPS based navigation.

The stop signal is sent to the PC104 computer as a special stop command to reduce the

speed to zero. No change is made to the steering control. The algorithm uses a data window of 5

data points to detect obstacles. This corresponds to an angle of 3 degrees. It was determined

experimentally by trial and error that an angle of 3 degrees translates to sufficient width to

determine the presence of an obstacle such as a human. Wider obstacles such as cows are










expected to be determined. The threshold distance of 50cm was chosen so that small terrain

elevation changes are not detected as obstacles. The angular width of 3 degrees may not be sufficient to detect objects such as poles. This, however, is useful in preventing the detection of tall grass as an obstacle. The flowchart of the algorithm is shown in the Figure 8-7.


Figure 8-7. Flowchart of the obstacle detection algorithm
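A minimal sketch of the obstacle test in the flowchart is shown below. The frontal distances are assumed to come from Eq. 8-8, and the names are illustrative; only the 50cm threshold and the 5-point window come from the text.

    // Minimal sketch: a frontal-distance drop of more than 50 cm below the
    // stationary ground reading, sustained over 5 consecutive scan points
    // (about 3 degrees), is declared an obstacle.
    #include <vector>

    bool obstaclePresent(const std::vector<double>& frontalCm, double groundDistCm)
    {
        int run = 0;                          // consecutive points below threshold
        for (double f : frontalCm) {
            run = (f < groundDistCm - 50.0) ? run + 1 : 0;
            if (run >= 5) return true;        // sustained drop => stop the vehicle
        }
        return false;
    }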










Experimental Methods


GPS Based Navigation

Experiments were conducted to test the performance of the GPS based guidance in navigating open fields. For conducting the experiments, an open area was chosen near Fifield Hall on the University of Florida campus. The open field was uneven terrain covered with grass. In navigating the open areas between groves, it is expected that the GPS coordinates would be provided a priori as way points for the vehicle to follow. The guidance system is developed to navigate citrus grove alleyways only; the vision or ladar algorithms are not capable of navigating unknown regions in open fields. Therefore, the distances between way points should be such that only straight line travel would be necessary between way points. Accordingly, the way points were chosen approximately 13m to 16m apart, representative of distances requiring only

straight line travel between way points. The exact distances between the way points were not controlled. Marker flags were manually placed on the ground representing the approximate

location of the way points. The way point locations were chosen such that the vehicle would be

required to travel in different directions to complete the navigation. This way, direction based

errors from the IMU and the GPS could be included in the tests. The location of the way point

flags for the experiment is shown in the Figure 8-8. The flags are indicated by the small circles in

the figure.







Figure 8-8. Waypoint location and vehicle's path for the GPS based navigation









The way point flags are numbered based on the sequence in which the vehicle is required

to navigate the way points. The illustration of the path taken by the vehicle is shown using dashed arrow lines in the figure. It is to be noted that the way points are located beside the way point flags and not exactly at the location of the flags. This was done to prevent the vehicle from driving over the flags. The direction of geographic east is also shown in the figure. For the test runs, the start point was located some random distance west of the last, or 8th, way point. The experimental procedure followed was as follows:

1. Prior to the start of the experiments, the GPS receiver was turned on for approximately
15min until it acquired enough satellites to obtain DGPS fix and provide accurate data.

2. Collect way points in the form of GPS latitude and longitude coordinates: The DGPS
receiver in the vehicle was turned on to receive the GPS coordinates using the software. The
vehicle was manually driven to a random location beside the first way point flag and stopped.
The software was used to save the latitude and longitude coordinates of a point in a text file.
The vehicle was driven to random locations near subsequent way point flags, stopped and the
latitude and longitude were saved on the same text file. The sample coordinates saved in the
text file is shown below.

29.6444675,82.3460763 → Way point 1

29.6444933,82.3464116 → Way point 2

29.6444341,82.3467433 → Way point 3

29.6446468,82.3467340 → Way point 4

29.6446616,82.3464736 → Way point 5

29.6446500,82.3461150 → Way point 6

3. The coordinates were comma delimited. The saved coordinates were in degrees as converted in the software. At each way point saving location, the location of the center of the vehicle's width, directly below the DGPS receiver, was marked on the ground using a colored marker. This represented the exact physical location of the way point.

4. Provide the way points in a text file for the guidance system to read: The text file with the latitude and longitude coordinates of the way points was given as input to the navigation software (a file-reading sketch follows this list).

5. Pick a start point and start the vehicle in autonomous GPS based navigation mode: The
vehicle was manually driven to a random location a few meters west of the last way point.










The vehicle was stopped. The autonomous GPS based navigation algorithm was run in the
software. The vehicle was switched to the autonomous mode and allowed to navigate.

6. Allow the vehicle to follow the way points until it reaches the last way point: The vehicle
was allowed to navigate autonomously and navigate the way points in the numbered
sequence mentioned earlier. As the vehicle reached each way point, the location of the center
line of the vehicle's width, when it is closest to the marked way point, was marked on the
ground using colored marker.

7. The vehicle was programmed to stop motion after reaching the last way point.

8. The distance between the marked way point and the newly marked navigated point was
measured manually using a tape measure. This distance represents the error of the vehicle in
reaching the way point. This was performed for all the way points.

9. After taking measurements, the markers were removed. Three test runs were conducted for
repeatability. The same way points were used for the three repetitions. The average speed of
the vehicle during the test runs was 1.8m/s.
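The following is a minimal sketch of reading the comma-delimited way point file described in step 2 (one latitude,longitude pair in degrees per line); the file handling and names are illustrative assumptions, not the original software.

    // Minimal sketch: parse the comma-delimited way point text file.
    #include <fstream>
    #include <sstream>
    #include <string>
    #include <vector>

    struct WayPoint { double lat, lon; };

    std::vector<WayPoint> loadWayPoints(const std::string& path)
    {
        std::vector<WayPoint> pts;
        std::ifstream in(path.c_str());
        std::string line;
        while (std::getline(in, line)) {
            std::istringstream ss(line);
            WayPoint wp;
            char comma;
            if (ss >> wp.lat >> comma >> wp.lon)   // e.g. "29.6444675,82.3460763"
                pts.push_back(wp);
        }
        return pts;
    }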

Obstacle Detection

Experiments were conducted to verify the obstacle detection algorithm's ability to detect obstacles. Two different types of obstacles were chosen: a human of height

approximately 1.8m and a cubical box of height approximately 0.9m. In this experiment, the

vehicle was started in autonomous mode from a stationary position and allowed to travel using

GPS based navigation from one point to another. In one experiment, on the path of the vehicle,

the box was placed stationary and the vehicle was allowed to navigate the path, detect the

obstacle and stop until the box was removed. In a second experiment, the vehicle was allowed to

navigate as in the first experiment. A human walked slowly in the path of the vehicle. The

vehicle was allowed to navigate, detect the human obstacle and stop until the walking human moved out of the range of the vehicle's path. The experiment was to confirm the obstacle detection capability. The distance at which the vehicle stopped from the obstacle was observed.

Three repetitions were performed by placing the obstacles at random locations in the path of the

vehicle.












Results and Discussion

GPS Based Navigation

From the experiments, the distance of the center of the vehicle's width from the way point was reported as the error in reaching the way point. The distances were averaged over the three

repetitions and over all the way points and reported as the average error for the GPS based

navigation for the experiment. The results are provided in the Table 8-1.

Table 8-1. Error of the vehicle for GPS based navigation experiments (errors in cm)

                  Way point
Trial    1    2    3    4    5    6    7    8
1       31   55   39   21   38   18   46   12
2       45   43   15    6   17   23   54   26
3       27   34   24   43   40   31   35   29


From the results, it can be seen that the error is distributed among all the way points. The

average error over all way points was 31.75cm. The errors cannot entirely be attributed to

navigation of any particular way point or terrain change near one way point. As mentioned

earlier, the DGPS receiver is accurate to 12 inches, or approximately 30cm, as specified by the

manufacturer. Also, the GPS based navigation algorithm considers a way point as being reached

if the vehicle is less than 50cm away from the way point. With the accuracy of the receiver,

accuracy better than 30cm cannot be expected for all test runs. Considering the accuracy of the

DGPS receiver and the allowable error in the algorithm, the average error of 31.75cm seems to

be of good accuracy. However, it should be noted that the accuracy in navigation depends on

several factors such as the number of satellite signals being received by the DGPS receiver, the

errors in DGPS measurements due to motion of the vehicle, errors due to uneven terrain and












presence of trees in the vicinity blocking the signal. Some of the low errors of 6cm or some of

the high errors of 54cm may be attributed to the vehicle navigating a subsequent way point in a

direction of motion resulting in error or to DGPS accuracy at a particular instant of time.

Therefore, the individual errors and average error are only an indication of the validity of using

the DGPS receiver to navigate the open fields and may not be taken as a measure of accuracy of

the guidance system. The errors also indicate the amount of permissible error that is expected to

be available for the vehicle to navigate the open fields. While navigating fields between citrus

groves, the way points should be chosen to allow sufficient error or if higher accuracy is

required, ladar or vision based navigation could be used to augment the guidance performance.

Obstacle Detection

From the experiments conducted for the obstacle detection, it was observed that the

guidance system was able to detect both the obstacles during every test run. Both the obstacles

were detected at a distance of approximately 5m in front of the vehicle. The mounting angle of

the ladar was such that the ladar scan could detect the ground at approximately 6m. Therefore,

when the obstacles crossed the ladar scan prior to the ground, they could be detected. It was also

observed that the vehicle remained stopped until the obstacles were removed. These observations

confirm the ability of the obstacle detection algorithm to detect obstacles of the size of humans

or boxes. It is expected that animals such as cows or goats would also be detected due to

their size. However, the obstacle detection system is not expected to detect tall grasses or thin

poles, due to the algorithm specifications. Extensive tests using various types of obstacles are

required before the ability of the system to detect different type of obstacles can be confirmed.

As the obstacle detection algorithm is based only on ladar, it is uncertain what kind of objects the

laser beam can reflect off of. A more robust obstacle detection system could include vision in

combination with the ladar. However, addition of vision could cost significant processing time.









Conclusions

A GPS based navigation method was developed to navigate the open fields between grove blocks. The method used way points to navigate the vehicle over short distances. A DGPS receiver

was used to collect way point coordinates. The way points were provided as latitude and

longitude coordinates. The coordinates were used to steer the vehicle to each of the way points.

Experiments were conducted to test the performance of the open field navigation system. The

vehicle was able to reach all of the way points with an average error of 31.75cm from the way

point location. This accuracy was comparable to the accuracy of the DGPS receiver. An obstacle

detection algorithm was developed to detect obstacles while navigating open fields. The

algorithm used the ladar for detecting obstacles. Tests were conducted to verify the obstacle

detection ability. The algorithm was successfully able to detect human and box obstacles. The

algorithm is expected to detect obstacles of the size of cows and animals of similar size.









CHAPTER 9
RESEARCH SYNTHESIS AND OVERVIEW

Rising costs and limited labor availability make robotics and automation promising methods

of operating citrus groves. Autonomous vehicles form an integral part of such an initiative. The

development of guidance system for an autonomous vehicle operating in citrus groves was the

subject of this dissertation. Previous efforts at developing autonomous vehicles were discussed in

Chapter 2. Most of the efforts have been for developing autonomous vehicles for use in crop row

navigation. Current commercial systems also target crop rows. This research furthers the

development of autonomous vehicles for use in citrus groves.

This research was started with a tractor as the base vehicle for the development of the

guidance system. The tractor was readily available and suited for the application. The tractor was

retro-fitted with an electro-hydraulic valve to enable it to be controlled using electric signals. The

retrofit ensured retaining the manual operation of the vehicle. The electro-hydraulic valve was

calibrated to determine the voltage and time required to change the steering angle. During the

calibration, viscosity of the hydraulic fluid was found to play an important role in the steering

time. A rotary encoder was mounted on the wheel shaft and calibrated to determine the steering

angle. This provided closed loop control of the steering. The dynamics of the vehicle steering

system was determined experimentally. A DGPS receiver was primarily used for the experiments

to determine the position of the vehicle. The dynamics of the vehicle provided a mathematical

expression for simulating the vehicle for developing the steering control system. A PID based steering control system was developed through simulation using the vehicle dynamics model. The

steering control was implemented in a microcontroller using C++ for software development.

For path determination, machine vision and ladar were chosen as the primary sensors.

Previous efforts have used GPS receiver as the primary sensor for navigation. Presence of tall










trees in or around the grove does not encourage the use of GPS based systems in the grove. A

video camera was used as sensor for machine vision development. The two primary sensors were

mounted on top of the tractor cabin looking ahead of the vehicle. To develop the machine vision

algorithms, first a database of images was obtained from the grove. The images were obtained

throughout the day. The images accounted for varying cloud conditions and shadow conditions

in a grove alleyway over a day. The image database was used to develop the machine vision

algorithm. The algorithm used color based thresholding followed by morphological operations to

segment trees and the path. An important constraint during the development of the algorithm was

the processing time available in the PC. The PC was expected to run the ladar, the IMU, the

GPS and all of the high level algorithms. All these processes running in a single processor

limited the processor time available for each algorithm. A ladar based path segmentation

algorithm was also developed. The algorithm was based on distance of objects in reference to the

vehicle. The two methods were able to segment the path and the tree.

The developed path determination algorithms together with the steering control system

were implemented in software. Experiments were conducted in custom designed test tracks using

hay bales as path boundaries. The main aim of the experiments in the test track was to ensure

proper operation of the various components of the guidance system including sensor interface,

algorithms and the software interface between the components. Experiments were then

conducted in citrus grove alleyways to verify navigation of the alleyways by the guidance

system. The experiments also helped determine the accuracy of the guidance system in

navigating the vehicle at the center of the alleyway. Vision based navigation was found to

navigate the path well. The ladar based algorithms developed in the test track were not directly










adaptable to the grove conditions. Filtering of the ladar data was required due to uneven tree

canopy.

The experiments using independent machine vision and ladar for navigating the grove

alleyways provided insights into the capability of these sensors. Machine vision had a long field

of view and navigation was not as accurate as the ladar. At the end of the alleyways, machine

vision capability was lost due to the longer field of view beyond the alleyway. Ladar had shorter

and more accurate field of view but was inaccurate for short durations in the absence of a tree on

the alleyway. To utilize the advantages of the sensors, fusion algorithms were developed to fuse

the information from these two sensors. Kalman filter was chosen as the primary fusion method.

Fuzzy logic was used to select the appropriate sensor at the end of the alleyway and when missing trees are encountered. An inertial measurement unit and a speed sensor were added to aid

the fusion process. Experiments were conducted in test tracks to ensure proper operation of the

guidance system. Experiments were then conducted in citrus grove alleyways. Fusion based

guidance was found to perform better than individual sensor based guidance. The navigation of

the center of the path was also more accurate.

From the experiments using the tractor, it was found that transporting the tractor to the

grove for testing was expensive. To further the abilities of the vehicle, speed control, reversing

and start-stop capability were important. A golf cart type vehicle, the eGator, was donated to the research by John Deere. The eGator was an electric vehicle. The vehicle was suited for easy control of the steering and speed and for adding start-stop capability. It was also convenient to transport

the vehicle. However, documentation on the vehicle was minimal. The vehicle was reverse

engineered to be used for autonomous operation. The entire guidance system was adapted for the









eGator. Pure Pursuit steering control algorithm was adopted for steering the eGator. The method

does not require the knowledge of the vehicle dynamics. It was also intuitive to develop.

After navigating the alleyway of a citrus grove, the vehicle should be able to enter

subsequent alleyways by turning at the headlands. This would provide the vehicle with the

ability to completely navigate the grove. Algorithms were developed to turn the vehicle at the

headlands of the grove. The algorithm used machine vision for detecting the approach of the

headland and sweeping ladar to accurately determine the location of the headland. U-turn and

switch-back turning algorithms were developed to turn at the headlands. Experiments were conducted to verify the ability to turn at the headlands. The switch-back turn is capable of

turning within a smaller area but took slightly more time to execute. The headland turning ability

allowed the vehicle to completely navigate a grove block.

Often large citrus groves are separated into smaller grove blocks. An autonomous vehicle

capable of moving from one grove block to another would be an added capability. The area

between the grove blocks is expected to be an open field or roadway. A GPS based open field navigation algorithm was developed. The algorithm used latitude and longitude coordinates of way points to navigate the vehicle. In open fields, obstacles are expected. Therefore, an obstacle detection algorithm using ladar was developed. The vehicle possesses the ability to stop in the presence of obstacles. Experiments were conducted to verify the ability of the vehicle to navigate open fields using way points and to test the obstacle detection capability. Open field navigation

together with obstacle detection enables the vehicle to traverse the area between one grove block

to another. The vehicle now possesses the ability to completely navigate a citrus grove.









CHAPTER 10
CONCLUSION AND FUTURE WORK

Conclusions

The development of the autonomous vehicle guidance for navigating citrus groves

contributed to the global body of knowledge on autonomous vehicles both for agricultural and

non-agricultural applications. The research specifically contributes to the research efforts in

utilizing robotics for citrus grove operations. The development of the autonomous vehicle

guidance shows machine vision and ladar based systems as promising methods for grove

operations as an important alternative to GPS based guidance systems for crops. The presence of

shadows is a problem in many outdoor applications using machine vision. The adaptive machine

vision thresholding technique was found to be a method for dealing with the presence of

shadows. High accuracy in navigating the center of the grove alleyways was achieved using

sensor fusion based guidance. Utilizing the reliability of different sensors for varying

environmental conditions, while fusing the sensor information, was an important milestone in

improving the reliability of the overall guidance system. The headland turning capability allows the vehicle to completely navigate an entire grove block. Sweeping ladar based detection of

headlands is an important contribution in this effort. GPS based open field navigation and

obstacle detection abilities enable the autonomous vehicle to navigate the area between grove

blocks. All of the experiments for this development were conducted in one grove. Further efforts

are needed in testing the vehicle in a variety of groves and tuning the algorithms to ensure

reliability of the autonomous vehicle guidance in all groves.

All the objectives of this research were successfully completed. The vehicle currently

possesses the ability to completely navigate grove blocks and travel from one grove block to

another.









Future Work

The development of the autonomous vehicle guidance for navigating citrus groves is

presented in this dissertation. Although all the objectives of this research were completed,

significant future work is required before the guidance system can be reliably used and

commercialized. This research on vehicle guidance encompassed a vast area and includes

numerous sub components for the operation of the overall guidance system. Each component of

the system invites significant further research. The choice of the vehicle in this research was

guided by the availability of the vehicle. Investigation is required to determine the suitability of a

vehicle for the application. The choice of control systems namely PID and Pure Pursuit were

based on previous research. Other control systems for navigating groves could be explored.

The machine vision algorithm presented in this work utilizes color based segmentation of

trees and path. The development of the algorithm was limited by the computer processing time

available when running multiple algorithms on a PC. Other machine vision methods such as

texture and shape of obj ects would make the segmentation more reliable. This choice is largely

guided by the processing power available and optimal programming techniques, which were not

the focus of this work.

Improved software utilization such as parallel processing could be investigated for this

purpose. Use of multiple processors is a suitable option for long term improvement of the

guidance system. Windows operating system used in this work runs numerous non-essential

operations in the background. This reduces the processing time available for real time operation

of the guidance system. A leaner real time operating system should be explored.

Obstacle avoidance ability was not pursued in this research. Within the grove alleyways,

the amount of space available for the vehicle to navigate is small. It is to be determined whether

obstacle avoidance is feasible within the alleyway. For navigating open fields, obstacle detection










capability was included. The system requires the obstacle to be moved from the path for the

vehicle to continue navigation. In autonomous operation, movement of obstacles such as animals

cannot be expected. Obstacle avoidance ability could be incorporated to navigate around the

obstacle. The avoidance algorithm may be required to account for moving obstacles such as

moving animals.

The variations in the planting of trees differ from grove to grove. The headlands also

differ. For example, one particular grove where the autonomous vehicle was tested consisted of

deep pits at the headlands for drainage. The headland turning algorithms developed in this work

were not capable of maneuvering the pits and the navigation failed. Therefore, all the varying

grove conditions need to be captured for improving the algorithms. Rigorous testing in several

groves is also required for ensuring reliability.









APPENDIX A
CODE SEGMENT FOR PID BASED STEERING CONTROL

The relevant sections of the code used in the software for the implementation of the PID

based steering controller are presented here.


PID based steering control for steering the tractor
Inputs: Error from the PC through serial communication
Outputs: Voltage to the electro-hydraulic valve
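
Before the full microcontroller listing, the control law it implements can be summarized in a short, self-contained sketch. This is an illustration only; the structure and helper names (Pid, pid_step) are hypothetical, while the gain values are those used in the listing below.

/* Minimal sketch of the discrete PID update implemented below.
   Pid and pid_step are hypothetical names; Kp = 1, Kd = 4 and
   Ki = 0.09 are the gains used in the actual listing. */
typedef struct { float Kp, Kd, Ki; float eo, einteg; } Pid;

float pid_step(Pid *p, float e)
{
    float de = e - p->eo;     /* change in error */
    p->eo = e;                /* remember previous error */
    p->einteg += e;           /* accumulate the integral term */
    return p->Kp*e + p->Kd*de + p->Ki*p->einteg;
}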


// e = error, eo = previous error, de = change in error
// gains declared float so that the 0.09 integral gain is not truncated
float Kp, Kd, Ki, Gain;
int e, eo, de, errorinteg;

void main(void)
{
    // Microcontroller 586E initialization
    sc_init();

    // Data initialization
    for(i=0; i<2001; i++)
    {
        angle_data[i] = 0;
        time_data[i] = 0;
    }

    // Decoder variables initialized
    clkt_sel(5);   // 36.864 MHz
    hp1 = 0; hp2 = 0;

    // Initialize serial communication
    c1 = &ser1_com;
    baud = 9;   /* 38400 baud for SER1 */
    isize = MAXISIZE;
    osize = MAXOSIZE;
    s1_init(baud, ser1_in_buf, isize, ser1_out_buf, osize, c1);
    delay_ms(200);

    // Analog output initialize
    // Center point of steering
    dac = 38000;

    // Useful voltage values output of the AD converter in the microcontroller
    // 0 = 0V, 10000 = 0.383V, 20000 = 0.764V
    // 30000 = 1.145V, 40000 = 1.526V, 50000 = 1.907V
    // 60000 = 2.288V, 65520 = 2.499V

    // Voltage output of amplifier
    // tern 0 = 3.46V = 0, tern 1.25 = 6.54V = 37620, tern 2.5 = 9.66V = 65520

    // Send values out to the ports
    outport(0x18E0, dac);
    outport(0x18E2, dac);
    outport(0x18E4, dac);
    outport(0x18E6, dac);

    // Decoder operation initialized
    hp2 = inportb(HP2+2)+(inportb(HP2)<<8);   // initial decoder output value
    position_center = hp2;                    // saving the value as center
    actual_position_current = 0;
    actual_position_previous = 0;
    position_current = hp2;

    // Initialize PID variables
    e = 0;
    eo = 0;
    de = 0;
    errorinteg = 0;
    Gain = 0;

    // Start the control loop
    while(1)
    {
        j = 0; j1 = 0; disp = 0; ptm = 0; Gain = 0;

        hp2 = inportb(HP2+2)+(inportb(HP2)<<8);   // decoder output

        // Limits of decoder
        if(hp2>=200 || hp2<=-200)
        {
            hp2 = hp2prev;
        }
        hp2prev = hp2;

        // Serial data received from PC
        if(serhit1(c1))
        {
            m = getser1(c1);   // get the start-of-data character

            if(m==112)         // 'p' (112) marks steering data sent by the PC
            {
                if(serhit1(c1))
                    j = getser1(c1);    // get the 1st character
                if(serhit1(c1))
                    j1 = getser1(c1);   // get the 2nd character
                if(serhit1(c1))
                    j2 = getser1(c1);   // get the 3rd character
                if(serhit1(c1))
                    j3 = getser1(c1);   // get the 4th character
            }

            // Reforming the data from the characters received
            if(j!=45 && m==112)   // right turn steering ('-' is 45)
            {
                if(j2!=0)
                    disp = (j-48)*100+(j1-48)*10+(j2-48);
                else if(j1!=0)
                    disp = (j-48)*10+(j1-48);
                else
                    disp = j-48;
            }
            else if(m==112)       // left turn steering
            {
                if(j3!=0)
                    disp = -((j1-48)*100+(j2-48)*10+(j3-48));
                else if(j2!=0)
                    disp = -((j1-48)*10+(j2-48));
                else
                    disp = -(j1-48);
            }
        }

        delay_ms(30);   // 30 ms delay

        // PID control
        if(disp!=0)     // non-zero error 'disp'
        {
            // if disp is +, move right; if -, move left
            e = disp;
            Kp = 1;      // proportional gain
            Kd = 4;      // derivative gain
            Ki = 0.09;   // integral gain
            de = e-eo;   // change in error
            eo = e;      // update previous error

            errorinteg = errorinteg+e;   // integral error

            Gain = Kp*e+Kd*de+Ki*errorinteg;   // PID gain

            ptm = 0.5*Gain*140/20;   // angle to pulse conversion: 20 degrees = 140 pulses
            ptm = ptm+hp2;           // change in steering angle

            // Limits for steering
            if(ptm>180)
                ptm = 180;
            else if(ptm<-180)
                ptm = -180;

            // Angle to voltage conversion
            if(ptm>0 && ptm<50)
            {
                dac = 45000+100*ptm;
                if(dac>50000)
                    dac = 50000;
            }
            else if(ptm>50)
                dac = 50000+30*ptm;
            else if(ptm<0 && ptm>-50)
            {
                dac = 29000+100*ptm;
                if(dac<24000)
                    dac = 24000;
            }
            else if(ptm<-50)
                dac = 24000+30*ptm;   // output voltage to valve
            else
                dac = 38000;
        }
        // if no steering is required
        else if(ptm==0)
            dac = 38000;   // straighten steering

        // Voltage limiter ----------------------------------
        // Voltage is limited to maximum values for safe operation
        if(dac>55000)
        { dac = 55000; }
        else if(dac<19000)
        { dac = 19000; }

        // Output voltage to Tern ----------
        outport(0x18E0, dac);
        outport(0x18E2, dac);
        outport(0x18E4, dac);
        outport(0x18E6, dac);
    }

    // Reset steering to center before quitting
    dac = 38000;
    outport(0x18E0, dac);
    outport(0x18E2, dac);
    outport(0x18E4, dac);
    outport(0x18E6, dac);
}









APPENDIX B
CODE SEGMENT FOR VISION BASED PATH SEGMENTATION

/* Code segment for vision based path segmentation
Function: Imagproc()
Input: Pointer to image data in memory
Output: Lateral error of vehicle in the path
Calling Function: OnPaint()
Variables:
ProcDib, ProcDdb: Memory pointers for image data
PW: Window size
*/
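
The segmentation applies two thresholds: a low cutoff of 45 that is assumed to hold when shadows darken the path, and a higher cutoff of 80 that is applied when the shadow pixel counts remain small. A minimal sketch of that decision follows; classify_pixels is a hypothetical helper operating on a single 8-bit channel, and the caller is assumed to zero-initialize the mask.

/* Sketch of the two-pass adaptive threshold used in the listing below.
   classify_pixels is a hypothetical name; 45, 80 and 7000 are the
   constants from the listing. mask is assumed zero-initialized. */
int classify_pixels(const unsigned char *buf, int n, unsigned char *mask)
{
    int dark = 0;
    for(int i = 0; i < n; i++)         /* pass 1: shadow threshold */
        if(buf[i] <= 45) { mask[i] = 1; dark++; }
    if(dark < 7000)                    /* few shadow pixels: re-threshold */
        for(int i = 0; i < n; i++)
            if(buf[i] <= 80) mask[i] = 1;
    return dark;                       /* number of shadow-level pixels */
}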



// Thresholding assuming presence of shadows
for(int i=1; i<=240; i++)
    for(int j=1; j<=640; j++)
        if(*(ProcDib+41+i*1920+j*3)<=45)
        {
            *(ProcDib+41+i*1920+j*3)   = 150;   // G
            *(ProcDib+41+i*1920+j*3+1) = 0;     // R

            if(j<320)
                left_count = left_count+1;
            else
                right_count = right_count+1;
        }

// If non-shadow image, new thresholding
if(left_count<7000 && right_count<7000)
{
    for(int i=1; i<=240; i++)
        for(int j=1; j<=640; j++)
            if(*(ProcDib+41+i*1920+j*3)<=80)
            {
                *(ProcDib+41+i*1920+j*3)   = 150;   // G
                *(ProcDib+41+i*1920+j*3+1) = 0;     // R
                *(ProcDib+41+i*1920+j*3+2) = 0;     // B
            }
}

// Moving a 20x20 window for painting regions
for(int i=1; i<240; i+=20)
    for(int j=1; j<640; j+=20)
    {
        int countg = 0;

        for(int k=0; k<20; k++)
            for(int l=0; l<20; l++)
                if(*(ProcDib+41+(i+k)*1920+(j+l)*3)==150 &&
                   *(ProcDib+41+(i+k)*1920+(j+l)*3+1)==0 &&
                   *(ProcDib+41+(i+k)*1920+(j+l)*3+2)==0)
                    countg = countg+1;

        if(countg>40)
        {
            for(int k=0; k<20; k++)
                for(int l=0; l<20; l++)
                {
                    *(ProcDib+41+(i+k)*1920+(j+l)*3)   = 150;
                    *(ProcDib+41+(i+k)*1920+(j+l)*3+1) = 0;
                    *(ProcDib+41+(i+k)*1920+(j+l)*3+2) = 0;
                }
        }
    }

// Segmentation
// Painting regions to remove salt and pepper noise;
// also usable for determining an image with a lot of shadow

#define PW 15   // window size 15

for(int h=MyHeight; h>=0; h-=PW)
{
    for(int w=0; w<MyWidth; w+=PW)
    {
        int countg = 0;
        for(int k=0; k<PW; k++)
            for(int l=0; l<PW; l++)
            {
                u = (((h+k)*MyWidth)+(w+l))*3;
                if(*(ProcDdb+u+blu)==150 && *(ProcDdb+u+grn)==0 &&
                   *(ProcDdb+u+red)==0)
                    countg++;
            }

        if(countg>50)
        {
            // window dominated by segmented (tree) pixels: paint it black
            for(int k=0; k<PW; k++)
                for(int l=0; l<PW; l++)
                {
                    u = (((h+k)*MyWidth)+(w+l))*3;
                    *(ProcDdb+u+blu) = 0;
                    *(ProcDdb+u+grn) = 0;
                    *(ProcDdb+u+red) = 0;
                }
        }
        else
        {
            // otherwise paint the window white (path)
            for(int k=0; k<PW; k++)
                for(int l=0; l<PW; l++)
                {
                    u = (((h+k)*MyWidth)+(w+l))*3;
                    *(ProcDdb+u+blu) = 255;
                    *(ProcDdb+u+grn) = 255;
                    *(ProcDdb+u+red) = 255;
                }
        }   // else countg
    }   // width
}   // height

//---- Left side wall ---------------------------------
// find the left wall
for(int h=MyHeight/2-100; h<=MyHeight-100; h++)
{
    fitter[left][h] = 0;   // clear left fitting
    start = 0; finish = 0;

    for(int j=MyWidth/2; j>=0; j--)   // middle to left end
    {
        u = ((h*MyWidth)+j)*3;

        // if you see white
        if(start==0 && *(ProcDdb+u+blu)==255)
            start = j;

        // if you see black after seeing white
        if(start!=0 && *(ProcDdb+u+blu)==0 && finish==0)
        {
            finish = j+1;
            fitter[left][h] = finish;
            if(left_startx==-1)
            { left_startx = finish; left_starty = h; }
            if(left_endx!=-1)
            { left_endx = -1; }
        }
    }

    if(left_starty!=-1 && fitter[left][h]==0 && left_endx==-1)
    { left_endx = fitter[left][h-1]; left_endy = h-1; }
}

//---- Right side wall ---------------------------------
for(int h=MyHeight/2-100; h<=MyHeight-100; h++)
{
    fitter[right][h] = MyWidth;   // clear right fitting
    start = 0; finish = MyWidth;
    for(int j=MyWidth/2; j<=MyWidth; j++)
    {
        u = ((h*MyWidth)+j)*3;
        if(start==0 && *(ProcDdb+u+blu)==255)
            start = j;

        if(start!=0 && *(ProcDdb+u+blu)==0 && finish==MyWidth)
        {
            finish = j;
            fitter[right][h] = finish;
            if(right_startx==-1)
            { right_startx = finish; right_starty = h; }
            if(right_endx!=-1)
            { right_endx = -1; }
        }
    }

    if(right_starty!=-1 && fitter[right][h]==MyWidth && right_endx==-1)
    { right_endx = fitter[right][h-1]; right_endy = h-1; }
}

// left turn, especially for hay bales when one side is not
// visible during hard turns
if(left_endy!=-1)
{
    if(right_startx-left_endx<170)
    {
        turnleft = true;
        for(int h=MyHeight/2-50; h<=left_endy; h++)
        {
            fitter[right][h] = fitter[left][h]-15;
            fitter[left][h] = 0;
        }
    }
}
// right turn, for hay bales when one side is not visible during hard turns
else if(right_endy!=-1)
{
    if(right_endx-left_startx<170)
    {
        turnright = true;
        for(int h=MyHeight/2-50; h<=right_endy; h++)
        {
            fitter[left][h] = fitter[right][h];
            fitter[right][h] = MyWidth;
        }
    }
}

// bottom few rows of the image where the boundaries are not visible
for(int i=MyHeight/2-100; i<=MyHeight-100; i++)
{
    if(fitter[left][i]==0 && fitter[left][i-1]!=0)
    { fitter[left][i] = fitter[left][i-1]; }
    if(fitter[right][i]==400 && fitter[right][i-1]!=400)
    { fitter[right][i] = fitter[right][i-1]; }
}

// goto skipper;

GetBoundary(left);    // line fitting, left boundary
GetBoundary(right);   // line fitting, right boundary

for(int h=MyHeight/2-100; h<=MyHeight-100; h++)
{
    x[left] = al[left]+bl[left]*h;
    wall[left][h] = x[left]-50;

    if(x[left]>0 && x[left]<400)
    {
        if(brick[left][0]==-1)
            brick[left][0] = wall[left][h];
        else
            brick[left][1] = wall[left][h];
    }

    x[right] = al[right]+bl[right]*h;
    wall[right][h] = x[right]+60;

    if(x[right]>0 && x[right]<MyWidth)
    {
        if(brick[right][0]==-1)
            brick[right][0] = wall[right][h];
        else
            brick[right][1] = wall[right][h];
    }

    if(wall[left][h]>=0 && wall[right][h]<=MyWidth)
    {
        fitter[middle][h] = (wall[right][h]-wall[left][h])/2;
        // half the distance between walls

        fitter[middle][h] = wall[left][h]+fitter[middle][h]+40;
        // centered between left & right walls

        if(wall[left][h]==0 && wall[right][h]==MyWidth)
        { fitter[middle][h] = fitter[middle][h-1]; }
    }
    // if only one side wall is visible, travel at 4 ft = 108 pix from the visible wall
    else
    {
        if(wall[left][h]>0 && wall[right][h]>=MyWidth)
            fitter[middle][h] = wall[left][h]+208;
        else if(wall[left][h]<=0 && wall[right][h]<MyWidth)
            fitter[middle][h] = wall[right][h]-208;
    }
}

GetBoundary(middle);   // line fitting, center

for(int h=MyHeight/2-100; h<=MyHeight-100; h++)
{
    x[middle] = al[middle]+bl[middle]*h;

    if(x[middle]>0)
    {
        if(brick[middle][0]==-1)
            brick[middle][0] = x[middle];
        else
            brick[middle][1] = x[middle];
    }
}

// Line for true center -------------
int w = 0;
w = (brick[middle][0]+brick[middle][1])/2;
ppp = w;

for(int h=(MyHeight-100-20); h<=(MyHeight-100+20); h++)
{
    // used for error calculation
    u = ((h*MyWidth)+w)*3;
    *(ProcDdb+u+blu) = 200;   // blue
    *(ProcDdb+u+grn) = 200;   // green
    *(ProcDdb+u+red) = 0;     // red
}

skipper:
return 0;


/* Function: GetBoundary
Calling function: Imagproc()
Input: Array of numbers
Output: Line fitting for the numbers
*/

void GetBoundary(linpos lor)
{
    float sqxl, sxl, pxyl, scl, nral, nrbl, dr;
    int noxl;

    sqxl = 0; sxl = 0; pxyl = 0; noxl = 0; scl = 0;

    for(int i=MyHeight/2-100; i<=MyHeight-100; i++)
    {
        if(fitter[lor][i]!=0 && fitter[lor][i]!=MyWidth)
        {
            noxl++;
            sqxl = sqxl+i*i;
            sxl += i;
            pxyl = pxyl+i*fitter[lor][i];
        }
    }

    for(int i=MyHeight/2-100; i<=MyHeight-100; i++)
        scl = scl+fitter[lor][i];

    nral = scl*sqxl-sxl*pxyl;
    dr   = noxl*sqxl-sxl*sxl;
    nrbl = noxl*pxyl-sxl*scl;
    al[lor] = (float)nral/dr;
    bl[lor] = (float)nrbl/dr;
    brick[lor][0] = -1;
    brick[lor][1] = 0;
}
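
GetBoundary() solves the standard least-squares normal equations for a line x = a_l + b_l h fitted to the detected boundary points. With h denoting the image row, x = fitter[lor][h], and n the number of valid rows, the coefficients computed above as nral/dr and nrbl/dr correspond to

\[
a_l = \frac{\sum x \sum h^2 - \sum h \sum hx}{n\sum h^2 - \left(\sum h\right)^2},
\qquad
b_l = \frac{n\sum hx - \sum h \sum x}{n\sum h^2 - \left(\sum h\right)^2}.
\]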









APPENDIX C
CODE SEGMENT FOR LADAR BASED PATH NAVIGATION

/* Function: LadarGuidance()
Calling function: OnLadarData()
Input: Pointer to raw data location
Output: Lateral error of vehicle in the path
*/
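
Each ladar sample i in this listing covers half a degree, so the scan angle is theta_i = i/2 degrees. The loop below converts the range readings d_i = SIC.MVA[i] into a lateral coordinate and a frontal coordinate:

\[
x_i = -\,d_i\cos\theta_i, \qquad y_i = \left|\,d_i\sin\theta_i\right|, \qquad \theta_i = \frac{i}{2}\cdot\frac{\pi}{180}.
\]

The frontal distances are then compared against the straight-ahead ground reading at sample 180 to decide whether a return belongs to a tree wall.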


void LadarGuidance()
{
    ladarflag = true;

    // Find horizontal and vertical distance of environment
    // cosine gives the lateral distance
    // sine gives the frontal distance
    for(int i=1; i<SIC.smy; i++)
    {
        angle = (float)i/2;   // angle in degrees

        // horizontal distance coordinate
        x_dist[i] = -SIC.MVA[i]*cos(angle*3.1428/180);

        // frontal distance coordinate
        y_dist[i] = fabs(SIC.MVA[i]*sin(angle*3.1428/180));
    }

    // initialize ground distance to ladar
    if(grnd==0)
    {
        grnd = y_dist[180];
    }

    tree_left_sighted = false;   // tree has not been sighted
    left_wall_position = 180;

    // Scan for left wall
    // It is assumed that the vehicle is positioned in the center of the path
    // The center position is assumed to point to the ground
    // If any frontal distance blob is closer than the frontal ground distance by 50, it is a tree

    // left end to center
    for(int i=180; i>0+10; i--)
    {
        if(y_dist[i]<grnd-50)
        {
            // infer found tree
            tree_left_sighted = true;

            // Check if the inference is true:
            // check if 5 successive points agree with the inference
            for(int k=1; k<=5; k++)
            {
                if(y_dist[i-k]>grnd-50)
                {
                    tree_left_sighted = false;   // disagree
                    break;
                }
            }

            // If tree found, stop checking left
            if(tree_left_sighted==true)   // decided
            {
                left_wall_position = i;
                break;
            }
        }
    }

    // Scan for right wall
    grnd = y_dist[180]; tree_right_sighted = false; right_wall_position = 180;

    for(int i=180; i<361-10; i++)
    {
        if(y_dist[i]<grnd-20)
        {
            // seems to have found tree
            tree_right_sighted = true;

            // Check if the inference is true
            for(int k=1; k<=5; k++)
            {
                if(y_dist[i+k]>grnd-20)
                {
                    tree_right_sighted = false;   // disagree
                    break;
                }
            }

            // If tree found, stop checking right
            if(tree_right_sighted==true)   // decided
            {
                right_wall_position = i;
                break;
            }
        }
    }

    // Distance from the two walls
    // note: the right wall is the actual left wall, since the ladar scan is reversed
    angle = (float)left_wall_position/2;
    distance_from_right_wall = fabs(SIC.MVA[left_wall_position]*cos(float(angle)*3.1428/180));

    angle = (float)right_wall_position/2;
    distance_from_left_wall = fabs(SIC.MVA[right_wall_position]*cos(float(angle)*3.1428/180));

    // Lateral error finding
    lateral_error = distance_from_left_wall-distance_from_right_wall;
    ppl = 200-0.74*lateral_error;

    // 74.07 = 180 pix / 2.43 m to convert to pixels for pure pursuit
    // error will be negative if a right turn is to be made and vice versa

    // When there are no trees on both sides or on one side
    if(distance_from_left_wall>300 && distance_from_right_wall>300)
    { ppl = 0; }   // no tree on either side
    else if(distance_from_left_wall<300 && distance_from_right_wall>300)
    {   // no tree on right side
        lateral_error = distance_from_left_wall-225; ppl = 200-0.74*lateral_error;
    }
    else if(distance_from_left_wall>300 && distance_from_right_wall<300)
    {   // no tree on left side
        lateral_error = 225-distance_from_right_wall; ppl = 200-0.74*lateral_error;
    }
    else if(distance_from_left_wall<80 && distance_from_right_wall<80)
    {
        ErrorL = lateral_error/2;
    }
}










APPENDIX D
CODE SEGMENT FOR SENSOR FUSION BASED ALGORITHMS


/* Code segment for sensor fusion
Function: Fusion()
Calling function: OnPaint(), if both vision and ladar are operational
Input: Measurement values
Output: Lateral error of vehicle in pixel coordinates, ErrorF
Variables:
ErrorL: Error from ladar
ErrorI: Error from vision
theta_imu: Change in heading
yaw: Heading angle measurement
mZ: Measurement vector Z
mR: Measurement noise covariance matrix R
mA: State transition matrix A
T: Time step
mQ: Process noise covariance matrix Q
mX: State vector X
mH: Observation matrix H
mtemp: Temporary matrix for storing data
mK: Kalman gain K
mP: State estimate error covariance matrix P
*/
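
The function follows the standard discrete Kalman filter cycle that the inline comments abbreviate. Written out, the time update and measurement update are

\[
\hat{X}^{-} = A X, \qquad P^{-} = A P A^{T} + Q,
\]
\[
K = P^{-}H^{T}\left(H P^{-} H^{T} + R\right)^{-1}, \qquad
X = \hat{X}^{-} + K\left(Z - H\hat{X}^{-}\right), \qquad
P = \left(I - K H\right)P^{-},
\]

where the measurement noise covariance R is re-weighted on every cycle by the fuzzy logic reliability factors f_vision and f_ladar.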


void Fusion()
{
    // Initialization of heading angle
    if(ErrorL==0 && ErrorI==0 && theta_imu_init==0)
    {
        theta_imu_init = 0;
    }

    if(theta_imu_init==0)
    {
        theta_imu_init = yaw;   // initial yaw angle = initial vehicle angle
    }

    // change in heading
    theta_imu = yaw-theta_imu_init;   // actual yaw w.r.t. the initial angle

    // Account for negative heading angles
    if(theta_imu<0)
    { theta_imu = 90+theta_imu; }
    else
    { theta_imu = 90-theta_imu; }

    // Z array: measurement vector inputs
    mZ.m_Value[0][0] = ErrorI;
    mZ.m_Value[1][0] = ErrorL;
    mZ.m_Value[2][0] = atan2(float(heading2-heading1), 60);
    mZ.m_Value[3][0] = theta_imu*pi/180;
    mZ.m_Value[4][0] = speed;

    if(mZ.m_Value[1][0]==0)
    {
        mR.m_Value[0][0] = 0.05;
    }

    // Reliability factor of each sensor from the fuzzy logic implementation
    mR.m_Value[0][0] = f_vision;
    mR.m_Value[1][1] = f_ladar;

    if(abs(mZ.m_Value[1][0])>50 && abs(mZ.m_Value[0][0])<50)
    {
        mR.m_Value[1][1] = 55;     // high ladar measurement noise
    }
    else   // otherwise
    {
        mR.m_Value[1][1] = 0.15;   // low ladar measurement noise
    }

    mA.m_Value[0][2] = T*sin(mZ.m_Value[3][0]);

    // Time update equations
    // Xe = A*X;
    mXe = mA*mX;

    // Pe = A*P*A' + Q;
    mt.MatTranspose(mA);   // mt = A'
    mtemp = mA*mP;
    mtemp = mtemp*mt;
    mPe = mtemp+mQ;

    // Measurement update equations
    // K = Pe*H' * pinv(H*Pe*H' + R);
    mtemp1 = mH*mPe;

    // H transpose
    for(int r=0; r<3; r++)
        for(int c=0; c<5; c++)
            mt.m_Value[r][c] = mH.m_Value[c][r];

    // mtemp2 = mtemp1*mt  (asymmetric matrix multiplication)
    for(int r=0; r<5; r++)
        for(int c=0; c<5; c++)
        {
            mtemp2.m_Value[r][c] = 0;
            for(int i=0; i<3; i++)
                mtemp2.m_Value[r][c] += mtemp1.m_Value[r][i]*mt.m_Value[i][c];
        }

    mtemp2 = mtemp2+mR;
    mtemp2 = mtemp2^-1;   // matrix inverse

    // mtemp3 = mPe*mt;
    for(int r=0; r<3; r++)
        for(int c=0; c<5; c++)
        {
            mtemp3.m_Value[r][c] = 0;
            for(int i=0; i<3; i++)
                mtemp3.m_Value[r][c] += mPe.m_Value[r][i]*mt.m_Value[i][c];
        }

    mK = mtemp3*mtemp2;

    // X = Xe + K*(Z - H*Xe);
    mtemp4 = mH*mXe;
    mtemp4 = mZ-mtemp4;
    mtemp5 = mK*mtemp4;
    mX = mXe+mtemp5;

    // P = (I - K*H)*Pe;
    mtemp = mK*mH;
    mtemp = mI-mtemp;
    mP = mtemp*mPe;

    ErrorF = -mX.m_Value[0][0];   // new fusion based error

    // Innovation vector: Zinnov = Z - H*X
    mtemp6 = mH*mX;
    mZinnov = mZ-mtemp6;

    // Update process noise
    mQ.m_Value[0][0] = f_x;
    mQ.m_Value[2][2] = f_theta;
}









APPENDIX E
CODE SEGMENT FOR PURE PURSUIT STEERING

/* Code segment for Pure Pursuit steering
Input: Lateral error of vehicle in pixels
Output: Steering angle to the CAN controller
*/
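
The radius computation in this listing is the standard pure pursuit geometry. For a look-ahead distance d and a lateral offset x to the goal point, the circular arc through the vehicle and the goal point has radius

\[
R = \frac{x^{2} + d^{2}}{2x} = \frac{x}{2} + \frac{d^{2}}{2x},
\]

so the constant 0.186 in r = (x/2) + (0.186/x) plays the role of d^2/2, which would correspond to a look-ahead distance of roughly 0.61 m. That value is an inference from the code, not a figure stated in the listing.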


// Right turn command
// sthi: lateral error in pixel coordinates
if(sthi>200 && sthi!=402 && sthi!=404)
{
    // Amount of turn
    x_pix = sthi-200;

    // Pixel to distance coordinates in m
    x = 0.0135*x_pix;

    // Compute radius of curvature
    r = (x/2)+(0.186/x);

    // Compute steering angle
    steer = (int)(-3.27*r+44.38);

    if(steer<=0)
        steer = 0;

    steer = -steer;
}
// Left turn command
else if(sthi!=402 && sthi!=404)
{
    x_pix = 200-sthi;
    x = 0.0135*x_pix;   // 0.0135 = 2.43/180
    r = (x/2)+(0.186/x);
    steer = (int)(-5.8*r+52);

    if(steer<=0)
        steer = 0;
}









APPENDIX F
CODE SEGMENT FOR HEADLAND TURNING

/* Code segment for headland turning
Functions: treeladarprocess(), HeadlandU(), Headland3pt()
Output: Turning information in pixel error
*/
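
Much of the branching in HeadlandU() and Headland3pt() below handles wrap-around of the 0 to 360 degree yaw reading; the correction1 and correction2 terms pre-compute wrapped target angles for the extreme cases. A minimal sketch of the underlying idea, using the hypothetical helper name yaw_error:

/* Sketch only: signed smallest difference between two headings,
   in degrees, mapped into (-180, 180]. The listings below obtain
   the same effect through pre-computed correction angles. */
float yaw_error(float target, float yaw)
{
    float d = target - yaw;
    while(d > 180.0f)   d -= 360.0f;
    while(d <= -180.0f) d += 360.0f;
    return d;   /* signed heading error */
}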


// Sweeping ladar to determine end of alleyway
// Input: ladar data
// Output: Distance to headland
void treeladarprocess()
{
    // conversion from 3D polar coordinates to 3D cartesian coordinates
    int k = 1;
    for(int i=0; i<SIC.smy; i++)
        for(int j=1; j<=ls; j++)
        {
            las_y[k] = ladarsweep[j][i]*sin(i*0.00875)*cos((j-23)*0.0175);
            if(las_y[k]<0)
            {
                las_y[k] = -las_y[k];   // absolute value
            }

            las_x[k] = ladarsweep[j][i]*cos(i*0.00875)*cos((j-23)*0.0175);

            if((i/2)<90)
            {
                las_z[k] = ladarsweep[j][i]*sin(i*0.00875)*sin((j-23)*0.0175);
            }
            else
            {
                las_z[k] = ladarsweep[j][i]*sin(3.15-i*0.00875)*sin((j-23)*0.0175);
            }

            // cropping for relevant data
            if(las_z[k]>200)
            { las_x[k]=0; las_y[k]=0; las_z[k]=0; }
            else if(las_y[k]>3000)
            { las_x[k]=0; las_y[k]=0; las_z[k]=0; }
            else if(las_z[k]<-150)
            { las_x[k]=0; las_y[k]=0; las_z[k]=0; }
            else if(las_x[k]>500)
            { las_x[k]=0; las_y[k]=0; las_z[k]=0; }
            else if(las_x[k]<-500)
            { las_x[k]=0; las_y[k]=0; las_z[k]=0; }

            k = k+1;
        }

    // 3D cartesian coordinates to 2D cartesian grid
    int a, b;
    int (*location)[3005];
    location = new int[1005][3005];

    for(int i=1; i<=1000; i++)
        for(int j=1; j<=3000; j++)
            location[i][j] = 0;

    for(int i=1; i<k; i++)
        if(las_x[i]>-499 && las_y[i]>=1)
        {
            a = las_x[i]+500;
            b = las_y[i];
            location[a][b] = location[a][b]+1;
        }

    // finding location of trees
    int m = 0; int count;
    int* meanlocationx = new int[3000];
    int* meanlocationy = new int[3000];
    for(int i=1; i<=2950; i+=20)
        for(int j=1; j<=950; j+=20)
        {
            count = 0;
            for(int k=1; k<=50; k++)
                for(int l=1; l<=50; l++)
                    if(location[j+l][i+k]>0)
                    { count = count+location[j+l][i+k]; }
            if(count>5)
            {
                m = m+1;
                meanlocationx[m] = j;
                meanlocationy[m] = i;
            }
        }

    // finding path for vehicle
    int i = 1; k = 0;
    int leftmax;
    int rightmax;
    for(int j=250; j<=3000; j+=250)
    {
        if(i>1)
        {
            if((meanlocationy[i]-meanlocationy[i-1])>=250)
            {
                turn = meanlocationy[i-1]+51;
                break;
            }
        }
        leftmax = 500; rightmax = 500;
        while(meanlocationy[i]+50<=j)
        {
            if(meanlocationx[i]+50<500)   // left side tree
            {
                if(leftmax<500 && leftmax<meanlocationx[i]+50)
                { leftmax = meanlocationx[i]+50; }
                else if(leftmax==500)
                { leftmax = meanlocationx[i]+50; }
            }
            else if(rightmax>500 && rightmax>meanlocationx[i]
                    && meanlocationx[i]>500)   // right side tree
            { rightmax = meanlocationx[i]; }
            else if(rightmax==500 && meanlocationx[i]>500)
            { rightmax = meanlocationx[i]; }

            if(turn==0 && i<m)
            { i++; }
            else
            { break; }
        }

        if(leftmax==500)
        { leftmax = 0; }
        if(rightmax==500)
        { rightmax = 1000; }
        if(k>0)
        { break; }

        if(leftmax==0 && rightmax==1000)
        {
            k = k+1;
            if(j==250)
            { turn = j-245; }
        }
        else if(i==m)
        {
            k = k+1;
            turn = meanlocationy[m]+55;
            break;
        }
    }

    // turn
    if(turn==0)
    { turn = 1; }

    // delete large arrays
    delete[] location;
    delete[] meanlocationx;
    delete[] meanlocationy;
}
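
The trigonometric products at the start of treeladarprocess() are a spherical-to-Cartesian conversion. With in-scan angle theta = 0.00875 i rad (half-degree steps), sweep tilt phi = 0.0175 (j - 23) rad, and range d = ladarsweep[j][i], the listing computes

\[
x = d\cos\theta\cos\varphi, \qquad
y = \left|\,d\sin\theta\cos\varphi\right|, \qquad
z = d\sin\theta\sin\varphi,
\]

with the theta term mirrored (3.15 - theta) once the scan passes 90 degrees. Points outside the alleyway-sized bounding box are then zeroed before the grid accumulation.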


// U-turn at headland
// Input: yaw angle
// Output: steering angle in pixel coordinates
void HeadlandU()
{
    // headland algorithm is taking over from vision
    if(MchnVsnAct)
    { MchnVsnAct = false; }

    if(headlandangle==400)
    {
        OutputDebugString("headland\n");
        headlandangle = yaw;
        correction1 = 0;
        correction2 = 0;
        flag1 = false;
        flag2 = false;
        delayflag = false;

        // correcting headlandangle for anomalies, since the yaw sensor
        // range of 0 to 360 causes addition and subtraction problems
        // left turn
        if(headlandmode==1 && headlandangle>=335)
        { correction1 = (headlandangle+25)-360; }
        else if(headlandmode==1 && headlandangle<200)
        { correction2 = 360+(headlandangle-200); }
        // right turns
        else if(headlandmode==2 && headlandangle<15)
        { correction1 = 360-(15-headlandangle); }
        else if(headlandmode==2 && headlandangle>=200)   // 200+160=360
        { correction2 = (200+headlandangle)-360; }
    }

    if(headlandmode==1)   // left turn
    {
        estr6h[0] = 'k';   // start of transmission data
        Gator.Write((BYTE *)estr6h,1);

        // right turn --------------------------------------------
        // turn right till yaw passes headlandangle+20
        if(correction1!=0 && flag1==false)   // extremes
        {
            OutputDebugString("cor right\n");
            if(yaw>340 || yaw<correction1)
            {
                if(delayflag==true)
                { Sleep(500); delayflag = false; }
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 201;
                Gator.Write((BYTE *)estr7h,1);
            }
        }
        else if(yaw<(headlandangle+20) && flag1==false)   // regular
        {
            if(delayflag==true)
            { Sleep(500); delayflag = false; }
            OutputDebugString("reg right\n");
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h,1);
            estr7h[0] = 201;
            Gator.Write((BYTE *)estr7h,1);
        }

        // left turn -----------------------------------------
        // full turn left until headlandangle-180
        if(correction2!=0)   // extremes
        {
            if(yaw>(headlandangle+20) || yaw>correction2-10)
            {
                OutputDebugString("cor left\n");
                flag1 = true;
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 202;
                Gator.Write((BYTE *)estr7h,1);
            }
            else   // U-turn completed
            {      // get out to alleyway mode
                OutputDebugString("cor exit");
                pathend = false;
                headlandflag = false;
                estr7h[0] = 100;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
            }
        }
        else   // regular
        {
            if(yaw>(headlandangle+20) && flag1==false)
            {
                OutputDebugString("l\n");
                flag1 = true;
            }
            else if(flag1==true && yaw>(headlandangle-150))
            {
                OutputDebugString("reg left\n");
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 202;
                Gator.Write((BYTE *)estr7h,1);
            }
            else if(yaw<(headlandangle-150))   // U-turn completed
            {      // get out to alleyway mode
                OutputDebugString("reg exit");
                headlandflag = false;
                pathend = false;
                estr7h[0] = 100;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                flag1 = false;
            }
        }

        estr6h[0] = 'm';   // end of transmission data
        Gator.Write((BYTE *)estr6h,1);
    }
    else if(headlandmode==2)   // right turn
    {
        sprintf(texty, "head:%4.2f\n", headlandangle);
        OutputDebugString(texty);
        estr6h[0] = 'k';   // start of transmission data
        Gator.Write((BYTE *)estr6h,1);

        // left turn --------------------------------------------
        // turn left till yaw passes headlandangle-15
        if(correction1!=0 && flag1==false)   // extremes
        {
            OutputDebugString("cor left\n");
            if(yaw>correction1 || yaw<15)
            {
                if(delayflag==true)
                { Sleep(500); delayflag = false; }
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 202;
                Gator.Write((BYTE *)estr7h,1);
            }
        }
        else if(yaw>(headlandangle-15) && flag1==false)   // regular
        {
            if(delayflag==true)
            { Sleep(500); delayflag = false; }
            OutputDebugString("reg left\n");
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h,1);
            estr7h[0] = 202;
            Gator.Write((BYTE *)estr7h,1);
        }

        // right turn -----------------------------------------
        // full turn right until headlandangle+200
        if(correction2!=0)   // extremes
        {
            if(yaw<(headlandangle-15) && flag1==false)
            { flag1 = true; }
            else if(flag1==true && (yaw>=180 || yaw<(correction2-10)))
            {
                OutputDebugString("cor rit\n");
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 201;
                Gator.Write((BYTE *)estr7h,1);
            }
            else if(flag1==true)   // U-turn completed
            {      // get out to alleyway mode
                OutputDebugString("cor exit");
                headlandflag = false;
                pathend = false;
                estr7h[0] = 100;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
            }
        }
        else   // regular
        {
            if(yaw<(headlandangle-15) && flag1==false)
            {
                flag1 = true;
            }
            else if(flag1==true && yaw<(headlandangle+165))
            {
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 201;
                Gator.Write((BYTE *)estr7h,1);
            }
            else if(yaw>(headlandangle+165))   // U-turn completed
            {      // get out to alleyway mode
                headlandflag = false;
                pathend = false;
                estr7h[0] = 100;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                flag1 = false;
            }
        }

        estr6h[0] = 'm';   // end of transmission data
        Gator.Write((BYTE *)estr6h,1);
    }
}

void Headland3pt()
{
    // headland algorithm is taking over from vision
    if(MchnVsnAct)
    { MchnVsnAct = false; }

    if(flag1==true && flag2==true && delayflag==true)
    { Sleep(4000); delayflag = false; }

    if(headlandangle==400)
    {
        headlandangle = yaw;
        correction1 = 0;
        correction2 = 0;
        flag1 = false;
        flag2 = false;
        delayflag = true;
        delayflag1 = false;

        // correcting headlandangle for anomalies, since the yaw sensor
        // range of 0 to 360 causes addition and subtraction problems
        // left turn
        if(headlandmode==1 && headlandangle<=90)         // left
        { correction1 = 360-(90-headlandangle); }
        // right turns
        else if(headlandmode==2 && headlandangle>=270)   // right
        { correction1 = (headlandangle+90)-360; }

        if(headlandmode==1 && headlandangle<=200)        // left
        { correction2 = 360+(headlandangle-200); }
        else if(headlandmode==2 && headlandangle>=160)   // right
        { correction2 = (headlandangle+200)-360; }
    }

    if(headlandmode==1)   // left turn
    {
        estr6h[0] = 'k';   // start of transmission data
        Gator.Write((BYTE *)estr6h,1);

        // left turn --------------------------------------------
        // turn left till yaw passes headlandangle-90
        if(correction1!=0 && flag1==false)   // extremes
        {
            if(delayflag1==false)
            { Sleep(2000); delayflag1 = true; }
            OutputDebugString("cor left 90\n");
            if(yaw>(correction1+10) || yaw<90)
            {
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 202;
                Gator.Write((BYTE *)estr7h,1);
            }
            else
            { flag1 = true; }
        }
        else if(correction1==0 && yaw>(headlandangle-90) && flag1==false)
        {   // regular
            if(delayflag1==false)
            { Sleep(2000); delayflag1 = true; }
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h,1);
            estr7h[0] = 202;
            Gator.Write((BYTE *)estr7h,1);
        }
        else
        { flag1 = true; }   // 90 degree turn completed

        // left turn -----------------------------------------
        // reverse, then full turn left until headlandangle-180
        if(flag1==true)
        {
            if(flag2==false)   // reverse
            {
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 203;
                Gator.Write((BYTE *)estr7h,1);
                flag2 = true;
            }
            else   // reverse over, now turn
            {
                if(correction2!=0)   // extreme
                {
                    if(yaw>(correction2+50) || yaw<110)   // 200-90=110
                    {
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 202;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                    else   // completed, so exit to alleyway mode
                    {
                        headlandflag = false;
                        pathend = false;
                        estr7h[0] = 100;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                }
                else   // regular
                {
                    if(yaw>(headlandangle-150))
                    {
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 202;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                    else
                    {
                        headlandflag = false;
                        pathend = false;
                        estr7h[0] = 100;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                }
            }   // reverse over
        }

        estr6h[0] = 'm';   // end of transmission data
        Gator.Write((BYTE *)estr6h,1);
    }
    else if(headlandmode==2)   // right turn
    {
        estr6h[0] = 'k';   // start of transmission data
        Gator.Write((BYTE *)estr6h,1);

        // right turn ---------------------------------------------
        // turn right till yaw passes headlandangle+90
        if(correction1!=0 && flag1==false)   // extremes
        {
            if(delayflag1==false)
            { Sleep(3000); delayflag1 = true; }
            OutputDebugString("cor rit 90\n");
            if(yaw<(correction1-20) || yaw>270)
            {
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 201;
                Gator.Write((BYTE *)estr7h,1);
            }
        }
        else if(yaw<(headlandangle+70) && flag1==false)   // regular
        {
            if(delayflag1==false)
            { Sleep(3000); delayflag1 = true; }
            OutputDebugString("reg rit 90\n");
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h,1);
            estr7h[0] = 201;
            Gator.Write((BYTE *)estr7h,1);
        }
        else
        { flag1 = true; }   // 90 degree turn complete

        // right turn -----------------------------------------
        // reverse, then full turn right until headlandangle+180
        if(flag1==true)   // extremes
        {
            if(flag2==false)   // reverse
            {
                OutputDebugString("rev");
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h,1);
                estr7h[0] = 203;
                Gator.Write((BYTE *)estr7h,1);
                flag2 = true;
            }
            else   // reverse over, now turn
            {
                if(correction2!=0)   // extreme
                {
                    if(yaw<correction2 || yaw>160)   // 160+200=360
                    {
                        OutputDebugString("cor rit 180\n");
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 201;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                    else   // completed, so exit to alleyway mode
                    {
                        OutputDebugString("cor exit");
                        headlandflag = false;
                        pathend = false;
                        estr7h[0] = 100;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                }
                else   // regular
                {
                    if(yaw<(headlandangle+160))
                    {
                        OutputDebugString("reg rit 180\n");
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 201;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                    else
                    {
                        OutputDebugString("reg exit");
                        headlandflag = false;
                        pathend = false;
                        estr7h[0] = 100;
                        Gator.Write((BYTE *)estr7h,1);
                        estr7h[0] = 0;
                        Gator.Write((BYTE *)estr7h,1);
                    }
                }
            }   // reverse over
        }

        estr6h[0] = 'm';   // end of transmission data
        Gator.Write((BYTE *)estr6h,1);
    }
}










APPENDIX G
CODE SEGMENT FOR OPEN FIELD NAVIGATION AND OBSTACLE DETECTION

//Code for GPS based open field navigation
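
The way-point distance and bearing computed in GPSnavigate() below follow the standard haversine and forward-azimuth formulas. For current and target latitudes phi_1 and phi_2 and longitude difference Delta lambda,

\[
a = \sin^{2}\!\frac{\Delta\varphi}{2} + \cos\varphi_{1}\cos\varphi_{2}\sin^{2}\!\frac{\Delta\lambda}{2},
\qquad
c = 2\,\mathrm{atan2}\!\left(\sqrt{a},\sqrt{1-a}\right),
\qquad
D = R_{E}\,c,
\]
\[
\beta = \mathrm{atan2}\!\left(\sin\Delta\lambda\cos\varphi_{2},\;
\cos\varphi_{1}\sin\varphi_{2} - \sin\varphi_{1}\cos\varphi_{2}\cos\Delta\lambda\right),
\]

with the earth radius R_E = 6378140 m as used in the code.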

// Read the way point coordinates
void ReadGpsCoord()
{
    GPSdistance = 2;
    printflag = false;
    Steer = 200;

    // open file and read GPS coordinates for navigation
    fpGPS = fopen("gps.txt", "r+");
    fpGPSdebug = fopen("gpsdebug.txt", "w+");

    if(fpGPS==NULL)
    {
        OutputDebugString("Error: can't open file.\n");
    }
    else
    {
        OutputDebugString("gps coordinates file opened\n");
        i = 0;

        while(!feof(fpGPS))
        {
            i++;
            /* loop through and store the coordinates into the array */
            fscanf(fpGPS, "%lf", &numbers[i-1]);
            n = (i-1)/2;
        }

        fclose(fpGPS);
        OutputDebugString("gps coordinates file closed\n");
        sprintf((char*)gpstext, "%d,%d\n", i-1, n);
        OutputDebugString(gpstext);
    }
}



void GPSnavigate()
{
    if(i<=2*n)   // loop until all way points are reached
    {
        if(GPSdistance<1)   // reached current coordinate
        {
            i = i+2;
            OutputDebugString("next coord\n");
        }   // go to next coordinate

        if(i>2*n)   // finished all way points
        {
            sprintf((char*)gpstext, "%d,%d\n", i, n);
            OutputDebugString(gpstext);
            OutputDebugString("GPS navigation completed\n");
            estr4[0] = 'k';   // start of transmission data
            Gator.Write((BYTE *)estr4,1);

            // Center the steering before quitting
            estr5[0] = 100;   // hi
            Gator.Write((BYTE *)estr5,1);
            estr5[0] = 0;     // lo
            Gator.Write((BYTE *)estr5,1);

            // end of transmission data
            estr4[0] = 'm';
            Gator.Write((BYTE *)estr4,1);

            // quit navigation
            estr4[0] = 'q';
            Gator.Write((BYTE *)estr4,1);
        }
        else
        {
            dlat = numbers[i-1]-lati;
            dlon = numbers[i]-longi;
            a = sin((dlat/2)*pi/180)*sin((dlat/2)*pi/180)
              + cos(lati*pi/180)*cos(numbers[i-1]*pi/180)
               *sin((dlon/2)*pi/180)*sin((dlon/2)*pi/180);
            c = 2*atan2(sqrt(a), sqrt(1-a));

            GPSdistance = 6378140*c;   // haversine distance, earth radius in m

            GPSbearing = fmod(atan2(sin(dlon*pi/180)*cos(numbers[i-1]*pi/180),
                                    cos(lati*pi/180)*sin(numbers[i-1]*pi/180)
                                  - sin(lati*pi/180)*cos(numbers[i-1]*pi/180)*cos(dlon*pi/180)),
                              2*pi);

            // Shift bearing from (0 to pi & -pi to 0) to 0 to 2pi;
            // this shift matches the bearing range to the yaw range from the IMU
            GPSbearing = pi-GPSbearing;

            GPSbearingD = GPSbearing*180/pi;   // convert gps bearing to degrees

            // start steering
            // 0.333 = 30/90: steer angle of 0-30 for a yaw range of 0-90

            // Determine steering from the bearing and yaw
            if(yaw>GPSbearingD)
            {
                if(yaw-GPSbearingD>180)
                    steerangle = (GPSbearingD+360-yaw)*0.333;
                else
                    steerangle = (GPSbearingD-yaw)*0.333;
            }
            else if(yaw<GPSbearingD)
            {
                if(GPSbearingD-yaw>180)
                    steerangle = -(yaw+360-GPSbearingD)*0.333;
                else
                    steerangle = (GPSbearingD-yaw)*0.333;
            }

            if(steerangle>30)
                steerangle = 30;
            else if(steerangle<-30)
                steerangle = -30;

            steer = (steerangle+30)*6.66;   // 400/60=6.66, convert 0 to 60 into 0 to 400
            // steering is in opposite direction?
            // steer = 400-steer;
            // steer = 200+steer;

            // Median filter over the last 21 samples to reduce noise
            for(int i=20; i>=1; i--)
                arrg[i] = arrg[i-1];   // shift the array, dropping the oldest sample
            arrg[0] = steer;           // store the new sample

            for(int i=0; i<=20; i++)
                arrg1[i] = arrg[i];    // make a copy of the array

            for(int j=0; j<=21; j++)   // sorting a little more than half the data
            {                          // is enough: the median is already in place
                for(int i=0; i<20-j; i++)
                {
                    if(arrg1[i]>arrg1[i+1])   // sort data
                    {
                        tempg = arrg1[i+1];
                        arrg1[i+1] = arrg1[i];
                        arrg1[i] = tempg;
                    }
                }
            }

            steer = arrg1[10];   // median of the 21 samples

            // Send steering data to PC104 computer
            estr4[0] = 'k';   // start of transmission data
            Gator.Write((BYTE *)estr4,1);

            estr5[0] = steer/2;   // hi
            Gator.Write((BYTE *)estr5,1);
            estr5[0] = 0;         // lo
            Gator.Write((BYTE *)estr5,1);

            estr4[0] = 'm';   // end of transmission data
            Gator.Write((BYTE *)estr4,1);
        }
    }
}
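
The in-line median filter in GPSnavigate() can be isolated as a small routine. The following sketch uses a hypothetical name, median21, and a library sort for clarity in place of the partial bubble sort in the listing; the result on a full window is the same.

#include <stdlib.h>   /* qsort */

/* Sketch only: median of the latest 21 steering samples. */
static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

int median21(const int window[21])
{
    int copy[21];
    for(int i = 0; i < 21; i++) copy[i] = window[i];
    qsort(copy, 21, sizeof copy[0], cmp_int);
    return copy[10];   /* middle element of the sorted 21 */
}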




void LadarObstacleDetection()
{
    obstacle_sighted = false;
    obstacleflag = false;
    for(int i=0; i<SIC.smy; i++)
    {
        angle = (float)i/2;   // for finding x_dist and y_dist
        y_dist[i] = fabs(SIC.MVA[i]*sin(angle*3.1428/180));
        // frontal distance coordinate
    }

    // initialize ground distance to ladar
    if(obstgrnd==0)
    {
        obstgrnd = y_dist[180];
    }

    for(int i=220; i>140; i--)   // scan the frontal sector
    {
        if(y_dist[i]<obstgrnd-50)
        {
            // infer found obstacle
            obstacle_sighted = true;

            // Check if the inference is true:
            // check if 5 successive points agree with the inference
            for(int k=1; k<=5; k++)
            {
                if(y_dist[i-k]>obstgrnd-50)
                {
                    obstacle_sighted = false;   // disagree
                    break;
                }
            }

            // If obstacle confirmed, command a stop
            if(obstacle_sighted==true)   // decided
            {
                obstacleflag = true;
                estr4[0] = 'k';   // start of transmission data
                Gator.Write((BYTE *)estr4,1);
                estr4[0] = 0;
                Gator.Write((BYTE *)estr4,1);
                estr4[0] = 204;   // stop command
                Gator.Write((BYTE *)estr4,1);
                estr4[0] = 'm';   // end of transmission data
                Gator.Write((BYTE *)estr4,1);
                break;
            }
        }
    }
}
















BIOGRAPHICAL SKETCH

Vijay Subramanian graduated from Kendriya Vidyalaya high school in 1998. He received

his Bachelor of Engineering degree in electrical and electronics engineering from Annamalai

University in 2002. He received the Master of Science degree in electrical and computer

engineering in 2004 and the Master of Science degree in agricultural and biological engineering,

specializing in robotics, in 2005, both from the University of Florida.





Methods.........................................................................................................196 Kalman Filter.................................................................................................................198 State transition model.............................................................................................198 Measurement model...............................................................................................200 Filter gain...............................................................................................................201 Reliability Factor of Primary Guidance Sensors in the Kalman Filter and Fuzzy Logic Sensor Supervisor............................................................................................203 Divergence Detection and Fuzzy Logic Correction......................................................205 Simulation......................................................................................................................208 Experimental Procedure........................................................................................................209 Results and Discussion.........................................................................................................212 7

PAGE 8

Conclusions...........................................................................................................................216 7 HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR.................................................................................................................................218 Introduction...........................................................................................................................218 Materials and Methods.........................................................................................................218 Pure Pursuit Steering.....................................................................................................219 Headland Turning Maneuvers.......................................................................................224 Determining the approach of the headland.............................................................224 Precisely establishing the end of the row...............................................................228 Drawback of the vision based headland detection.................................................228 Precisely establishing end of the row using sweeping ladar..................................229 Vision and ladar based headland determination.....................................................235 Navigating the headland and entering next row.....................................................236 Experimental Procedures......................................................................................................238 Experiments in the Test Track.......................................................................................238 Experiments in the Citrus Grove...................................................................................240 Results and Discussion.........................................................................................................242 Conclusions...........................................................................................................................245 8 OPEN FIELD NAVIGATION AND OBSTACLE DETECTION......................................246 Introduction...........................................................................................................................246 Materials and Methods.........................................................................................................247 Open Field Navigation..................................................................................................247 Obstacle Detection.........................................................................................................253 Experimental Methods..........................................................................................................257 GPS Based Navigation..................................................................................................257 Obstacle Detection.........................................................................................................259 Results and Discussion.........................................................................................................260 GPS Based Navigation..................................................................................................260 Obstacle Detection.........................................................................................................261 
Conclusions...........................................................................................................................262 9 RESEARCH SYNTHESIS AND OVERVIEW...................................................................263 10 CONCLUSION AND FUTURE WORK.............................................................................267 Conclusions...........................................................................................................................267 Future Work..........................................................................................................................268 APPENDIX A CODE SEGMENT FOR PID BASED STEERING CONTROL.........................................270 B CODE SEGMENT FOR VISION BASED PATH SEGMENTATION..............................276 8

PAGE 9

C CODE SEGMENT FOR LADAR BASED PATH NAVIGATION....................................284 D CODE SEGMENT FOR SENSOR FUSION BASED ALGORITHMS..............................287 E CODE SEGMENT FOR PURE PURSUIT STEERING......................................................290 F CODE SEGMENT FOR HEADLAND TURNING.............................................................291 G CODE SEGMENT FOR OPEN FIELD NAVIGATION AND OBSTACLE DETECTION........................................................................................................................304 LIST OF REFERENCES.............................................................................................................309 BIOGRAPHICAL SKETCH.......................................................................................................314 9

LIST OF TABLES

Table        page

3-1 Specifications of the electro-hydraulic valve ........ 59
3-2 Open loop frequency response tests at various speeds and frequencies ........ 75
3-3 Performance criteria of the controller (simulated results) ........ 83
3-4 Fuzzy logic rule set for sensor supervisor ........ 119
3-5 Fuzzy logic rule set for divergence correction ........ 125
3-6 Calibration of radius of curvature of vehicle with steering angle ........ 132
5-1 Performance measures of the vehicle's guidance system obtained from the experiments conducted in the straight test path ........ 187
5-2 Performance measures of the vehicle's guidance system obtained from the experiments conducted in the curved test path at 3.1 m/s ........ 190
6-1 Fuzzy logic rule set for sensor supervisor ........ 205
6-2 Fuzzy logic rule set for divergence correction ........ 207
6-4 Performance measures obtained from the experiments conducted in grove alleyway ........ 215
7-1 Calibration of radius of curvature of vehicle with steering angle ........ 222
7-2 Performance measures of the eGator based guidance system for path navigation ........ 242
7-3 U-turn performance measures ........ 244
7-4 Switch back turn performance measures ........ 244
8-1 Error of the vehicle for GPS based navigation experiments ........ 260

LIST OF FIGURES

Figure        page

1-1 Agricultural Robots ........ 22
1-2 Typical citrus grove alleyway ........ 24
2-1 Remote controlled vehicle (Murakami et al., 2004) ........ 27
2-2 Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002) ........ 29
2-3 Overhead Guidance (Shin et al., 2002) ........ 29
2-4 Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002) ........ 32
2-5 Machine vision guidance example (Benson et al., 2001) ........ 35
2-6 Machine vision guidance example (Rovira-Mas et al., 2002) ........ 36
2-7 Ladar mounted on top of the tractor (Yokota et al., 2004) ........ 38
2-8 UAV using pure pursuit (Enomoto et al., 2007) ........ 48
2-9 Switch back turning implemented using spline functions (Noguchi et al., 2001) ........ 50
3-1 Architecture of the tractor guidance system ........ 53
3-2 John Deere 6410 tractor used for autonomous vehicle guidance development ........ 54
3-3 Hydrostatic steering mechanism of the tractor ........ 55
3-4 Electro-hydraulic circuit for autonomous steering ........ 57
3-5 Rear end of tractor showing the supply to the servo valve ........ 57
3-6 View under the cab showing the servo valve circuit ........ 58
3-7 Servo valve showing the main parts (Source: Sauer Danfoss PVG 32 valve documentation) ........ 60
3-8 Video camera, one of the primary path finding sensors ........ 61
3-9 Camera mounted on top of the tractor cab ........ 62
3-10 Frame grabber board ........ 63
3-11 Laser radar and its mount ........ 64
3-12 Stegmann HD20 encoder ........ 64
3-13 Output from the encoder shows two waves in quadrature ........ 65
3-14 Encoder mounted on the tractor axle ........ 65
3-15 Instruments mounted in the tractor cabin ........ 66
3-16 Microcontroller ........ 67
3-17 Amplifier circuit ........ 68
3-18 DGPS receiver mounted on top of the tractor cab ........ 69
3-19 Inverter mounted in the tractor cabin ........ 70
3-20 Inertial measurement unit used in this research ........ 70
3-21 Voltage and time required to turn 100 degrees ........ 72
3-22 Encoder calibration ........ 73
3-23 Vehicle position plotted in ArcView (0.2 Hz, 10 mph) ........ 75
3-24 Measuring gain from the position plot ........ 76
3-25 Open loop frequency response for 4 mph, 7 mph and 10 mph ........ 76
3-26 Theoretical model for the three speeds ........ 78
3-27 Simulink model for the vehicle dynamics ........ 78
3-28 Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz) ........ 79
3-29 Guidance control block diagram ........ 79
3-30 Simulink model of the vehicle control system ........ 80
3-31 Simulated sustained oscillation with a gain of 1.8 for the 10 mph model ........ 81
3-32 Simulated step response with the initial tuning parameters (10 mph, Kp = 1.08, Kd = 1.25) ........ 82
3-33 Simulated step response with the final tuned parameters (10 mph) ........ 83
3-34 Camera and ladar mounted on top of the tractor cabin ........ 84
3-35 Typical images seen while navigating through hay bales ........ 85
3-36 Image coordinates ........ 86
3-37 R, G, B color plots of hay bales (X axis: intensity, range 0-255; Y axis: no. of pixels) ........ 87
3-38 R, G, B color plots of grass path (X axis: intensity, range 0-255; Y axis: no. of pixels) ........ 87
3-39 Thresholded image ........ 88
3-40 Cleaned image ........ 88
3-41 Boundary isolated image ........ 89
3-42 Lines fit to the boundaries ........ 90
3-43 Machine vision algorithm flowchart ........ 92
3-44 Typical images for navigation in citrus grove ........ 93
3-45 Picture collection in the grove ........ 94
3-46 R, G, B intensity plot for ground and trees in grove alleyway ........ 95
3-47 Images and their green intensity profile across the center of the image ........ 96
3-48 Thresholding. A) Shadow image ........ 97
3-49 Classified images without and with shadows ........ 98
3-50 Machine vision algorithm flowchart ........ 100
3-51 Calculating the required heading of the vehicle ........ 101
3-52 Radial distance plotted against angle (180 degrees at 0.5 degree increments) ........ 102
3-53 Ladar algorithm flowchart ........ 104
3-54 Serial protocol ........ 105
3-55 Illustration of Kalman filter operation ........ 108
3-56 Vehicle in the path with the state vector variables ........ 110
3-57 Fuzzy logic implementation ........ 116
3-58 Membership function for linguistic variables ........ 117
3-59 Crisp output membership function ........ 120
3-60 Illustration of inference for the example ........ 121
3-61 Fuzzy logic implementation to correct divergence ........ 124
3-62 Input membership function ........ 124
3-63 Output membership function ........ 126
3-64 Simulation result of fusion of error obtained from vision and ladar algorithms ........ 127
3-65 Pure Pursuit steering method ........ 130
3-66 Illustration of experiment for steering angle calibration with radius of curvature ........ 131
3-67 Line fitting for calibrating radius of curvature of vehicle with steering angle ........ 132
3-68 Flowchart of Pure Pursuit steering algorithm ........ 133
3-69 Image obtained from the camera when the vehicle is navigating the alleyway ........ 134
3-70 Segmented images ........ 135
3-71 Vision algorithm example for headland detection ........ 136
3-72 Use of boxes in the segmented images ........ 137
3-73 Image where the vision based headland detection can fail ........ 139
3-74 Ladar and motor assembly for sweeping ........ 140
3-75 Illustration of the ladar sweep ........ 140
3-77 Swept ladar data in Cartesian coordinates ........ 143
3-78 3-dimensional ladar data projected on the horizontal plane ........ 144
3-79 Tree clustering and headland determination ........ 145
3-80 Illustration of vehicle navigating way points ........ 150
3-81 Angle ranges of steering and vehicle turn ........ 153
3-82 Path taken for translation of the vehicle to the way point ........ 154
3-83 Ladar data when obstacle is not present in front of the vehicle ........ 155
3-84 Ladar scan data with obstacle present in front of the vehicle ........ 156
3-85 Illustration of the obstacle detection using ladar ........ 156
3-86 Flowchart of the obstacle detection algorithm ........ 158
4-1 Software architecture of the tractor guidance system ........ 160
4-2 Dialog box for high level controls ........ 160
4-3 Software architecture of the eGator guidance system ........ 166
5-1 Electro-hydraulic retrofit of the tractor for automatic guidance ........ 175
5-2 Guidance system architecture of the vehicle ........ 176
5-3 Camera and ladar mounted on top of the tractor cab ........ 176
5-4 Machine vision algorithm flowchart ........ 178
5-5 Machine vision results for citrus grove alleyway ........ 179
5-6 Ladar algorithm flowchart ........ 180
5-7 Radial distance measured by the laser radar in the hay bale path ........ 181
5-8 Open loop theoretical and actual frequency response of the sinusoidal vehicle dynamics test (phase = -180 deg) ........ 182
5-9 Simulated vehicle control system block diagram ........ 183
5-10 Simulated 1 m step response of the vehicle at 3.3 m/s ........ 183
5-11 Guidance system test path ........ 184
5-12 Vehicle in the citrus grove alleyway ........ 186
5-13 Performance of the machine vision guidance in the straight path ........ 188
5-14 Performance of the laser radar guidance in the straight path ........ 189
5-15 Performance in the curved path at 3.1 m/s ........ 190
6-1 Fusion based guidance system architecture ........ 197
6-2 Vehicle in the path with the state vector variables ........ 198
6-3 Kalman filter operation ........ 202
6-4 Fuzzy logic for sensor supervisor ........ 204
6-5 Fuzzy logic for correcting divergence ........ 206
6-6 Simulation result of fusion of error obtained from vision and ladar algorithms ........ 209
6-7 Hay bale track profile ........ 210
6-8 Citrus grove alleyways ........ 211
6-9 Field of view (FOV) of camera and ladar ........ 211
6-10 Path navigation error of the vehicle in the test track ........ 214
6-11 Path navigation error of the vehicle in the grove alleyway ........ 216
7-1 Pure Pursuit steering method ........ 220
7-2 Illustration of experiment for steering angle calibration with radius of curvature ........ 221
7-3 Calibration curve between radius of curvature of the vehicle and the steering angle ........ 222
7-4 Flowchart of Pure Pursuit steering algorithm ........ 223
7-5 Images obtained from the camera when the vehicle is navigating the alleyway ........ 224
7-6 Segmented images ........ 225
7-7 Vision algorithm example for headland detection ........ 226
7-8 Use of boxes in the segmented images ........ 227
7-9 Sample image where the vision based headland detection can fail ........ 229
7-10 Ladar and motor assembly for sweeping ........ 230
7-11 Illustration of the ladar sweep ........ 231
7-12 Coordinate systems ........ 232
7-13 Swept ladar data in Cartesian coordinates ........ 233
7-14 3-dimensional ladar data projected on the horizontal plane ........ 234
7-15 Tree clustering and headland determination ........ 235
7-16 Vehicle in the test track ........ 238
7-18 Alleyway of the citrus grove where experiments were conducted ........ 240
7-19 Headland turning maneuvers in the grove ........ 241
7-20 Headland turning maneuver performance measures ........ 243
8-1 Illustration of vehicle navigating way points ........ 249
8-2 Angle ranges of steering and vehicle turn ........ 251
8-3 Path taken for translation of the vehicle to the way point ........ 252
8-4 Ladar data when obstacle is not present in front of the vehicle ........ 254
8-5 Ladar scan data with obstacle present in front of the vehicle ........ 254
8-6 Illustration of the obstacle detection using ladar ........ 255
8-7 Flowchart of the obstacle detection algorithm ........ 256
8-8 Waypoint location and vehicle's path for the GPS based navigation ........ 257

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

AUTONOMOUS VEHICLE GUIDANCE FOR CITRUS GROVE NAVIGATION

By

Vijay Subramanian

August 2008

Chair: Thomas F. Burks
Major: Agricultural and Biological Engineering

Automation and robotics is becoming an important part of agricultural operations in the 21st century. Exploration is currently under way to use robotics in citrus groves. An important part of robotics in citrus groves is the use of autonomous vehicles. The development of guidance systems for an autonomous vehicle operating in citrus groves is presented in this dissertation. The guidance system development is based on two different vehicles: a commercial tractor and a golf cart type vehicle, the eGator. Initial developments were based on the tractor and the later developments were based on the eGator. The guidance system development process consisted of choosing sensors for implementing the guidance system; mounting the sensors and developing the hardware and software interface for the sensors with the computer; developing algorithms for processing the information from the sensors to determine the path for the vehicle; developing the steering control system to steer the vehicle along the desired path; developing the software for interfacing the sensors and implementing the algorithms; and conducting experiments to validate the operation of the autonomous vehicle in citrus groves. The primary sensors used for determining the path for the vehicle were a video camera and a laser radar.

The important methods discussed in this dissertation are the determination of the vehicle dynamics of the tractor; the development of a PID based steering control system for steering the tractor; machine vision and laser radar based path segmentation algorithms for navigating the vehicle through grove alleyways; sensor fusion based path determination by fusing the information from machine vision and laser radar; the Pure Pursuit method of steering the eGator; algorithms for turning the vehicle at the headlands of the citrus grove using sweeping ladar; GPS based navigation of open fields between grove blocks; and laser radar based obstacle detection. The ability of the autonomous vehicle to navigate the paths was experimentally verified in custom designed test tracks and in citrus groves. The development of the autonomous vehicle guidance system showed machine vision and ladar based systems to be promising and accurate methods for navigating citrus groves. The autonomous vehicle possesses the ability to navigate the citrus grove alleyways in the middle of the path, turn around at the headlands and navigate subsequent alleyways, navigate open fields between grove blocks, and detect and stop in the presence of obstacles.


CHAPTER 1
INTRODUCTION

This dissertation is about the development of an autonomous vehicle guidance system and methods for navigating citrus groves. The need for automation in citrus groves is first introduced. The objectives of this work in addressing that need are then specified. A glance at the research done by other researchers in similar areas is given. The details of the autonomous vehicle guidance system and algorithm development are discussed. Finally, the results of the research are presented and the areas of further improvement are indicated.

Florida Citrus

Citrus is a genus of flowering plants. In everyday terms, it includes orange, tangerine, grapefruit, lime, lemon and their many varieties. Florida, where this work was done, supplies more than 80% of the citrus in the USA, and citrus is today a $9 billion industry there (Hodges et al., 2001). There are also other major citrus producers in the world, such as Brazil, Mexico, Israel and Italy. For all these producers, there are large numbers of people in the world who consume citrus; the market for citrus is therefore huge. Citrus area per farm in Florida was around 0.5 sq. km in 1997 (Hodges et al., 2001), and citrus is currently the state's largest agricultural product. Florida is also the world's leading producer of grapefruit and the second largest producer of oranges in the world, following Brazil. Ninety-five percent of all Florida oranges are squeezed into orange juice (Brown, 2002). The total economic impact of the citrus industry on the state of Florida exceeds $8 billion annually.

Automation and Robotics in Citrus Groves

Over the years, large scale farms have been replacing smaller ones, and more people are moving to other occupations, so one farm owner replaces several owners. This leads to larger fields to plant and harvest. Laborers often work for several weeks on a farm, so facilities have to be provided for them to live near the orchard. This requires a considerable investment on the part of the farm owner. In Florida, non-U.S. citizens are often employed for harvesting citrus, many of whom face immigration problems. Labor is therefore scarce and decreasing. If the trend continues, fewer people will be available to operate in the field. With large farms replacing several smaller ones, the work that is required of the operator is also increased in scale. Longer duration of operation is necessary to cover huge orchards, and with longer duration of operation the operator's workload grows and consistency in precision is sacrificed. The amount of work per person is also expected to increase. It is also known that long working hours can be detrimental to the laborer's health. For applications such as spraying, there is a risk of the operator being exposed to the sprayed chemicals, which would affect the operator's health. There is therefore a great need for automation of agricultural practices.

A general definition of the term automation, following the Oxford English Dictionary, is the use of electronic or mechanical devices to replace human labor. For those wary of automation, it must be quickly pointed out that automation is not for rendering people unemployed. Automation mostly does the work that is not enjoyable, or is dangerous, or is difficult for humans; consider, for example, the automatic processing and packaging machines that operate in large factories. There are as many varying definitions of robotics as there are researchers in the field. A reasonable definition would be an automation possessing enough intelligence to change its behavior in accordance with some level of change in its environment; the level of change is generally not defined. Examples range from the well known walking humanoids to the Mars rovers. Today robotics technology provides intelligent systems that have the potential to be used in many agricultural applications. Robotic systems are precise and can be used repeatedly. State of the art technology can provide machines with precision control and intelligence for automatic agriculture using autonomous off-road vehicles.

The use of robots is more common today than ever before, and robots are no longer used exclusively by the heavy production industries. The agricultural industry is behind other industries in using robots because the jobs involved in agriculture are not straightforward, and many repetitive tasks are not exactly the same every time. In most cases, many factors have to be considered (e.g., the size and color of the fruit to be picked) before the commencement of a task. Robotics is usually associated with the manufacturing industry. However, certain similarities between the manufacturing industry and agriculture make robotics a good option for agriculture. These factors include decreasing labor, foreign competition, rising costs, hazardous environments for the laborer, and the need to invest in technology to improve productivity. On the other hand, there are quite a few differences that make robotics difficult to use in agriculture, notably the variability of the environment and seasonal, changing crops. Robotics is currently being tested in several areas of agriculture, for example, picking fruits in orchards, spraying crops in greenhouses, animal milking and shearing, planting and harvesting crops, food packaging, disease detection and mapping. Fruit picking robots and sheep shearing robots are designed to replace difficult human labor.

Figure 1-1. Agricultural Robots. A) Fruit picking robot (University of Florida). B) Sheep shearing robot (University of Western Australia).

In the 2001-2002 season, about 6000 ha of Florida's citrus groves were mechanically harvested (Brown, 2002). The operations performed in a citrus grove include preparing the soil, planting the trees, irrigation, spraying fertilizers and pesticides, monitoring the grove and trees, and harvesting the fruits.

Autonomous Vehicles for Citrus Groves

A major achievement in agricultural robotics would be a vehicle capable of moving through the entire field on its own, with the human operator required only to supervise the autonomous operation. Since these vehicles can work continuously, they are very efficient, and they could open a new market in agriculture. A project on an autonomous vehicle not only benefits agriculture, but also other areas, such as military operations, rescue operations and use in hazardous environments. Since these vehicles are expected to work in challenging environments independent of the operator, they have to rely on their sensors and navigation system. Sensor electronics and computing facilities must be integrated with the mechanical and hydraulic parts. The present technology in sensors and navigational equipment is advanced enough to realize a vehicle capable of autonomous operation, and the economic viability has improved significantly over the last decade. Fully autonomous technologies are often seen as being too expensive to justify their use compared to conventional machine systems. This suggests that automation technology should be integrated into conventional machines. The increasing availability of low cost sensors is encouraging. Over the course of time, fully automated technology could be expected to slowly replace the existing semi-automatic equipment. Tractors are the workhorses of the modern farm. By automating these machines, productivity can be increased, safety can be improved, and costs can be reduced for many agricultural operations. An article on the Institute of Food and Agricultural Sciences (IFAS) extension website at the University of Florida reports that mechanical harvesting is the future for the citrus industry (Rouse and Futch, 2005). In citrus groves, trees are planted in rows with alleys between the rows. The usual alley width is about 2.1 to 2.4 m, and tree heights vary from 4.5 m to 6 m depending on their age (Brown, 2002).

Figure 1-2. Typical citrus grove alleyway

This research is aimed at developing an autonomous vehicle capable of maneuvering through the alleyways of citrus groves, turning around after reaching the headlands to navigate other alleyways, and finally moving from one grove block to another. Such a vehicle is expected to be used for robotic applications in citrus groves, such as acting as a guiding vehicle for robotic harvesting carrying several robotic arms, a scout vehicle to look for diseased trees, a guiding vehicle for spraying nutrients and pesticides, a guide vehicle for mowing the grass in the alleyways, and a carrier of sensors for operations such as disease detection. The research aims at adding sensors to an existing commercially used vehicle and modifying the vehicle's steering to be operated autonomously. The cost of integrating the technology into existing machines is expected to be more attractive to a grower adopting the technology. There are indications that operating at night may also provide benefits; for example, spraying and harvesting operations could be completed at night. However, development of a system operable at night requires research beyond the scope of this dissertation. In the literature, there has been much research on crop harvesting vehicles, but citrus harvesting is a relatively unexplored area in vehicle guidance. To date, there has been minimal success in developing commercial autonomous navigation systems for citrus grove applications.

Objectives

The primary objective of this work was to develop an autonomous vehicle guidance system for navigating citrus groves. The following sub-objectives were also identified:

- Develop a machine vision based path segmentation method for navigating the grove alleyways
- Develop a laser radar (ladar) based path segmentation method for navigating the grove alleyways
- Retrofit the steering system of the vehicle for computer control
- Interface the sensors and computers with a commercial vehicle for making the vehicle autonomous
- Develop a steering control algorithm for steering the vehicle, and develop a mathematical model of the vehicle if required
- Develop a sensor fusion method to fuse the information from different sensors to improve reliability in navigation
- Develop a headland detection algorithm and implement headland turning maneuvers to navigate the grove headlands after navigating each row
- Develop obstacle detection and halting ability using laser radar, and a DGPS based open field navigation method to move between citrus grove blocks


CHAPTER 2
LITERATURE REVIEW

Agricultural Autonomous Navigation Applications

Vehicles have been used in agriculture for many decades, for applications such as plowing, planting, harvesting and spraying. These operations require the operator to spend long and tedious hours in the field. Modern technology has facilitated the automation of several agricultural operations, thereby reducing operator fatigue and improving productivity. An important part of this automation process is the use of autonomous vehicles. Research has been conducted on developing autonomous vehicles for agriculture for many years; some of these studies are mentioned below. Many of them were conducted in research laboratories and have not yet fully evolved into commercial products, for reasons ranging from economics to insufficient computer processing power. However, the advancements in computer processing power and in world economics pertaining to agriculture in recent years have given new hope of bringing autonomous vehicles from research laboratories to commercial agricultural operations. Most of the research into developing autonomous vehicles in agriculture is concentrated on vehicles for crop operations; studies on the development of autonomous vehicles for orchards and citrus groves are few.

Guidance Vehicle

The autonomous vehicle can either take the form of a guidance system added on to a commercial agricultural vehicle, or a vehicle developed exclusively to aid in autonomous or robotic operations. Blackmore et al. (2002) have listed the following requirements of an autonomous vehicle for agricultural applications:

- Small in size
- Light weight
- Exhibit long-term sensible behaviors
- Capable of receiving instructions and communicating information
- Capable of coordinating with other machines
- Capable of working collaboratively with other machines
- Behave in a safe manner, even when partial system failures occur
- Carry out a range of useful tasks

These requirements are also applicable to vehicles navigating citrus groves. While current research has not progressed to the level of meeting all these criteria, there has been significant progress. In addition to these criteria, the following requirements may also be expected:

- Low cost
- Attractive for a farmer to adopt
- Off-road vehicle performance and reliability

An autonomous vehicle is expected to possess the following basic components to be operational: a vehicle platform, a steering system, a guidance system, sensors and a control system.

Guidance by Remote Control

Remote control or tele-operation has already been used in several branches of robotics. Visual information of the operating environment is relayed to a remote location where the operator controls the vehicle. Murakami et al. (2004) used a CCD camera, a gyroscope and a GPS receiver as the sensors, and a wireless IP protocol for communication, for tele-operation of an HST drive agricultural vehicle. A joystick was used for tele-operation.

Figure 2-1. Remote controlled vehicle (Murakami et al., 2004). A) Teleoperated Vehicle. B) User interface.

The major advantages of using a tele-operated vehicle would be the comfortable environment of operation and safety. Another, minor, advantage compared to direct guidance is that tele-operation is relatively simple and cheap. A major challenge in using tele-operated vehicles is the time delay in communication. For vehicle guidance, no significant advantages are expected: there is neither much reduction in labor cost nor a significant reduction in the amount of work that a driver is required to do, and there is limited possibility of automation.

Direct Guidance

In this method, the vehicle possesses the ability to control its own path. This has the potential for automation and is also economically justifiable on large fields, but the technology is complex. The lack of awareness about the technology, and safety concerns compared with remote control, are causes of apprehension for a farmer. However, present day technology and demonstrations of safe control of the guidance could overcome these limitations.

Guidance Using Arm

Guidance by mechanical contact has been widely used by earlier researchers in agriculture. This method is simple and straightforward: a mechanical arm senses the crop boundary and the vehicle is moved at a distance from the crop. However, if a bare region is encountered in the contact environment, control is lost. Multiple mechanical contacts have been used to overcome this limitation with moderate success. A major concern in using this method is the great potential for the contact element to damage the crop. Yekutieli and Pegna (2002) used a sensor arm to guide a crawler tractor through a vineyard. The arm sensed the rows and determined the position of the tractor, and it did not cause any damage to the vines. This method was suitable for a controlled environment like a vineyard, where the rows are exactly parallel to each other.

Figure 2-2. Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002)

Overhead Guidance

In this method, an overhead guide or track guides the vehicle through a mechanical link. Shin et al. (2002) used overhead rails to direct a track type vehicle through an orchard for spraying applications. The vehicle's performance was acceptable in straight and curved paths, but the vehicle over-steered at higher speeds, and the system was only feasible for low speeds of about 0.4 m/s. Also, the installation cost of overhead rails is very high; for a grower considering a new technology, the high cost of constructing large structures is very discouraging. Hence such a system is not suitable for this project.

Figure 2-3. Overhead Guidance (Shin et al., 2002)

Guidance on Tracks and Leader Cables

This type of guidance is widely used in industrial transportation, and it was used in very early research on agricultural vehicle guidance. In a leader cable system, the cables carry an AC signal, and coils on the vehicle base detect the magnetic field when the vehicle moves over the cable. The system works very well in orchards. The drawback is the permanent installation of large structures, which increases the initial cost and requires maintenance over time. There is also little flexibility in the crop grown, and fertilizer application and irrigation require care so that they do not damage the tracks. The drawbacks of the mechanical contact methods and the advancement of sensor technology have drawn researchers to non-contact methods.

Distance Metering and Triangulation Based Guidance

In this method, the sensors operate based on signals received from sources installed in the environment for guidance purposes. This popular method places markers throughout the field that a range detector can identify. One approach employs a photoelectric sensor for the detection of light from reflectors and measures the angles between reflectors; the position of the vehicle equipped with the photoelectric sensor is then calculated by triangulation. The reflectors are placed in the environment at regular intervals for detection. Another approach involves a laser beam, which the vehicle follows: the beam hits a sensor mounted on the vehicle and has to be constantly moved to guide the vehicle. The above methods all require the expensive installation of structures in the field. They also assume that the crops are grown in some order and do not block the field of view of the sensor. Rapid development of sensor technology makes it possible to guide vehicles using information from state of the art sensors that require no modification of the environment; these are the popular guidance techniques in the first decade of the 21st century.
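To make the triangulation idea concrete, the following minimal sketch computes a position fix from the bearings to two reflectors at known positions. It is only an illustration, not the method of any of the systems cited above: it assumes the measured angles have already been converted to world-frame bearings (for example, by adding a compass heading), and the reflector coordinates in the example are hypothetical.

    import math

    def triangulate(p1, p2, bearing1, bearing2):
        # Each world-frame bearing defines a ray from the unknown vehicle
        # position v toward a reflector: p_i = v + t_i * d_i, with t_i > 0.
        # Intersecting the two rays recovers v.
        d1 = (math.cos(bearing1), math.sin(bearing1))
        d2 = (math.cos(bearing2), math.sin(bearing2))
        # Solve t1*d1 - t2*d2 = p1 - p2 (a 2x2 system, Cramer's rule).
        det = d2[0] * d1[1] - d1[0] * d2[1]
        if abs(det) < 1e-9:
            raise ValueError("bearings are parallel; no unique fix")
        rx, ry = p1[0] - p2[0], p1[1] - p2[1]
        t1 = (d2[0] * ry - d2[1] * rx) / det
        return (p1[0] - t1 * d1[0], p1[1] - t1 * d1[1])

    # Reflectors 10 m apart; bearings of 135 and 45 degrees place the
    # vehicle at (5.0, -5.0).
    print(triangulate((0.0, 0.0), (10.0, 0.0),
                      math.radians(135), math.radians(45)))

In practice, systems of this kind use three or more reflectors, both for redundancy when a reflector is occluded and so that the vehicle heading can be solved for along with its position.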

Guidance by Dead Reckoning

Dead reckoning is the process of estimating the position of a vehicle by advancing a known position using the course, speed, time and distance traveled. This technology is often used in military and marine navigation. The sensors most often used are wheel encoders, which measure the rotation of the wheels; knowledge of the circumference of the wheel then gives the distance traveled. Rotary encoders have been used to detect wheel position for position location (Kodagoda et al.; Nagasaka et al., 2002). Dead reckoning, a widely used method for positioning unmanned vehicles, has not been a very viable option for agricultural applications due to uneven soil conditions, which facilitate slipping: in wet weather there is often considerable slip in the tractor wheels. Round-off errors also accumulate, causing significantly incorrect distance measurements. These two factors introduce errors into the calculated distance, so using an encoder as an odometer for finding distance has not been reliable.
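The encoder-based position update described above amounts to a few lines of arithmetic. The following is a minimal sketch, assuming a single wheel encoder together with an independent heading source such as a compass or gyro; the wheel geometry and tick counts are hypothetical, and, as noted above, wheel slip makes the integrated distance optimistic in practice.

    import math

    class DeadReckoner:
        # Minimal dead reckoning: integrate encoder distance along the
        # current heading (unicycle model).
        def __init__(self, wheel_circumference_m, ticks_per_rev):
            self.m_per_tick = wheel_circumference_m / ticks_per_rev
            self.x, self.y = 0.0, 0.0

        def update(self, delta_ticks, heading_rad):
            # Distance rolled since the last update (ignores slip).
            d = delta_ticks * self.m_per_tick
            self.x += d * math.cos(heading_rad)
            self.y += d * math.sin(heading_rad)
            return self.x, self.y

    # Hypothetical wheel: 2.0 m circumference, 360 ticks per revolution.
    # 180 ticks while heading due east advances the estimate 1 m along x.
    dr = DeadReckoner(wheel_circumference_m=2.0, ticks_per_rev=360)
    print(dr.update(delta_ticks=180, heading_rad=0.0))  # (1.0, 0.0)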

Guidance Using Ultrasonic Sensors

Ultrasonic sensors send out a high frequency sound wave; the reflected wave is received at the transducer and, based on the time of flight, the distance is calculated. Ultrasonic sensors have been very popular for sensing distance and are able to measure tree canopies while the vehicle travels at a speed of 1.8 m/s, with an accuracy of 1 to 3 cm (Iida and Burks, 2002). The accuracy was found to be higher at lower speeds. These sensors require the target to be roughly perpendicular to the sensor for the waves to be reflected back properly (Burks et al., 2004). However, when used in groves, the environment is unstructured, so such surfaces cannot be expected. Hence their use is limited.

Figure 2-4. Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002)
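The time-of-flight relation underlying these sensors is straightforward: the pulse travels to the target and back, so the range is half the round-trip time multiplied by the speed of sound. A minimal sketch follows; the linear temperature correction for the speed of sound is a common approximation, and the echo time in the example is hypothetical.

    def ultrasonic_range_m(echo_time_s, air_temp_c=20.0):
        # Speed of sound in air, approximated as a linear function of
        # temperature: about 331.3 m/s at 0 C plus 0.606 m/s per degree C.
        speed_of_sound = 331.3 + 0.606 * air_temp_c
        # Divide by 2 because the pulse travels out and back.
        return speed_of_sound * echo_time_s / 2.0

    # A 5.83 ms round trip at 20 C corresponds to roughly 1 m of range.
    print(ultrasonic_range_m(0.00583))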


2002), so proper shielding is necessary. Mizushima et al. (2002) used an adaptive line enhancer (ALE), an adaptive filtering method, to eliminate noise in the GDS. Gyros have been widely used for inclination measurements (Mizushima et al., 2002). The fiber optic gyro (FOG) has been reported to give the best positioning performance among the different types of gyros (Nagasaka et al., 2002). IMUs are widely used for vehicle guidance research at present. With the combination of RTK GPS and FOG, accuracies of up to 5 cm have been achieved (Noguchi et al., 2002). Accelerometers and inclinometers (Matsuo et al., 2002) have also been tried with fairly positive results. Gyros and inclinometers are now available together as inertial measurement units providing pitch, roll, yaw and linear acceleration measurements. Inertial measurements are very useful for navigating a vehicle on uneven ground, as encountered in groves.

GPS Based Guidance Systems

Guidance using GPS has been the preferred method of many researchers in agriculture for guiding autonomous vehicles along crop rows, and some commercial developments have also concentrated on this method. Global Positioning System (GPS) receivers locate four or more of the GPS satellites and calculate the distance to each, using this information to deduce their own location. GPS, in combination with inertial navigation systems, has been a promising positioning approach. Both Real Time Kinematic (RTK) GPS and Differential GPS (DGPS) have been satisfactory, allowing real-time measurement. However, there is a tradeoff between accuracy and cost in the selection of DGPS and RTK GPS receivers, with the latter being more accurate but also more expensive. Nagasaka et al. (2002), Benson et al. (2001) and Noguchi et al. (2002) have found that RTK GPS receivers give very accurate results. Accuracies of up to 7 cm with DGPS (Iida and Burks, 2002) and up to 2 cm with RTK GPS receivers have been reported. More accuracy has been observed when the vehicle is stationary or at slower speeds than at higher


speeds. Stombaugh et al. (1999) found that the location of a GPS receiver, when used as a position sensor, is critical for calculating the phase in frequency response tests of the control system. They found that mounting the GPS receiver above the front wheels gives more accurate navigation, whereas most research reports mounting the receiver on top of the cab. GPS-based guidance alone cannot be used for positioning in citrus applications, as it could give errors when the vehicle moves under tree canopies, which block the satellite signals to the receiver. Moreover, a system using GPS for guidance requires that a predetermined path be given to the computer for it to follow, which requires initial mapping. Hence, when a GPS receiver is the sole positioning sensor, it cannot be used instantly in any grove; significant time has to be spent in mapping the path.

Machine Vision for Autonomous Navigation

Machine vision is the ability of a computer to "see." A machine-vision system employs one or more video cameras to obtain images for the computer to interpret. Machine vision is one of the popular sensing technologies for robotics applications today. It can be used to automatically guide a vehicle when the path to be traversed is visually distinguishable from the background. Vision guidance has the advantage of using local features to adjust the vehicle's navigation course, and the vision system's performance in guiding the vehicle is comparable to highly accurate laser radar (Burks et al., 2004). Typical applications in vehicle guidance include guiding tractors for crop cultivation and guiding an autonomous vehicle in a greenhouse. This sensing system has been found to work well for fully grown crops, but it has not performed well for crop harvesting with low or sparse crops.


Figure 2-5. Machine vision guidance example (Benson et al., 2001). A) Camera mounted on top of the combine cab. B) Crop image. C) Processed image.

The vision system's reliability is reduced by low lighting and by the presence of dust and fog. Benson et al. (2001) overcame this problem by using artificial lighting, but they then experienced problems with the vehicle's shadow. They achieved speeds of up to 1.3 m/s with an accuracy of 0.6 cm. Positioning of the camera is also an important factor in the performance of the vision system. Mounting the camera on the top front of the tractor cab, pointing downwards, is an optimal position for many agricultural applications. NIR filters can be used to process images from a CCD camera (Nishwaki et al., 2002); this sometimes gives better information depending on the application. Vision involves many complicated algorithms for image processing and recognition. It therefore increases the amount of computation required, with the result that dedicated processors are sometimes used. However, as faster computers are developed, this concern will no longer prevent vision from being used for sensing.


Figure 2-6. Machine vision guidance example (Rovira-Mas et al., 2002). A) Camera mounted on the front of the tractor for detecting crop rows. B) Thresholded image. C) After Hough transform.

With computer vision, there is always the need for some physical feature or color difference for the vision system to sense effectively. A number of image processing techniques have been investigated to find the guidance course for different applications. Han et al. (2002) developed a row segmentation algorithm based on k-means clustering to segment crop rows; this information was used to guide the vehicle. Stereo vision has been used for navigating through an orchard (Takahashi et al., 2002). This method used two cameras to obtain the stereo information and required considerable time for processing the images. Vision algorithms based on the Hough transform and blob analysis have been used for row crop guidance (Rovira-Mas et al., 2002) with positive results. Gerrish et al. (1997) successfully used the ratio of the individual primary colors to the sum of the primary colors to segment images and eliminate shadow problems. Often hue, saturation and intensity values are used for thresholding (Morimoto et al., 2002), which has worked well in several applications. Computer vision has been used in many research efforts.
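To illustrate the color-based segmentation idea, the following minimal C++ sketch classifies pixels by comparing the green fraction of each pixel's R+G+B sum against a threshold, loosely in the spirit of the primary-color-ratio approach of Gerrish et al. (1997). The threshold value, image layout and classification rule are hypothetical, not those of any cited system.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

struct RGB { std::uint8_t r, g, b; };

// Returns a binary mask: 1 = candidate path pixel, 0 = vegetation.
std::vector<std::uint8_t> segmentPath(const std::vector<RGB>& img,
                                      double greenRatioThresh) {
    std::vector<std::uint8_t> mask(img.size(), 0);
    for (std::size_t i = 0; i < img.size(); ++i) {
        double sum = img[i].r + img[i].g + img[i].b + 1e-9;  // avoid divide-by-zero
        double greenRatio = img[i].g / sum;  // green's share of the color sum
        mask[i] = (greenRatio < greenRatioThresh) ? 1 : 0;
    }
    return mask;
}

int main() {
    std::vector<RGB> img = {{120, 110, 100},   // brownish soil pixel
                            {40, 160, 50}};    // green canopy pixel
    std::vector<std::uint8_t> mask = segmentPath(img, 0.40);
    std::printf("pixel 0 -> %d, pixel 1 -> %d\n", mask[0], mask[1]);
    return 0;
}
```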


In practice, many problems exist in the real environment, where light conditions change often and contrasts shift, as is frequently the case in agriculture.

Laser Radar (Ladar) for Autonomous Navigation

The operating principle of non-contact laser radar is time-of-flight measurement. The ladar calculates the distance to an object using the time of flight of pulsed light, i.e. the length of time between sending and receiving the beam. An extremely short pulse of infrared laser light is transmitted towards an object, and part of the light is reflected back to the unit a fraction of a second later. A rotating mirror deflects the pulsed light beam to many points in a semicircle, and the precise direction is given by an angular sensor on the mirror. A large number of coordinates measured in this manner are put together to form a model of the surrounding area's contours. A planar scanning ladar gives information about only one plane; this can be overcome by rotating the laser source to obtain a three-dimensional view. Laser range sensors are very accurate in measuring distance, with accuracy as high as 1 cm over a distance of 80 m, which makes them suitable for guidance applications. Laser radar has been used for ranging and obstacle avoidance. It has higher resolution than ultrasonic sensors, so it is more accurate, and it requires less computation than vision. However, its performance degrades with dust and rain, like vision, and it is more expensive than an ultrasonic sensor. Carmer and Peterson (1996) discussed the use of laser radar for various applications in robotics. Autonomous vehicle navigation was identified as a promising application for ladar, due to its ability to accurately measure position. Gordon and Holmes (1998) developed a custom-built ladar system to continuously monitor a moving vehicle's position; the system's performance was not very reliable due to the underdeveloped technology at that time. Current technology has advanced to the point that ladar systems are readily available, although they are relatively expensive compared to other alternatives.
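A minimal C++ sketch of how one such planar sweep is typically turned into usable geometry is shown below: each (bearing, range) pair is converted to Cartesian coordinates in the sensor frame. The 0-180 degree sweep at 1-degree increments mirrors the kind of scan described above, while the range values themselves are placeholders.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double PI = 3.141592653589793;
    const int    N  = 181;               // 0..180 degrees at 1-degree steps
    double range[N];                     // measured distances in m (placeholder)
    for (int i = 0; i < N; ++i) range[i] = 5.0;  // dummy flat wall at 5 m

    for (int i = 0; i < N; ++i) {
        double theta = i * PI / 180.0;           // bearing of this beam
        double x = range[i] * std::cos(theta);   // lateral offset
        double y = range[i] * std::sin(theta);   // forward distance
        if (i % 45 == 0)
            std::printf("%3d deg -> (%.2f, %.2f) m\n", i, x, y);
    }
    return 0;
}
```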


Ahamed et al. (2004) used a ladar to develop a positioning method using reflectors for in-field road navigation. They tested differently shaped reflectors to determine the achievable positioning accuracy. Ladar can serve the dual purpose of mapping the tree canopies while simultaneously navigating, which could be used for grove mapping applications; one such application is for a harvesting robot arm.

Figure 2-7. Ladar mounted on top of the tractor (Yokota et al., 2004)

Yokota et al. (2004) mounted a ladar on top of a tractor, generated local maps of the surroundings and integrated them with GPS data. Ladar has also been used for navigating a small vehicle through an orchard (Tsubota et al., 2004); the ladar-based guidance system was found to be more stable than GPS-based guidance.

Intelligence for Autonomous Navigation

Artificial Intelligence (AI) can be useful in solving complex tasks in which the solution is ambiguous or large computations are necessary. A popular use of AI in robotics is heuristic search, often used in path planning. Heuristics, or empirical rules, can be used to shorten the search, as the number of candidate solutions can be very large. To reach an optimal level of autonomous operation, the robot should know how to deal with uncertainties in the environment. Robots should be capable of changing their behaviors independently, and artificial intelligence can immensely aid in this endeavor.


A knowledge-based system could be used to assimilate mission information with internal models to supervise navigation (Speigle et al., 1995). A multiple-layer path planner combined with a guidance system using artificial intelligence is presented in Ortiz (1993), where a combination of global and local navigation schemes is discussed. Voronoi diagrams were used in this research for navigating different terrain features. Global navigation was used to establish a path from the source to a destination based on fixed objects in the path and other moving vehicles. Local navigation was used to avoid immediate obstacles and did not take into account the location of the destination. This approach is similar to one used by a human driver, and the guidance is expected to perform well.

Vehicle Dynamics Modeling and Steering Control

Steering control is a major factor for accurate guidance. The controller gets the error information from the sensors and computes the signal to be given to the steering actuator, thereby steering the vehicle back to the correct position in the path.

Vehicle Modeling

Adequate modeling of the vehicle is often required for efficient control. A model of the vehicle dynamics or steering gives the option of conducting simulations of the vehicle while developing or tuning the steering controllers. Sophisticated models covering more details of the vehicle's behavior enable software simulations to be developed for studying the vehicle's behavior under different terrain changes. Models can also help in understanding the stability of the control system being developed. The most common model for describing vehicles is the bicycle model. It is given by the equation


\[
\frac{y(s)}{\delta(s)} =
\frac{C_{sf}\left[(I_{zz} + m L_1 d_s)\,V s^2 + L C_{sr}(L_2 + d_s)\,s + L C_{sr} V\right]}
{s^2\left[m I_{zz} V s^2 + \left(I_{zz}(C_{sf}+C_{sr}) + m\left(L_1^2 C_{sf} + L_2^2 C_{sr}\right)\right)s + \dfrac{C_{sf} C_{sr} L^2}{V} - m V\left(L_1 C_{sf} - L_2 C_{sr}\right)\right]}
\]
(Eq. 2-1)

where
Csf = cornering stiffness of the front wheels
Csr = cornering stiffness of the rear wheels
d = distance of travel
ds = distance between the position sensor and the center of gravity
Izz = yaw moment of inertia
L = wheel base
L1 = distance from the center of gravity to the front axle, along the vehicle axis
L2 = distance from the center of gravity to the rear axle
m = vehicle mass
V = vehicle forward velocity
y = lateral position of the vehicle
δ = front steering angle.

This is a single-input single-output (SISO) model with steering angle as input and lateral position as output. The model takes into account the tire slip effects. The assumptions made are that the forward speed is constant and that the lateral forces are proportional to the side slip angles. Alleyne et al. (1997) found the bicycle model easy to implement because it is linear; however, the model fails at higher speeds due to the higher lateral accelerations. Stombaugh et al. (1999) showed that a classical model-based controller could guide a two-wheel-drive tractor to within 16 cm of the desired path at high speeds, and that a double-integrator transfer function can adequately model the lateral deviation of the vehicle in response to steering angle. They also found the dead band of the steering valve to be crucial for control system design.
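For intuition, the following minimal C++ sketch integrates the same bicycle-model dynamics in state-space form (lateral velocity and yaw rate) under a small step steering input. All parameter values are hypothetical placeholders, not measured properties of any vehicle discussed here.

```cpp
#include <cstdio>

int main() {
    // Hypothetical vehicle parameters
    double m = 3000.0, Izz = 4000.0;      // mass (kg), yaw inertia (kg m^2)
    double L1 = 1.2, L2 = 1.3;            // CG to front/rear axle (m)
    double Csf = 60000.0, Csr = 60000.0;  // cornering stiffnesses (N/rad)
    double V = 3.0;                       // forward speed (m/s)

    double v = 0, r = 0, psi = 0, y = 0;  // lateral vel, yaw rate, heading, lateral pos
    double delta = 2.0 * 3.14159265 / 180.0;  // 2-degree step steering input
    double dt = 0.01;

    for (double t = 0; t < 5.0; t += dt) {
        double alphaF = delta - (v + L1 * r) / V;  // front tire slip angle
        double alphaR = -(v - L2 * r) / V;         // rear tire slip angle
        double Ff = Csf * alphaF, Fr = Csr * alphaR;
        double vdot = (Ff + Fr) / m - V * r;       // lateral dynamics
        double rdot = (L1 * Ff - L2 * Fr) / Izz;   // yaw dynamics
        v += vdot * dt;  r += rdot * dt;
        psi += r * dt;   y += (v + V * psi) * dt;  // small-angle ground frame
    }
    std::printf("lateral offset after 5 s: %.2f m\n", y);
    return 0;
}
```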


Zhang et al. (2004) used the bicycle model to model the dynamics of a tractor. Kise et al. (2002) modified the bicycle model to account for nonlinearities, including slip and cornering forces, thereby obtaining a non-linear equation; the vehicle was then guided using an optimal controller. Qiu et al. (1999) modeled an electro-hydraulic steering system using single-rod, double-acting steering cylinders and other components for a Case-IH Magnum tractor. The model was developed for a wheel-type agricultural vehicle, and computer simulations and field tests validated it. To sidestep the complexity of modeling the vehicle and then designing the control system, some researchers have used non-model-based techniques, for example neural networks and non-model-based PID control systems. Neural networks have also been used for modeling vehicle dynamics (Nijhuis et al., 1992). A neural network has the advantage of not requiring complex model equations, and it also handles the nonlinearities. However, designing traditional control systems is not straightforward for such a model.

Steering Control

A steering controller is used for controlling the vehicle steering. The input to the controller can take various forms, such as the lateral error of the vehicle in the path, the position of the vehicle at some future instant, the required steering angle, or the curvature the vehicle has to follow. The steering controller takes the input and turns the steering to achieve the desired result while continuously monitoring the steering changes. Several types of steering controllers have been reported in the literature.


PLC control

A PLC (Programmable Logic Controller) is a highly reliable special-purpose computer widely used in industrial monitoring and control applications. PLCs typically have proprietary programming and networking protocols, and special-purpose digital and analog I/O ports. A PLC uses programmed logic instructions to control banks of inputs and outputs which interface timed switch actuation to external electro-mechanical devices. PLCs are very fast at processing discrete signals (like a switch condition). A PLC has been used for steering control (Nagasaka et al., 2002), controlling all the actuators in the vehicle; the PLC control loop took 2 ms to execute. However, PLCs have not received as much attention as the other control systems.

ON/OFF control

As the name implies, the control signals are turned on and off to achieve the required control. Yekutieli et al. (2002) experimented with an ON/OFF control to guide a crawler tractor through a vineyard. They found this control to respond quickly, but also observed considerable overshoot. Much smoother control could be achieved with more sophisticated control systems.

PID control

The PID (Proportional Integral Derivative) control algorithm is used for the control of almost all loops in the process industries, and is also the basis for many advanced control algorithms and strategies. PID has given satisfactory performance in tractor guidance. Feedforward + PID control worked well for guiding a tractor through crop rows (Kodagoda et al., 2002). Some time delay has been observed when using PID; however, overall, PID control has performed better than proportional and PI control systems. PID control has been used for guiding a grain harvesting tractor (Benson et al., 2003). In that research, PID was used to calculate the actuator command signal based on the heading offset, and the performance of the controller was comparable to that of manual steering.
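As an illustration of the technique, the following is a minimal C++ sketch of a discrete PID loop for steering correction, where the error is the vehicle's lateral offset from the desired path and the output is a steering-angle command. The gains and the steering limit are hypothetical, not tuned values from any of the cited studies.

```cpp
#include <algorithm>

class PID {
public:
    PID(double kp, double ki, double kd) : kp_(kp), ki_(ki), kd_(kd) {}

    // error: lateral offset (m); dt: control period (s); returns steering (rad)
    double update(double error, double dt) {
        integral_ += error * dt;                       // accumulate I term
        double deriv = (error - prevError_) / dt;      // finite-difference D term
        prevError_ = error;
        double u = kp_ * error + ki_ * integral_ + kd_ * deriv;
        return std::max(-0.5, std::min(0.5, u));       // clamp to steering limit
    }
private:
    double kp_, ki_, kd_;
    double integral_ = 0.0, prevError_ = 0.0;
};
```

Such a controller would be called once per control period with the latest sensor-derived error; the output clamp models the mechanical limit of the steered wheels.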


Adaptive control

Nonlinearity is observed in electro-hydraulic steering valves, and PID control is unable to handle nonlinearities effectively. Tsubota et al. (2004) compared the experimental results from a fuzzy adaptive controller and a neural-network-based adaptive controller to those from a classical PID controller in terms of velocity tracking. Experiments conducted on a test bed showed that both adaptive control algorithms compensated for the nonlinearity effectively.

Fuzzy logic

Fuzzy logic creates artificial boundaries between analog states, making them discrete enough for a digital computer to process, while the boundaries are made fuzzy so that the controller does not appear discontinuous. The processing stage usually consists of a set of logic rules defined by the designer. Fuzzy control has been used to guide a tractor to follow crop rows (Benson et al., 2001), with results comparable to PID. Senoo et al. (1992) pointed out that a fuzzy controller could achieve better tracking performance than a PI controller; they used a vehicle that tracked a magnetic tape on the floor, and found the fuzzy controller had wider adaptability to all kinds of inputs. Qiu et al. (1999) verified that fuzzy steering control provided fast and accurate steering rate control on a tractor. Kodagoda et al. (2002) designed fuzzy PD and fuzzy PI controllers to navigate an electric golf cart. These controllers were compared with traditional PID controllers and were found to be insensitive to load fluctuations; they determined that fuzzy control was better than PID for longitudinal control, and PID was also found to exhibit large chatter and high saturation. A combination of fuzzy and PID control holds a lot of promise (Burks et al., 2004). While designing these controllers, the designer has to anticipate all the situations that the vehicle might encounter. This can be achieved most of the time with careful design and the experience of the


designer. However, if the transition between different scenarios is not properly accounted for, the controller could be discontinuous.

Neural networks

An Artificial Neural Network (ANN) is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example, and they adapt according to the way they are trained. Nijhuis et al. (1992) successfully used neural networks for collision avoidance, with a vehicle that used an infrared range sensor to detect obstacles. But neural networks have the inherent disadvantage of learning only what the driver does; hence they are not robust. To work well, they have to be trained for every possible situation that will be encountered.

Behavior-based control

Behavior-based control is a newer development which has been successfully used in small mobile robots. It works on the basis of selecting the most appropriate next action in a particular situation. Control is distributed among a set of behaviors, in which each behavior takes care of one aspect of control, and sensors trigger these behaviors. There is an inherent necessity to coordinate the different behaviors for proper operation. Behavior-based systems, in combination with a real-time control system (RTCS), are expected to do well in vehicle guidance.

Optimal control

Kise et al. (2002) developed an optimal controller to guide a tractor that used an RTK GPS and an IMU for sensing position. They compared the optimal control with PI control: the optimal controller could perform high-speed guidance more precisely than the PI controller, and its performance was acceptable in 90-degree turns as well.


Non-linear control

A new field of non-linear control is emerging, particularly for controlling mobile robots. Earlier, researchers used linearized models of non-linear phenomena. Now, with the availability of fast processors, non-linear equations can be solved numerically and quickly, so non-linear control systems are being developed for improved control. However, more research is needed before they are widely adopted.

Sensor Fusion Applications in Navigation

Sensor fusion is a major component of sensor integration, merging multiple inputs into a common representation. A multi-sensor environment generates a large amount of data at different resolutions, often corrupted by a variety of noise that continually varies with changes in the environment. Sensor fusion aims to extract meaningful information from these data to make the optimal decision for navigation. Several methods, such as Dempster-Shafer theory and map learning methods, have been used for sensor fusion; Kalman filters have been ideal candidates and have been widely used. The methods for fusion reported in the literature include, but are not limited to, neural networks (Davis and Stentz, 1995; Rasmussen, 2002), variations of the Kalman filter (Paul and Wan, 2005), statistical methods (Wu et al., 2002), behavior-based methods (Gage and Murphy, 2000), voting, fuzzy logic (Runkler et al., 1998) and combinations of these (Mobus and Kolbe, 2004). The fusion itself may be performed at various levels, ranging from the sensor level to the decision level (Klein, 1999). Rasmussen (2002) used a neural network to combine image and laser data to segment roadways from background objects. He fused features such as color, texture and laser range in various combinations using the neural network, and found that such combinations performed better at segmenting roads than individual models. Wu et al. (2002) used Dempster-Shafer theory for sensor fusion to combine images and sounds


to interpret human behavior. This probabilistic approach was appropriate for such a non-deterministic application. However, probabilistic approaches may reduce the accuracy of deterministic applications if accurate statistical models of the process are not created. Runkler et al. (1998) regarded the Kalman filter as linearizing non-linear models; they instead developed fuzzy models of the signals and projected the signals onto these models to fuse the data, which helped reduce the noise in their experimental data. The Kalman filter has been the subject of extensive research in navigation. It is a set of mathematical equations that provides an efficient computational means to estimate the state of a process. It can give an accurate estimate of the present, past and future states of the system, and it is useful even when a precise model is not available. The Kalman filter is often used in vehicle navigation for processing sensor signals: information from different sensors is processed using the filter to compute reliable position information for the vehicle. In much of this research the sensor measurements can be fairly noisy, so the choice of method can also help in reducing noise as well as in fusing the information from the sensors. Methods such as neural networks, behavior-based methods, fuzzy logic and voting are not primarily used for noise reduction. Kalman filtering is a widely used method both for eliminating noisy measurements from sensor data and for sensor fusion; it can be considered a subset of the statistical methods because of its use of statistical models for noise. Paul and Wan (2005) used two Kalman filters for accurate state estimation and terrain mapping to navigate a vehicle through unknown environments. The state estimation process fused the information from three onboard sensors to estimate the vehicle location, and simulated results showed the feasibility of the method.
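The following minimal C++ sketch shows the predict/update cycle of a one-dimensional Kalman filter fusing two noisy measurements of the same quantity, weighting each by its variance. The state model and all noise variances are hypothetical placeholders, far simpler than the filters used in the work cited here.

```cpp
#include <cstdio>

struct Kalman1D {
    double x = 0.0;   // state estimate (e.g., lateral position, m)
    double P = 1.0;   // estimate variance
    double Q = 0.01;  // process noise variance (assumed)

    void predict() { P += Q; }              // static-state prediction
    void update(double z, double R) {       // z: measurement, R: its variance
        double K = P / (P + R);             // Kalman gain
        x += K * (z - x);
        P *= (1.0 - K);
    }
};

int main() {
    Kalman1D kf;
    // Fuse a less reliable reading (R = 0.25) with a more reliable one (R = 0.04)
    kf.predict(); kf.update(1.30, 0.25);
    kf.predict(); kf.update(1.10, 0.04);
    std::printf("fused estimate: %.3f m (variance %.4f)\n", kf.x, kf.P);
    return 0;
}
```

The less noisy measurement (smaller R) receives the larger gain, which is the essence of how the filter arbitrates between sensors of differing reliability.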


Han et al. (2002) used a Kalman filter on DGPS (Differential Global Positioning System) data to improve positioning accuracy for parallel tracking applications; the filter smoothed the data and reduced the cross-tracking error. Based on the good results obtained in previous research for fusion and noise reduction, a Kalman filter was selected in this work as the method to perform the fusion and filter the noise in the sensor measurements. The use of a Kalman filter with fixed parameters has drawbacks. Divergence of the estimates is a problem sometimes encountered, wherein the filter continually tries to fit a wrong process. Divergence may be attributed to system modeling errors, noise variances, ignored bias and computational round-off errors (Fitzgerald, 1971). Another problem in using a simple Kalman filter is that the reliability of the information from the sensors may depend on the type of path; therefore, if a white noise model is assumed for the process and measurements, the reliability of each sensor in the Kalman filter has to be constantly updated. Abdelnour et al. (1993) used fuzzy logic to detect and correct the divergence. Sasladek and Wang (1999) used fuzzy logic with an extended Kalman filter to tackle the problem of divergence for an autonomous ground vehicle and presented good simulation results; the extended Kalman filter reduced the position and velocity error when the filter diverged, and the use of fuzzy logic also allowed a lower-order state model to be used.

Pure Pursuit Steering Control

Pure pursuit is a method for steering vehicles (land, sea or air). In this method, the vehicle is deemed to follow or pursue a point in front of it and is therefore steered such that the vehicle's nose points to the point being pursued. Pure pursuit is similar to human driving, as humans tend to drive towards some imaginary point in front of the vehicle at each instant, i.e. pursuing a point. It is therefore very intuitive to implement in an autonomous vehicle.


In this method, a look-ahead point is chosen and an arc is constructed joining the present vehicle location and the look-ahead point. The vehicle is then steered to follow the arc. The method has been in use for several years in different guidance systems. Some of the early work in using pure pursuit for autonomous vehicle steering was in the Carnegie Mellon Terragator and NavLab projects (Coulter, 1992). Pure pursuit guidance has been used for unmanned aerial vehicles (UAVs) by Enomoto et al. (2007); in their research a UAV chased an aircraft using pure pursuit.

Figure 2-8. UAV using pure pursuit (Enomoto et al., 2007)

Petrinec et al. (2003) used pure-pursuit-based path following for autonomous vehicles in a multi-vehicle simulator of a factory shop floor. They point out that the look-ahead distance is crucial for proper operation of the vehicle when using pure pursuit: shorter look-ahead distances may cause oscillation, whereas longer look-ahead distances, though smoother, may take the vehicle longer to get back onto the path. As they had to deal with varying look-ahead distances, they approximated large look-ahead distances using piecewise-linear smaller look-ahead distances, which reduced the lag due to the larger look-ahead distance.
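A minimal C++ sketch of the geometric core of pure pursuit is given below: the curvature of the arc joining the vehicle to a look-ahead point expressed in the vehicle frame (x lateral, y forward), followed by a steering angle from a bicycle-model approximation. The wheelbase and goal point are hypothetical values.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    double gx = 0.5, gy = 4.0;                 // look-ahead point in m (assumed)
    double Ld = std::sqrt(gx * gx + gy * gy);  // look-ahead distance
    double kappa = 2.0 * gx / (Ld * Ld);       // curvature of the pursuit arc (1/m)
    double wheelbase = 2.5;                    // m (assumed)
    double delta = std::atan(kappa * wheelbase);  // steering angle (rad)
    std::printf("curvature %.3f 1/m, steering %.1f deg\n",
                kappa, delta * 180.0 / 3.14159265);
    return 0;
}
```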


Headland Turning Applications

The headland in citrus groves is the space available at the end of each tree row; this space is generally used for turning the vehicles navigating the rows. Headlands are characterized by a ground space of at least 4.25 m after the last tree in each row, usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of the grove, more open ground, or some combination of both. An autonomous vehicle operating in citrus groves needs to navigate the headlands to turn into and navigate subsequent rows. Miller et al. (2004) reported that the space required to turn at the headland and the time spent in turning significantly affect field efficiency. They also report that damage to crops due to improper headland turns affects productivity. Kise et al. (2002) tested headland turning using a spline-function path planning method on a tractor with a GPS-based guidance system; the method guided the vehicle along the desired path with less than 20 cm of lateral deviation. Hague et al. used vision-based guidance in combination with dead reckoning to guide a vehicle through crop rows. The guidance system detected the end of the crop rows using the absence of crop-row information from vision together with distance information from dead reckoning, and the vehicle was able to execute a u-turn at the headland. Hansen et al. (2005) analyzed headland turns for a combine in a field, considering loop turns (u-turns) in particular. The turns were executed for entering successive crop rows or for entering a row after skipping a few rows. They also developed a model to represent the travel behavior and verified it with experimental headland turns; using this model, the turns were optimized for speed, time spent in turning and the number of rows to be skipped. Noguchi et al. (2001) implemented a switch-back turning maneuver for a tractor to turn at the headlands of crops. The maneuver was implemented using DGPS guidance and a spline function for turning, and precise turns were observed in both simulation and experimentation. Oksanen et al. (2004) studied the behavior of tractor-trailer systems in


the headlands of crops. They attempted to solve the system's turning using optimal control, trying both u-turns and switch-back turns. The behaviors were analyzed by simulation alone, and improvements were suggested using numerical methods.

Figure 2-9. Switch-back turning implemented using spline functions (Noguchi et al., 2001)

Path Planning

Path planning aids the mobile robot in traveling from its starting point to its final task. During navigation, conditions might be encountered that require the shortest travel distance, a safe path, etc.; path planning makes it possible to take care of these requirements. Motion trajectories for robots are usually obtained by teaching from a human operator or by path planning on a computer. Use of teaching-playback robots is thus restricted to controlled environments, like mass-production factories. Geographic Information Systems (GIS) are widely used for mapping (Noguchi et al., 2002) and path planning. Traditional path planners normally use heuristics or a performance index for path specification. A digital image of the route map can be used for path planning (Speigle et al., 1995): the path is segmented prior to the actual navigation and marked for landmarks, and then an optimal path is calculated.


Fu et al. (2004) developed a path planning algorithm based on Dijkstra's algorithm for shortest-path search. The algorithm searches a restricted area and makes use of the spatial features of the road network.

Current Commercial Systems

There are several commercial autonomous vehicles available for field applications. Most of these systems use GPS positioning to aid in straight driving; large farms employ them for large-scale spraying and harvesting of crops. These vehicles provide visual feedback of the vehicle's deviation from the GPS-guided path, so the farmer can get feedback about his driving direction. This method has increased the efficiency of operation on large farms. Most of these systems are limited to straight-path guidance in open fields. They are also not suitable for citrus groves, since a receiver in a grove does not properly receive the satellite signals, which can be blocked by the tree canopy. John Deere, Moline, IL and iRobot Corp., Boston, MA are developing semi-autonomous vehicles for military use, based on a John Deere Gator vehicle. John Deere is also developing an autonomous tractor without a cab and steering wheel; a lot of weight could be saved by eliminating these two elements. Omnitech Robotics, Englewood, CO, has developed several tele-operated off-road vehicles, including a tractor. The present commercial systems are opening a new path for advanced autonomous vehicles, and the farmers employing these systems at present are more likely to be receptive to a fully autonomous system. A lot of research is still required to develop a vehicle capable of autonomous behavior in orchards and citrus groves. Present commercial systems often employ GPS as the lone sensor; the use of additional sensors could make these vehicles more autonomous and applicable in several areas ranging from crops to orchards.


CHAPTER 3
EXPERIMENTAL METHODS

This chapter describes the hardware architecture and the theoretical development of the various components of the autonomous vehicle guidance system for navigating citrus groves.

Sensors, Control and Hydraulic Architecture of the Tractor Guidance System

Hardware Architecture

The major components of the autonomous navigation system are the sensors for monitoring the environment of the autonomous vehicle, the computer for collecting information from the sensors and providing the autonomous operation, and the actuators for actuating the steering. From the literature, it is observed that machine vision has been very effective in vehicle navigation, and with the availability of fast computers, vision is a viable and promising option. For this research, a vision system is used to segment the path in the citrus grove alleyway and guide the vehicle. Laser radar is a highly accurate sensor for measuring the distance to objects; such accuracy cannot always be expected with vision. A laser radar was available for this project from previous research and has been reported in the literature as a good distance-measuring sensor, so it is also used as one of the guidance sensors. A common PC is used for processing the information from the sensors and implementing the high-level guidance algorithms. A microcontroller is used for performing the low-level control and high-speed operations which may not be possible using the PC. For controlling the steering of the vehicle, a servo valve was used to control the hydraulic steering. For good control of the steering, feedback of the steering angle is essential; a rotary encoder, available in the laboratory at the start of the project, was used as the steering angle feedback sensor.


The architecture of the vehicle consists of the vehicle and the sensors and actuators interfaced with the PC and the microcontroller. The ladar and the vision sensors are interfaced with the PC, whereas the hydraulic valve and the encoder are interfaced with the microcontroller. The PC processes the vision and ladar information and then sends the required vehicle lateral displacement information to the low-level controller. The low-level controller receives the desired displacement information from the PC, then computes and sends the necessary signal to the hydraulic valve using the control algorithm. It also receives steering angle information from the encoder for the control algorithm.

Figure 3-1. Architecture of the tractor guidance system

Vehicle

For navigating citrus groves, the autonomous vehicle was expected to operate in harsh outdoor environments most of the time, so the platform should have good off-road qualities.


Uneven paths are encountered in a citrus grove, so strong driving motors are also necessary, and the capacity of the vehicle power supply should provide for long hours of operation. The vehicle platform is expected to meet these requirements. Adding the new technology to vehicles currently in production is expected to be less expensive than designing a totally new vehicle. Moreover, a farmer is more likely to adopt an addition to an existing technology rather than a totally new design. Present-day tractors already possess the required off-road capabilities, so retrofitting an existing tractor is a viable option. The development of the autonomous vehicle guidance system was started in 2003. At that time, a John Deere 6410 tractor was readily available in the field automation laboratory of the Department of Agricultural and Biological Engineering, and it already possessed the off-road ability required for the autonomous vehicle. Significant cost and time could be saved by starting the development on the tractor rather than custom designing a new vehicle. A tractor has many advantages in that it is a commercially manufactured, reliable, multi-purpose, technically supported product. All the mechanical and hydraulic systems conform to recognized standards, the autonomous guidance system can be retrofitted, and the human control interfaces can be modified to include actuators. Therefore, development of the guidance system began with converting the tractor to operate autonomously.

Figure 3-2. John Deere 6410 tractor used for autonomous vehicle guidance development


Autonomous Steering

The tractor uses a hydrostatic steering mechanism, illustrated below.

Figure 3-3. Hydrostatic steering mechanism of the tractor

In this steering mechanism, the human operator turns the steering wheel, which is mechanically coupled to the steering valve. The steering valve can be thought of as a direction control valve: when the steering wheel is turned, the valve changes from no steering to flow in one direction or the other. The hydraulic fluid flows through the steering valve and moves the piston in the steering cylinder, which can be thought of as a double-acting cylinder. Depending on the flow of fluid controlled by the steering valve, the piston in the steering cylinder is moved to the left or to the right. The wheels of the tractor are mechanically coupled to the shaft of the steering cylinder on either side, so this motion turns the tractor's steered wheels to the right or to the left. For the tractor to operate autonomously, it should be capable of steering by receiving electrical signals from a computer. At the same time, it is important to retain the ability to switch to manual control whenever necessary. Since most of the hydraulic circuits in the tractor are proprietary, the major concern was that the existing system should not be tampered with, so that manual capability is not lost. Therefore a steering mechanism was developed to make the steering automatic. The hydraulic circuit for this conversion is illustrated


below. In Figure 3-4, the hydrostatic steering mechanism of the tractor is shown on the left. For autonomous steering capability, an electro-hydraulic servo valve was connected in parallel to the existing circuit, as illustrated in the right half of Figure 3-4. The specifications of the servo valve are given in the next section. The servo valve is similar to the steering valve described above; the difference is that the servo valve takes analog voltage signals to control the flow of fluid, instead of the mechanical turning of the steering wheel by the driver. An electrical switch has been connected to the servo valve. The switch, when turned on, makes the servo valve operational. Once the servo valve is turned on, it is ready to accept electrical signals from the computer to control the flow of fluid. The direction of fluid flow changes the steering to the left or right, as described for the hydrostatic steering. Since the valves are closed-center, when one valve is in operation, flow of fluid through the other valve is blocked. Further, the existing steering circuit includes check valves which prevent reverse flow of fluid through the valve when it is not operational. It should be noted that making the servo valve operational by turning on the switch transfers complete control to autonomous steering; only when the switch is turned off is manual steering again made operational. Therefore the human driver can choose manual or autonomous steering as required. The supply and return fluid lines for the servo valve are tapped from the front end loader supply system (Figure 3-5). Figure 3-6 shows the retrofitted circuit with the servo valve. As observed, the output from the servo valve is directed to the steering cylinder using a T joint, along with the existing steering lines.


Figure 3-4. Electro-hydraulic circuit for autonomous steering

Figure 3-5. Rear end of the tractor showing the supply to the servo valve


Figure 3-6. View under the cab showing the servo valve circuit

Servo Valve

The servo valve described in the previous section is an electrically controlled, load-sensing servo valve from Sauer-Danfoss. It is a direction-control servo valve with important features including:
- Low weight
- Built-in pressure relief
- 11 V to 32 V supply voltage
- 3-position, closed-center operation
The valve is also manually operable using a lever key.


Table 3-1. Specifications of the electro-hydraulic valve

Supply voltage range (5% ripple)                              11 V to 32 V
Current consumption at rated voltage                          0.57 A at 12 V supply
Signal voltage, neutral                                       0.5 x supply voltage
Signal voltage, A port <-> B port                             0.25 to 0.75 x supply voltage
Power consumption                                             7 W
Hysteresis at rated voltage for one cycle                     4%
Reaction time, neutral to maximum spool, at constant voltage  0.2 s
Oil viscosity range                                           12-75 mm²/s
Maximum startup viscosity                                     2500 mm²/s
Oil temperature range                                         30-60 °C


Figure 3-7. Servo valve showing the main parts (Source: Sauer-Danfoss PVG 32 valve documentation)


Guidance Sensor: Camera

A video camera was chosen as one of the primary guidance sensors. The camera enables the guidance system to view the environment in front of the vehicle and segment the path from the background; from this information, the vehicle can navigate the desired path. The video camera selected for this application was the Sony FCB-EX780S block camera, which is commonly used in commercial camcorders. Its analog video output is standard NTSC with both composite and s-video signal output, and it is classified as high resolution with 470 TV lines. The camera has a vast number of adjustable features, including 25x optical zoom, focus, white balance, iris, shutter speed, etc. All of the functions can be controlled via RS-232 serial communications, and many functions can be used in manual or fully automatic mode. The camera was purchased without a commercial casing, so it was enclosed in a custom-built aluminum casing with mounts for attaching it to the vehicle.

Figure 3-8. Video camera, one of the primary path-finding sensors


Figure 3-9. Camera mounted on top of the tractor cab

Figure 3-9 shows the camera mounted on top of the tractor cab. Mounting the camera at a high position gives the vision system a good view of the path for segmentation. The camera is mounted at an angle of 30° to the horizontal. With this mounting configuration, the field of view of the camera ranged from a distance of 5 m on the ground in front of the vehicle to the horizon of the path. The camera always initialized to the automatic settings on startup, which included automatic shutter speed, automatic aperture control and 1x zoom.

Frame Grabber Board

The frame grabber is a PCI version of the FlashBus MV Pro, which was added to the computer managing the overall guidance system. The frame grabber converts the analog NTSC video signal received from the camera described above to a digital 640 x 480 RGB bitmap image. The frame grabber is equipped with a video scaling processor that allows for PCI direct memory access (DMA), which enables the digital image to be placed directly into system memory without the need for any CPU cycles. The frame grabber also has several features that


include RS-232 serial communications, 12 V power, a digital-to-analog converter, and programmable look-up tables for manipulation of the RGB image.

Figure 3-10. Frame grabber board

A custom cable with the appropriate connector was attached to the video camera to interface with the frame grabber. Software was written to communicate with the frame grabber and obtain images to be processed to determine the path for vehicle navigation.

Guidance Sensor: Laser Radar

The laser radar (ladar) system used in this study consisted of a Sick LMS-200 ladar sensor (Sick AG, Waldkirch, Germany). The LMS-200 is a 180-degree, one-dimensional sweeping laser which can measure at 1.0/0.5/0.25-degree increments with a 10 mm range resolution and a maximum range of up to 80 m. The LMS-200 can be operated at 38.4 kBaud using the RS-232 protocol, giving a sweep refresh rate of approximately 7 Hz, or at 500 kBaud using the RS-422 protocol, giving a sweep refresh rate of approximately 35 Hz. For operation at 500 kBaud, a special high-speed serial communication card was used. The ladar is powered by a 24 V external power supply.


Figure 3-11. Laser radar and its mount. A) Laser radar. B) Laser mounted on top of the tractor.

The laser radar is mounted on top of the tractor cab just below the camera. It is also positioned at 30° to the horizontal, like the camera. This position allows the laser radar to scan the area in front of the tractor for obstacles. The laser radar was used as the second guidance sensor to determine the path for navigation.

Encoder

For the purpose of feeding back the steered wheel angle to the control system, a rotary encoder was used. A Stegmann HD20 Heavy Duty encoder was selected.

Figure 3-12. Stegmann HD20 encoder

The major features are:
- 1024 pulses/revolution
- 8-24 V input/output


- 0.23 kg weight, 2-inch diameter
- Maximum speed of 3000 rpm at high load
- Output: two square waves in quadrature

Figure 3-13. Output from the encoder shows two waves in quadrature

As shown in Figure 3-13, by measuring the phase shift between the two square waves, the direction of rotation can be calculated, and the number of pulses after the last counted pulse gives the amount of rotation.

Figure 3-14. Encoder mounted on the tractor axle
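The following minimal C++ sketch shows the standard way such quadrature signals are decoded in software: a state table maps each (previous, current) pair of A/B levels to a +1 or -1 count, yielding both the direction and the amount of rotation. The sampled waveform in main() is illustrative only.

```cpp
#include <cstdio>

// Lookup indexed by (prevA, prevB, currA, currB); +1/-1 = valid step,
// 0 = no movement or an invalid (skipped) transition.
static const int QUAD_TABLE[16] = {
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0 };

long position  = 0;
int  prevState = 0;  // 2-bit state: (A << 1) | B

void onSample(int a, int b) {  // call at each sampling instant
    int curr = (a << 1) | b;
    position += QUAD_TABLE[(prevState << 2) | curr];
    prevState = curr;
}

int main() {
    int a[] = {0, 0, 1, 1, 0}, b[] = {0, 1, 1, 0, 0};  // one full cycle
    for (int i = 0; i < 5; ++i) onSample(a[i], b[i]);
    std::printf("net counts: %ld\n", position);  // 4 counts per full cycle
    return 0;
}
```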


The encoder was mounted on the shaft of the left wheel using a linkage assembly, as shown in Figure 3-14. When the wheels are steered, the encoder assembly rotates, and the signal from the encoder can be counted to measure the steered angle. The encoder, along with the linkage, was calibrated: the amount of rotation of the encoder shaft differs from the amount of rotation of the wheel because of the linkage assembly connecting the encoder and the wheel.

Computer

An industrial PC was used for performing the high-level algorithm processing and managing the overall guidance system. The PC used in this research was the Advantech PCA-6106P3-C with a Pentium 4, 2.4 GHz processor and two RS-232 serial ports. It was mounted in the cabin next to the driver's seat along with the keyboard and monitor. The operating system used was Windows 2000 Professional. The software used for implementing the algorithms for vision and laser radar was Microsoft Visual C++ .NET, which allows the creation of a GUI for analyzing the algorithm behavior in real time. One of the drawbacks of using Windows is that it requires about 15 ms between successive runs of the algorithms for housekeeping. This processing speed is sufficient for the vision and ladar algorithms, but it is too slow to process the information from higher-speed sensors such as rotary encoders.

Figure 3-15. Instruments mounted in the tractor cabin. A) Computer. B) Computer, monitor and keyboard mounted in the cabin.


Microcontroller

Given the rate of steering turn of the vehicle, a high sampling rate of the encoder was anticipated. Initially, the encoder was interfaced using a National Instruments data acquisition board connected to the computer. In initial laboratory tests to measure the output from the encoder, the encoder shaft was manually rotated and the pulses were read; it was found that high sampling rates could not be achieved using this interface setup. It was also anticipated that the operating speed of the Windows operating system might not be fast enough for high-speed control. Therefore, a TERN microcontroller was used for the low-level, high-speed control operations. The microcontroller used in this research was the 586-Engine controller board with a P50 expansion board from TERN Inc. It is a C++ programmable controller board based on a 32-bit 100/133 MHz AMD Elan SC520 microprocessor. Its important features are as follows:
- 32 PIOs
- 7 timers
- 19-channel, 12-bit ADC
- 8-channel, 12-bit DAC
- 2 quadrature decoders
- 8 opto-couplers
- 14 high-voltage I/O
- Serial port and Ethernet
The microcontroller was used for processing the encoder data and for controlling the servo valve for steering by sending analog voltages. It was mounted in the tractor cabin below the PC in a closed box.

Figure 3-16. Microcontroller


Amplifier Circuit

The servo valve requires a control voltage in the range of Vs/2 - 3 V to Vs/2 + 3 V, where Vs is the supply voltage; if the supply voltage is 12 V, the control voltage is expected in the range of 3 V to 9 V. The off, or closed, position of the valve is therefore at 6 V, the middle value. Voltages from 6 V to 9 V correspond to flow of fluid in one direction, and 6 V to 3 V to the opposite direction. The microcontroller analog output has a range of 0-2.5 V, so an amplifier circuit was necessary to amplify and shift the voltage from 0-2.5 V to 3-9 V. The circuit used an LM324 op-amp as the amplifier. The 12 V supply for the circuit is taken from the tractor's battery. The input to the circuit is the 0-2.5 V analog output from the microcontroller, which is scaled to a 3-9 V analog voltage at the output of the circuit; this is the control voltage for the servo valve. The tractor battery voltage is observed to vary from 12 V to 13 V after starting the tractor or when the throttle is increased. The amplifier circuit takes the same supply power as the hydraulic valve, so whatever the supply voltage from the tractor battery is, the circuit scales accordingly.

Figure 3-17. Amplifier circuit
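The mapping the circuit implements can be summarized with a small C++ sketch: expressed relative to the supply voltage Vs, the 0-2.5 V DAC range maps onto Vs/4 to 3Vs/4 (3-9 V for a 12 V supply), consistent with the signal-voltage entries in Table 3-1. The function name is illustrative, not from the project software.

```cpp
#include <cstdio>

// Map a DAC voltage (0..2.5 V) to the valve control range Vs/4 .. 3Vs/4.
double dacToValve(double vDac, double vSupply = 12.0) {
    double span = vSupply / 2.0;              // 6 V span for a 12 V supply
    return vSupply / 4.0 + (vDac / 2.5) * span;
}

int main() {
    std::printf("0.00 V -> %.2f V\n", dacToValve(0.0));   // 3 V: full flow one way
    std::printf("1.25 V -> %.2f V\n", dacToValve(1.25));  // 6 V: valve closed
    std::printf("2.50 V -> %.2f V\n", dacToValve(2.5));   // 9 V: full flow other way
    return 0;
}
```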


DGPS Receiver

A DGPS receiver was used to measure the vehicle displacement while conducting tests to determine the dynamics of the vehicle. A John Deere StarFire differential GPS receiver was used.

Figure 3-18. DGPS receiver mounted on top of the tractor cab

RS-232 serial communication was used to communicate with the DGPS receiver, and position information was obtained at a rate of 5 Hz. According to the manufacturer's specifications, it has a real-time accuracy of better than 1 ft. A 24 V external battery was used to power the DGPS receiver.

Power Supply

The PC, the monitor and the laser radar require a supply voltage of 120 V AC, while the hydraulic valve and the microcontroller require 12 V DC. Power from the tractor battery is tapped and given as input to an inverter, which supplies the PC, the monitor and the laser radar. The supply for the microcontroller and the hydraulic valve is taken from the cigarette lighter power source in the tractor cabin.


Figure 3-19. Inverter mounted in the tractor cabin

Inertial Measurement Unit

In this research, an Inertial Measurement Unit (IMU) was used to measure the direction of travel of the vehicle. The IMU used was the 3DM tri-axial IMU from MicroStrain, USA.

Figure 3-20. Inertial Measurement Unit used in this research

This IMU is a 3-axis sensor measuring 360° of yaw, 360° of pitch and 140° of roll. The measurement is made using arrays of magnetometers and accelerometers. The RS-232 serial communication standard was used to obtain the measurement values from the sensor. The IMU requires a supply voltage of 5.3 V to 12 V. Calibration of the IMU was performed before using it in the vehicle; calibration should be repeated when mounting the IMU directly over metal. The IMU was mounted on the computer inside the tractor cabin. The IMU was mounted such


that the yaw measurement is zero when the rear of the vehicle points towards geographic north. The sampling rate obtained from the sensor using the guidance program was about 40 Hz.

Speed Sensor

An ultrasonic speed sensor was used to measure the forward speed of the tractor. The sensor used was the Trak-Star from Micro-Trak Systems, Inc., USA. The sensor was mounted under the tractor body such that the ultrasonic waves emitted from the sensor hit the ground. The output from the sensor was a digital voltage calibrated against speed by the manufacturer. The sensor was interfaced using a PIC microcontroller to convert the digital signal and provide an output conforming to the RS-232 serial communication standard. The speed information was read over serial communication by the guidance program.

Experimental Approaches and Theoretical Development

Vehicle Dynamics, Calibration and Steering Control of Tractor

Valve calibration

The PVG 32 proportional valve used for steering control is a high-performance servo valve. The amount of opening of the valve, and thereby the wheel turn, depends upon the control voltage applied and the time for which it is applied. The effect of phenomena like hysteresis is compensated by the built-in servo control in the valve, which positions the spool exactly at the commanded position. To understand how the wheels turned with voltage and time, calibration tests were conducted; these calibrations are important for the control system implementation. Tests were conducted to build a look-up table of voltage, time and the corresponding wheel angle. In this test, the tractor engine was started and the vehicle was kept stationary. A program was written in the microcontroller to send a constant analog voltage to the servo valve. The wheels were allowed to turn 100 degrees and the valve was then turned off, or closed, and the time required to turn through this angle was measured. Tests were conducted at various constant voltages.


A voltage vs. time plot was constructed, as shown in Figure 3-21. From the plot, it can be observed that the true closed position of the valve is at 6.8 V rather than exactly at 6 V. This could be due to the variation of the tractor supply voltage from 12 V to 13 V, or due to the dynamics of the valve itself. It can also be observed that for voltages further from the center value, the time required to turn is less, as expected. The shortest time is 0.5 s, at 8.5 V and 5.5 V. Voltages beyond these were not applied, as such high speeds of turn are not expected of the tractor during autonomous guidance. For voltages between 6.5 V and 7.2 V, there was no significant turn of the tractor wheels. It is also observed that the right turn has a larger range of voltages compared to the left turn; this could be due to the dynamics of the valve and of the various components of the hydraulic circuit.

Figure 3-21. Voltage and time required to turn 100 degrees

It is to be noted that the flow of hydraulic oil in the lines depends on the viscosity of the oil, which in turn depends on its temperature. The viscosity of the oil varies from 12 mm²/s to 75 mm²/s as the oil heats up. During the course of the experiments, it was observed that an initial period of about 15 min after starting is required for the oil to reach a constant viscosity. For experiments conducted in the field, the tractor was driven out to the field; this drive was sufficient to heat the oil, so no startup time was required before each experiment.
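Returning to the calibration: a look-up table of this kind is typically used by interpolating between measured points, as in the following minimal C++ sketch. The table values here are illustrative placeholders, not the measured calibration data.

```cpp
#include <cstdio>

const int N = 4;
double timeSec[N] = {0.5, 0.75, 0.9, 1.25};  // time to sweep the wheels (s)
double voltage[N] = {8.5, 8.0,  7.8, 7.5 };  // corresponding valve voltage (V)

// Linearly interpolate the valve voltage for a desired turn duration.
double lookupVoltage(double tWanted) {
    if (tWanted <= timeSec[0])     return voltage[0];
    if (tWanted >= timeSec[N - 1]) return voltage[N - 1];
    for (int i = 0; i < N - 1; ++i)
        if (tWanted <= timeSec[i + 1]) {
            double f = (tWanted - timeSec[i]) / (timeSec[i + 1] - timeSec[i]);
            return voltage[i] + f * (voltage[i + 1] - voltage[i]);
        }
    return voltage[N - 1];
}

int main() {
    std::printf("voltage for a 0.8 s sweep: %.2f V\n", lookupVoltage(0.8));
    return 0;
}
```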


It should also be noted that the valve dynamics frequency is much higher than the vehicle dynamics frequency, so the effect of the valve dynamics on the vehicle control can be ignored without severely degrading performance.

Encoder calibration

The next step was the calibration of the encoder and the linkage assembly. As discussed earlier, the encoder outputs 1024 pulses per revolution. The encoder position had to be calibrated against the wheel angle; this calibration is necessary for the steering control.

Figure 3-22. Encoder calibration

The tractor was positioned in the laboratory, and angular positions were marked on the ground as shown in Figure 3-22. From the center position, the steering wheel was rotated to position the wheels at different angles. The number of pulses to reach each angle was recorded using a program written in the microcontroller to decode the pulses from the encoder. The angle range for this calibration was from 0 to 45 degrees. The wheel center was treated as zero,


Vehicle dynamics

The vehicle dynamics had to be determined for proper design of the control system. The vehicle dynamics relate the lateral motion of the vehicle to steering inputs. Open loop frequency response tests were conducted to explore the vehicle dynamics; the open loop frequency response test is well suited to modeling transfer functions from physical data.

Open loop frequency response. The frequency response is a representation of the system's response to sinusoidal inputs of varying frequency. For the open loop frequency response, an algorithm was implemented in the microcontroller to move the wheel angle as a time-based sinusoid. The voltage computed for the required frequency of turn was sent from the microcontroller to the valve, which turned the wheels at the various instances of time, while the encoder fed back the wheel angle. This closed loop control of the wheel angle caused the vehicle to execute a sinusoidal motion. For the tests, the engine was set at a constant speed of 2200 rpm. Tests were conducted in an open field at three vehicle forward speeds of 4 mph, 7 mph and 10 mph, representative of the operating speeds expected while navigating citrus groves. The frequency was varied from 0.07 Hz to 0.3 Hz. The lowest frequency was chosen as 0.07 Hz because at lower frequencies and maximum speed, the vehicle veered out of the test area; above 0.3 Hz, the period of the wheel turn was so short that a very high turn velocity would have been required, which is not suitable for the tractor. These constraints limited the bandwidth over which tests could be conducted. Time, wheel angle and vehicle position were recorded, with the DGPS receiver used to record the vehicle position.
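The excitation itself is straightforward; the Python sketch below shows the form of the time-based sinusoidal wheel-angle command, assuming a hypothetical 15 degree amplitude (the amplitude used in the tests is not stated here).

# Sketch of the time-based sinusoidal wheel-angle setpoint used to excite
# the vehicle during the open-loop frequency-response tests.
import math

AMPLITUDE_DEG = 15.0        # hypothetical excitation amplitude

def commanded_wheel_angle(t, freq_hz):
    """Wheel-angle setpoint (degrees) at time t for a given test frequency."""
    return AMPLITUDE_DEG * math.sin(2.0 * math.pi * freq_hz * t)

# Example: sample the 0.2 Hz excitation at 10 Hz for the first second
setpoints = [commanded_wheel_angle(k / 10.0, 0.2) for k in range(10)]
print(["%.1f" % s for s in setpoints])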


There were no tall trees or buildings near the vehicle in the test area to interfere with the GPS signals reaching the receiver. Using the position information collected with the DGPS receiver, the position of the vehicle was plotted in ArcView, a software package commonly used for analyzing GPS data.

Table 3-2. Open loop frequency response tests at various speeds and frequencies
                         Frequency (Hz)
Speed (mph)   0.07   0.08   0.09   0.1   0.2   0.3
4             X      X      X      X     X     X
7             X      X      X      X     X     X
10            X      X      X      X     X     X

Figure 3-23. Vehicle position plotted in ArcView (0.2 Hz, 10 mph)

The plot of vehicle position in Figure 3-23 shows the sinusoidal lateral movement of the vehicle. It was observed that the vehicle's lateral response contains a sinusoidal component at the frequency of the steering excitation. ArcView provides a tool to measure distances on its plots. Using this tool, the gain, that is, the change in the position of the vehicle from a positive peak to the successive negative peak, was measured as shown in Figure 3-24.


Figure 3-24. Measuring gain from the position plot

This gain corresponds to the particular frequency at which the tractor was steered. The same calculation was performed for all the frequencies and speeds used in the test, and the gains were plotted against frequency (Figure 3-25), giving the open loop frequency response for 4 mph, 7 mph and 10 mph. As observed, the frequency response was relatively linear. The gain relates the lateral deviation d(s) of the vehicle to the steering angle change a(s).

Figure 3-25. Open loop frequency response for 4 mph, 7 mph and 10 mph

This relationship is represented by the transfer function

G(s) = d(s)/a(s) (Eq. 3-1)
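Each gain value above was measured manually in ArcView; a programmatic analogue, sketched in Python below, finds the positive and negative peaks of a logged lateral-deviation record and averages the peak-to-trough excursions. The peak detection by sign changes is an assumption about how the manual measurement might be automated.

# Sketch of extracting the peak-to-peak "gain" from logged lateral-deviation
# samples, analogous to measuring the distance between a positive peak and
# the following negative peak on the ArcView plot.
import numpy as np

def peak_to_peak_gain(lateral_dev):
    """Mean positive-to-negative peak excursion of a roughly sinusoidal track."""
    y = np.asarray(lateral_dev, dtype=float)
    d = np.diff(y)
    # local maxima and minima located by sign changes of the first difference
    peaks   = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
    troughs = np.where((d[:-1] < 0) & (d[1:] >= 0))[0] + 1
    gains = [y[p] - y[troughs[troughs > p][0]]
             for p in peaks if np.any(troughs > p)]
    return float(np.mean(gains)) if gains else 0.0

# Example with a synthetic 0.2 Hz sinusoid sampled at 10 Hz
t = np.arange(0, 20, 0.1)
print(peak_to_peak_gain(1.5 * np.sin(2 * np.pi * 0.2 * t)))  # ~3.0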


Determining the Theoretical Model. From the open loop frequency response, it was observed that the vehicle's lateral response has the same frequency as the steering excitation. It is also known that when a constant input is given, that is, if the steering angle is held constant, the lateral response initially grows parabolically, and a parabolic response to a constant input is exhibited by a double integrator. A double integrator also introduces a 180 degree phase shift in the response: if the input is a sine function, the output is a sine function shifted by 180 degrees, in other words a negative sine function. From the bicycle model discussed in the literature review, it is known that the vehicle dynamics model has two poles. This further corroborated that the vehicle model could be represented by a double integrator. Hence, the transfer function was determined to be of the form

G(s) = K/s^2 (Eq. 3-2)

The value of the gain K depends on the forward speed of the vehicle. The frequency response of the transfer function G(s) was plotted for different values of K to fit the actual response (Figure 3-26). K was found to be 0.02, 0.07 and 0.13 for 4 mph, 7 mph and 10 mph respectively. It can be concluded that, under otherwise identical conditions, an increase in velocity can be represented by an increase in the gain K. This model is similar to the bicycle model, except that the bicycle model has two additional pole-zero pairs.

Figure 3-26. Theoretical model for the three speeds
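For reference, the gain of this model at any test frequency can be checked numerically; the Python sketch below evaluates G(s) = K/s^2 with the fitted K values using scipy.

# Sketch of the fitted double-integrator model G(s) = K / s^2 and its gain
# at a test frequency, using the K values reported for each speed.
import numpy as np
from scipy import signal

K_BY_SPEED = {4: 0.02, 7: 0.07, 10: 0.13}   # mph -> fitted gain K

def model_gain_db(K, freq_hz):
    """Magnitude of K/s^2 in dB at a given excitation frequency."""
    w = 2.0 * np.pi * freq_hz
    sys = signal.TransferFunction([K], [1.0, 0.0, 0.0])   # K / s^2
    _, mag_db, _ = signal.bode(sys, w=[w])
    return float(mag_db[0])

for mph, K in K_BY_SPEED.items():
    print(mph, "mph:", round(model_gain_db(K, 0.2), 1), "dB at 0.2 Hz")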


To test the validity of the proposed theoretical model, it was created in Simulink and simulated. The model, shown in Figure 3-27, consists of two integrator blocks connected to a gain; the input is a signal generator and the output is a scope. Sinusoidal wheel angle inputs were given at various frequencies, and the experimental results from the open loop frequency response tests were found to agree with the simulated results, as shown in Figure 3-28.

Figure 3-27. Simulink model for the vehicle dynamics
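A scripted analogue of this Simulink validation, sketched below in Python, drives the fitted double integrator with a sinusoidal steering input; the 15 degree amplitude is a placeholder.

# Time-domain equivalent of the Simulink validation: drive the fitted
# double-integrator model with a sinusoidal wheel-angle input and inspect
# the simulated lateral response.
import numpy as np
from scipy import signal

K = 0.13                               # fitted gain for the 10 mph model
sys = signal.TransferFunction([K], [1.0, 0.0, 0.0])   # G(s) = K / s^2

t = np.arange(0.0, 30.0, 0.05)         # 30 s simulation at 20 Hz
u = np.deg2rad(15.0) * np.sin(2 * np.pi * 0.2 * t)    # 0.2 Hz steering input
_, y, _ = signal.lsim(sys, U=u, T=t)

# The response is a sinusoid at the excitation frequency, inverted relative
# to the input (the 180 degree phase shift of the double integrator).
print("peak lateral response: %.2f" % np.max(np.abs(y)))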


Figure 3-28. Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz)

Control system

Based on the open loop frequency response tests described previously, a control system was designed to control the steering angle. PID control has been found in the literature to work well for vehicle guidance, and it is a simple and reliable method; hence a PID control system was chosen for implementation (Figure 3-29).

Figure 3-29. Guidance control block diagram


The steering control system, implemented in the microcontroller, interfaces between the PC and the servo valve. The input to the PID controller is the lateral displacement error of the vehicle from the center of the path to be navigated, and its output is the steering angle required to reduce this error. This desired angle is applied as an analog voltage to the servo valve, while the actual steering angle is fed back to the PID controller by the rotary encoder, forming a closed steering control loop. The lateral error of the vehicle from the path center is fed back by the primary guidance sensors, namely vision and ladar; this error feedback forms the overall closed loop of the guidance system. The control system was first designed and simulated in Simulink, since simulation can give good estimates of the controller gains, which can then be further tuned on the actual vehicle. The Simulink model is shown in Figure 3-30. A rate limiter was added to simulate the fact that the wheel turning is not instantaneous, and a saturation block was added to model the maximum voltage that can be given as input to the valve.

Figure 3-30. Simulink model of the vehicle control system
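A minimal Python sketch of this controller structure is given below, combining the PID law with saturation and rate-limiter stages analogous to the Simulink blocks. The slew-rate limit is an illustrative placeholder, while the gains anticipate the tuned values reported later.

# Sketch of a discrete PID steering controller with the saturation and
# rate-limiter behavior of the Simulink model.
KP, KI, KD = 1.0, 0.09, 4.0
MAX_ANGLE = 45.0          # steering saturation, degrees (mechanical limit)
MAX_RATE = 20.0           # wheel slew limit, degrees per second (placeholder)

class PIDSteering:
    def __init__(self, dt):
        self.dt = dt
        self.integral = 0.0
        self.prev_error = 0.0
        self.prev_cmd = 0.0

    def update(self, lateral_error):
        """Lateral error from path center (m) -> wheel-angle command (deg)."""
        self.integral += lateral_error * self.dt
        derivative = (lateral_error - self.prev_error) / self.dt
        self.prev_error = lateral_error
        cmd = KP * lateral_error + KI * self.integral + KD * derivative
        # saturation block: bound the commanded angle
        cmd = max(-MAX_ANGLE, min(MAX_ANGLE, cmd))
        # rate-limiter block: bound the change per control step
        step = MAX_RATE * self.dt
        cmd = max(self.prev_cmd - step, min(self.prev_cmd + step, cmd))
        self.prev_cmd = cmd
        return cmd

pid = PIDSteering(dt=0.05)
print(pid.update(0.3))    # command for a 0.3 m lateral error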


The Ziegler-Nichols closed loop tuning method was adopted for initial tuning of the PID controller. In this tuning process, the system was given a small proportional gain, with the integral and derivative gains removed. The proportional gain was then increased until sustained oscillations were obtained, as shown in Figure 3-31.

Figure 3-31. Simulated sustained oscillation with a gain of 1.8 for the 10 mph model

The ultimate gain Gu and the period of oscillation Tu were recorded, and the required proportional gain Kp and derivative gain Kd were calculated using the Ziegler-Nichols formulas

Kp = 0.6 Gu (Eq. 3-3)
Kd = Tu/8 (Eq. 3-4)
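These formulas reduce to two lines of code; the sketch below reproduces the initial 10 mph tuning, where Gu = 1.8 is the simulated ultimate gain and Tu = 10 s is inferred from the resulting Kd = Tu/8 = 1.25.

# Ziegler-Nichols initial tuning from the sustained-oscillation test
# (Eqs. 3-3 and 3-4).
def ziegler_nichols(Gu, Tu):
    Kp = 0.6 * Gu       # Eq. 3-3
    Kd = Tu / 8.0       # Eq. 3-4
    return Kp, Kd

print(ziegler_nichols(1.8, 10.0))   # -> (1.08, 1.25), as in Figure 3-32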


Figure 3-32. Simulated step response with the initial tuning parameters (10 mph, Kp = 1.08, Kd = 1.25)

The parameters were further tuned, and an integral gain was added to remove the steady state error. The controller was tuned for zero steady state error and low overshoot. A large settling time, of the order of 20 s, was acceptable for the vehicle, because when the controller was tuned for a small settling time, the steering became very sensitive to very small errors and caused the wheels to chatter. The optimum PID gains were found to be:

Proportional gain, Kp = 1
Derivative gain, Kd = 4
Integral gain, Ki = 0.09

When the control system was implemented in the tractor, the same parameters determined in simulation were used to control the vehicle, and no significant further tuning was required.
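As a cross-check of these gains, the closed-loop transfer function of the PID controller acting on the 10 mph double-integrator plant can be stepped numerically, as sketched below. This omits the rate limiter and saturation, so it is only an idealized analogue of the Simulink result.

# Closed-loop step response of the final tuned PID gains on the 10 mph
# double-integrator model (rate limiter and saturation omitted).
import numpy as np
from scipy import signal

K = 0.13                       # plant gain, 10 mph model
Kp, Ki, Kd = 1.0, 0.09, 4.0    # final tuned PID gains

# With C(s) = Kp + Ki/s + Kd*s and G(s) = K/s^2, the closed loop is
# T(s) = K*(Kd*s^2 + Kp*s + Ki) / (s^3 + K*Kd*s^2 + K*Kp*s + K*Ki)
num = [K * Kd, K * Kp, K * Ki]
den = [1.0, K * Kd, K * Kp, K * Ki]
t, y = signal.step(signal.TransferFunction(num, den), T=np.linspace(0, 60, 1200))
print("final value: %.3f (zero steady-state error -> 1.0)" % y[-1])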


Figure 3-33. Simulated step response with the final tuned parameters (10 mph)

Table 3-3. Performance criteria of the controller (simulated results)
Model    Rise time (sec)   Settling time (sec)   Delay time (sec)   Overshoot (cm)   Steady state error
4 mph    15                10                    8                  1                0
7 mph    12                15                    6                  1                0
10 mph   12                20                    5                  2                0

Vision and Ladar Independent Steering

The primary purpose of this project was to guide the vehicle through the alleyways of a citrus grove. However, the performance of the vehicle guidance needed to be tested thoroughly before testing in the grove. Also, the groves not being in proximity to the laboratory poses the challenge of towing the tractor a long distance. So if a modification to the equipment or algorithm is required, a lot of time is wasted in towing th