Citation
Autonomous Vehicle Guidance for Citrus Grove Navigation

Material Information

Title:
Autonomous Vehicle Guidance for Citrus Grove Navigation
Creator:
Subramanian, Vijay
Place of Publication:
[Gainesville, Fla.]
Publisher:
University of Florida
Publication Date:
Language:
english
Physical Description:
1 online resource (314 p.)

Thesis/Dissertation Information

Degree:
Doctorate (Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Agricultural and Biological Engineering
Committee Chair:
Burks, Thomas F.
Committee Co-Chair:
Lee, Won Suk
Committee Members:
Arroyo, Amauri A.
Slatton, Kenneth C.
Dixon, Warren E.
Moskow, Shari
Graduation Date:
8/9/2008

Subjects

Subjects / Keywords:
Algorithms ( jstor )
Computer vision ( jstor )
Global positioning systems ( jstor )
Groves ( jstor )
Headlands ( jstor )
Navigation ( jstor )
Sensors ( jstor )
Speed ( jstor )
Steering ( jstor )
Tractors ( jstor )
Agricultural and Biological Engineering -- Dissertations, Academic -- UF
autonomous, control, guidance, image, laser, robot, unmanned, vehicle, vision
Genre:
Electronic Thesis or Dissertation
born-digital ( sobekcm )
Agricultural and Biological Engineering thesis, Ph.D.

Notes

Abstract:
Automation and robotics are becoming an important part of agricultural operations in the 21st century, and exploration is currently under way to use robotics in citrus groves. An important part of robotics in citrus groves is the use of autonomous vehicles. This dissertation presents the development of guidance systems for an autonomous vehicle operating in citrus groves. The guidance system development is based on two different vehicles: a commercial tractor and a golf cart type vehicle, the eGator. Initial developments were based on the tractor and the later developments on the eGator. The guidance system development process consisted of choosing sensors for implementing the guidance system; mounting the sensors and developing hardware and software interfaces between the sensors and the computer; developing algorithms for processing the information from the sensors to determine the path for the vehicle; developing a steering control system to steer the vehicle along the desired path; developing the software for interfacing the sensors and implementing the algorithms; and conducting experiments to validate the operation of the autonomous vehicle in citrus groves. The primary sensors used for determining the path for the vehicle were a video camera and laser radar. The important methods discussed in this dissertation are the determination of the vehicle dynamics of the tractor; the development of a PID based steering control system for steering the tractor; machine vision and laser radar based path segmentation algorithms for navigating the vehicle through grove alleyways; sensor fusion based path determination by fusing the information from machine vision and laser radar; the Pure Pursuit method of steering the eGator; algorithms for turning the vehicle at the headlands of the citrus grove using sweeping ladar; GPS based navigation of open fields between grove blocks; and laser radar based obstacle detection. The ability of the autonomous vehicle to navigate the paths was experimentally verified in custom designed test tracks and in citrus groves. The development of the autonomous vehicle guidance system showed machine vision and ladar based systems to be promising and accurate methods for navigating citrus groves. The autonomous vehicle possesses the ability to navigate the citrus grove alleyways in the middle of the path, turn around at the headlands and navigate subsequent alleyways, navigate open fields between grove blocks, and detect and stop in the presence of obstacles. ( en )
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis:
Thesis (Ph.D.)--University of Florida, 2008.
Local:
Adviser: Burks, Thomas F.
Local:
Co-adviser: Lee, Won Suk.
Electronic Access:
RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2010-08-31
Statement of Responsibility:
by Vijay Subramanian.

Record Information

Source Institution:
University of Florida
Holding Location:
University of Florida
Rights Management:
Copyright Subramanian, Vijay. Permission granted to the University of Florida to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Embargo Date:
8/31/2010
Classification:
LD1780 2008 ( lcc )

Downloads

This item has the following downloads:

subramanian_v.pdf


Full Text





Xcenter[n] = (Xright[n] + Xleft[n]) / 2 (Eq. 3-8)

This process is repeated for each row and all the center points are stored in an array. The

average of the center points is chosen as the current center of the path in the given field of view




Xcurrentcenter = ( Σi Xcenter[i] ) / n (Eq. 3-9)


The current position of the vehicle center is chosen as the midpoint of the image, at column

240, i.e., half of the image width (480). The error of the vehicle from the path center is then determined as



Error = Xcurrentcenter - 240 (Eq. 3-10)

This error in pixels is calibrated against real distances using prior calibration of the field of

view. This transformation is explained in a subsequent section. Once the error is determined, the

error is passed to the control system in the microcontroller using RS232 serial communication.
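
As a minimal sketch of Eqs. 3-8 to 3-10, the following C++ fragment computes the path center and the pixel error, assuming the per-row left and right path boundary columns have already been produced by the segmentation step; the function and variable names are illustrative, not the dissertation's own code.

#include <cstddef>
#include <vector>

// Center of the path in one image row (Eq. 3-8).
int rowCenter(int xLeft, int xRight)
{
    return (xLeft + xRight) / 2;
}

// Average of the row centers gives the current path center (Eq. 3-9); its
// offset from the assumed vehicle center at column 240 is the error in
// pixels (Eq. 3-10). Assumes the boundary vectors are non-empty.
int pathErrorPixels(const std::vector<int>& xLeft, const std::vector<int>& xRight)
{
    long sum = 0;
    for (std::size_t i = 0; i < xLeft.size(); ++i)
        sum += rowCenter(xLeft[i], xRight[i]);
    int currentCenter = static_cast<int>(sum / static_cast<long>(xLeft.size()));
    return currentCenter - 240;   // positive: path center lies to the right of column 240
}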

The flowchart of the entire vision algorithm is shown below.

Algorithm flowchart










OutputDebugString(" cor exit");

headlandflag=false;
pathend=false;
estr7h[0]=100;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);


else //regular

if(yaw<(headlandangle+160))

OutputDebugString("reg rit 180\n");
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=201;
Gator.Write((BYTE *)estr7h,1);

else

OutputDebugString("reg exit");
headlandflag=false;
pathend=false;
estr7h[0]=100;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);


} //rev over


estr6h[0]='m'; //end of transmission data
Gator.Write((BYTE *)estr6h, 1);







An algorithm was developed to determine the tree locations from the data. Figure 7-13 shows the data in 3-dimensional space. The search space for the algorithm, when analyzing data in 3 dimensions, is large. This results in the consumption of a large amount of computer processing time. For real-time precise headland detection, this is significant. The aim is to locate the trees with respect to the vehicle in the x-y plane. The location of the data in the vertical direction is not important. Without loss of useful information, the 3-dimensional data is projected onto the horizontal plane. The projected data is shown in Figure 7-14.









CHAPTER 6
SENSOR FUSION USING FUZZY LOGIC ENHANCED KALMAN FILTER FOR
AUTONOMOUS VEHICLE GUIDANCE

Introduction

There is a current need in the Florida citrus industry to automate citrus grove operations.

This need is due to reduction in the availability of labor, rising labor cost and potential

immigration challenges. Autonomous vehicles would be an important part of an automated citrus

grove. Autonomous vehicles can operate accurately for longer durations than a human driver.

In a scenario where an automated citrus harvester is in operation, the

operator can monitor both the harvester and the vehicle instead of being solely the driver. An

autonomous vehicle with the ability to navigate in the middle of the citrus alleyways with an

accuracy of 15cm deviation from the center would be beneficial. Numerous autonomous vehicles

for agricultural applications have been described in the literature. The guidance system

developed by Noguchi et al. (2002) was for operation in paddy fields for transplanting rice

whereas the guidance system developed by Nagasaka et al. (2004) was tested in soybean fields.

The development of an autonomous vehicle guidance system for citrus grove navigation using

machine vision and laser radar (ladar) based guidance systems is discussed in Subramanian et al.

(2006). Both machine vision and ladar based guidance performed with a maximum error of

6.1cm in test paths. The machine vision based guidance system had also guided a tractor through

an alleyway of a citrus grove, keeping the vehicle visually in the center of the path with no

collisions with the trees. It was found that ladar based guidance was more accurate than vision

based guidance on well defined paths at speeds up to 3.1 m/s, whereas vision based guidance

was able to keep the vehicle in the middle of the path on several types of paths, although less accurately.

The uneven tree canopy in citrus grove alleyways poses problems for guidance when using vision

and ladar separately. With the vision based guidance configuration used in this research, for









Headland Turning Maneuvers

As mentioned before, the amount of time spent and the area required by the vehicle to turn

at the headlands are important. A headland turning maneuver can be performed as a series of the

following steps: 1) Determining the approach of the end of current row while navigating the

alleyway. 2) Precisely establishing the end of the row. 3) Turning into the headland. 4) Turning

into the next row. 5) Getting to a position to navigate the next row.
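
The five steps can be viewed as a simple state sequence. The sketch below is only an illustrative enumeration of those states, with assumed names; it is not the dissertation's implementation.

// Illustrative states for the headland turning maneuver described above.
enum class HeadlandState {
    ApproachingRowEnd,    // step 1: detect approach of the row end while in the alleyway
    ConfirmingRowEnd,     // step 2: precisely establish the end of the row
    TurningIntoHeadland,  // step 3: turn into the headland
    TurningIntoNextRow,   // step 4: turn into the next row
    EnteringNextRow       // step 5: get into position to navigate the next row
};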

Determining the approach of the headland

The first step of determining the approach of the headland while navigating the alleyway is

performed using machine vision. Consider Figure 7-5, which shows the image obtained from

the camera when the vehicle was in the middle of the alleyway (a) and the images obtained from

the camera when the vehicle was close to the end of the alleyway (b) and (c).






















Figure 7-5. Images obtained from the camera when the vehicle is navigating the alleyway










forward and backward location of the vehicle before and after reversing respectively. The

vehicle's average speed was 1.8 m/s while executing the turns. This speed was chosen as a

reasonable speed to safely navigate the headland and is visually similar to the speed at which a

human would drive the vehicle. Experiments were conducted for both left and right turns for

each maneuver. Therefore, four maneuvers were performed, namely the left u-turn, right u-turn,

left switch back turn and right switch back turn. Three repetitions were performed for each of the

runs. After the execution of each test run, the distance of the markers from the trunk of the tree,

about which the turn was executed, was measured. This measurement was taken manually using

a tape measure. From this measurement and the known vehicle width, the distance of the vehicle's

inner wheel, in the headland, from the same tree stump was calculated.

Results and Discussion

For the path navigation experiments conducted in the test track and the grove alleyway, the

average error, the rms error, standard deviation of the error and the maximum error were

calculated and are presented in the Table 7-2.

Table 7-2. Performance measures of the eGator based guidance system for path navigation

Path              Avg. Error (cm)   Std. Dev (cm)   Max Error (cm)   RMS Error (cm)
Test Track        1.8               1.2             3.5              2
Grove Alleyway    5                 3.5             9                8


Detailed discussion of the fusion based guidance system was presented in Chapter 3. As

the path navigation experiments presented here were a confirmation of the adaptation of the

guidance system to the new vehicle, only the results are presented. It is observed that the

performance of the guidance system using eGator in navigating the paths is similar to the

performance when using the tractor.










Py = R Sin( ) Cos(m) (Eq. 3-42)

Pz = R Sin( ) Sin(m) (Eq. 3-43)

Figure 3-76. Coordinate systems

(Px, Py, Pz) is the data point in the Cartesian coordinate system. The data obtained by the

scan is then filtered to keep the relevant data. It is known that the lateral distance of trees in the

row, from the ladar can range from a distance of 0.5m to a maximum of 5m. The ladar to the

vehicle's end is 0.5m. So, a tree limb is not expected to be closer than 0.5m during normal

operation. Distances longer than 5 m correspond to trees located in adjacent rows. Therefore, the

filter was implemented such that any data beyond 5m and data less than 0.5m were removed.

After filtering, a sample plot of the data in Cartesian coordinates is shown in Figure 3-77.
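
A minimal sketch of this conversion and filter is shown below. The scan angle and sweep (tilt) angle names are assumptions, since the exact symbols in Eqs. 3-42 and 3-43 are not fully legible here, and whether the filter uses the full lateral distance or only the cross-row component is also an assumption.

#include <cmath>
#include <vector>

struct Point3D { double x, y, z; };

// r: ladar range reading (m); scanAngle: angle within the ladar's scan plane (rad);
// sweepAngle: tilt of the scan plane about the horizontal (rad).
Point3D toCartesian(double r, double scanAngle, double sweepAngle)
{
    Point3D p;
    p.x = r * std::cos(scanAngle);
    p.y = r * std::sin(scanAngle) * std::cos(sweepAngle);
    p.z = r * std::sin(scanAngle) * std::sin(sweepAngle);
    return p;
}

// Keep only points whose lateral distance from the ladar lies between 0.5 m and 5 m,
// matching the filter described in the text.
std::vector<Point3D> filterRange(const std::vector<Point3D>& pts)
{
    std::vector<Point3D> kept;
    for (const Point3D& p : pts) {
        double lateral = std::sqrt(p.x * p.x + p.y * p.y);
        if (lateral >= 0.5 && lateral <= 5.0)
            kept.push_back(p);
    }
    return kept;
}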









researches. In practice, many problems exist in the real environment in which light conditions

change often and contrasts shift, as is often the case in agriculture.

Laser Radar (Ladar) for Autonomous Navigation

The operating principle of non-contact laser radar is based on time-of-flight measurement.

The ladar calculates the distance to the object using the time of flight of pulsed light, i.e. the

length of time between sending and receiving the beam of light. An extremely short pulse of

light (infrared laser beam) is transmitted towards an object. Part of the light is then reflected back

to the unit, a fraction of a second later. A rotating mirror deflects the pulsed light beam to many

points in a semi-circle. The precise direction is given by an angular sensor on the mirror in the

ladar. A large number of coordinates measured in this manner are put together to form a model

of the surrounding area's contours. A planar scanning ladar gives information of one plane. This

can be overcome by rotating the laser source to get a 3 dimensional view. Laser range sensors are

very accurate in measuring distance, with accuracy as high as 1 cm over a distance of 80 m. This

makes them suitable for guidance applications. Laser radar has been used for ranging and

obstacle avoidance. It has higher resolution than ultrasonic sensors, so it is more accurate and

requires less computation than vision. However its performance degrades with dust and rain, like

vision, and it is more expensive than an ultrasonic sensor.
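
As a one-line illustration of the time-of-flight principle described above (purely illustrative, since the commercial ladar reports ranges directly):

// Range is half the round-trip time multiplied by the speed of light.
double timeOfFlightRange(double roundTripSeconds)
{
    const double c = 3.0e8;             // speed of light, m/s
    return c * roundTripSeconds / 2.0;  // divide by 2: out and back
}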

Carmer and Peterson (1996) have discussed the use of laser radar (ladar) for various

applications in robotics. Autonomous vehicle navigation was discussed as a promising

application for ladar, due to its ability to accurately measure position. Gordon and Holmes

(1998) developed a custom-built ladar system to continuously monitor a moving vehicle's

position. The system's performance was not very reliable due to underdeveloped technology at

that time. Current technologies have advanced to the point that ladar systems are readily

available, although they are relatively expensive compared to other alternatives. Ahamed et al.










3. Turn the steering wheel to one end such that the vehicle turns in the required direction.

4. Monitor the vehicle's heading and monitor for obstacles in front of the vehicle using the
ladar.

5. When the vehicle has turned by an angle of 90 degrees with respect to the initial heading
angle before turning, stop the forward motion of the vehicle.

6. Turn the steering to the center position and reverse the vehicle a distance of 0.5m.

7. Turn the steering to the end such that the vehicle turns in the required direction.

8. When the vehicle has turned by more than 150 degrees with respect to the initial heading
angle before turning, switch the algorithm to alleyway navigation mode using vision and
ladar.

9. Any lateral error in entering the new row is corrected using the alleyway navigation
algorithm.

It is to be noted that the turning maneuver algorithms are tuned for a grove with certain

alleyway width dimensions. They are operable in groves whose dimensions differ by

approximately 1 m, but the algorithms are not suitable for groves that differ more. It is difficult to

navigate headlands using vision and ladar, as headlands can consist of random objects and the

vision and ladar algorithms could not be modified for all possibilities. During tests in a grove in Immokalee,

Florida, it was found that the grove includes a large drainage pit on the ground at the end of each

row. In such a case, the turning maneuvers failed as the vehicle would end up in the pit. Therefore, more

complex ladar based algorithms may be necessary to adapt the vehicle to such possibilities.

Open Field Navigation and Obstacle Detection

In the previous sections, methods were described that allowed the autonomous vehicle to

accurately navigate the alleyways of a citrus grove, detect headlands and turn at the headlands

using u-turn and switch-back turn maneuvers and navigate subsequent alleyways. With these

capabilities, the autonomous vehicle possesses the ability to navigate an entire grove

autonomously. In large citrus producing farms, the groves can be arranged in the form of blocks

separated by open fields. With the previously described capabilities, after the vehicle completes









CHAPTER 7
HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A
CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR

Introduction

There is a need to automate grove operations in the Florida citrus industry and autonomous

vehicles form an integral part of this automation process. An autonomous vehicle can be used for

applications such as scouting, mowing, spraying, disease detection and harvesting. The

development of autonomous vehicle guidance systems using machine vision and laser radar for

navigating citrus grove alleyways was discussed in Subramanian et al. (2006). This paper

discusses the headland navigation algorithms.

Complete navigation of a citrus grove requires the vehicle to turn at headlands in addition

to navigating the alleyways. The headland in a citrus grove is the space available at the end of each

tree row. This space is generally used for turning the vehicles navigating the rows. Headlands are

characterized by a ground space of at least 4.25m after the last tree in each row. The ground is

usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of

the grove or there could be more open ground or some combination of both. An autonomous

vehicle operating in citrus groves needs to navigate the headlands to turn and navigate other

rows. Miller et al. (2004) have reported that the space required to turn at the headland and

the time spent in turning significantly affect the field efficiency. Hague et al. used vision based

guidance in combination with dead reckoning to guide a vehicle through crop rows. In this paper,

turning maneuvers to navigate the headlands of a citrus grove are discussed.

Materials and Methods

The vehicle used for developing the headland turning algorithms was the John Deere

eGator. The sensors used for developing the headland navigation algorithms are the camera, the

ladar, the IMU and the wheel encoder. The details of these sensors and their mounting location









Future Work

The development of the autonomous vehicle guidance for navigating citrus groves is

presented in this dissertation. Although all the objectives of this research were completed,

significant future work is required before the guidance system can be reliably used and

commercialized. This research on vehicle guidance encompassed a vast area and includes

numerous sub components for the operation of the overall guidance system. Each component of

the system invites significant further research. The choice of the vehicle in this research was

guided by the availability of the vehicle. Investigation is required to determine the suitability of a

vehicle for the application. The choice of control systems namely PID and Pure Pursuit were

based on previous research. Other control systems for navigating groves could be explored.

The machine vision algorithm presented in this work utilizes color based segmentation of

trees and path. The development of the algorithm was limited by the computer processing time

available when running multiple algorithms on a PC. Other machine vision methods, such as

those based on texture and object shape, would make the segmentation more reliable. This choice is largely

guided by the processing power available and optimal programming techniques, which were not

the focus of this work.

Improved software utilization such as parallel processing could be investigated for this

purpose. Use of multiple processors is a suitable option for long term improvement of the

guidance system. The Windows operating system used in this work runs numerous non-essential

operations in the background. This reduces the processing time available for real time operation

of the guidance system. A leaner real time operating system should be explored.

Obstacle avoidance ability was not pursued in this research. Within the grove alleyways,

the amount of space available for the vehicle to navigate is small. It is to be determined whether

obstacle avoidance is feasible within the alleyway. For navigating open fields, obstacle detection











Figure 8-1. Illustration of vehicle navigating way points

The vehicle is heading in a direction towards a random point C with a heading angle m

with respect to the geographic north. The straight line distance required to reach B is 'd' and a

turn through an angle 'theta+m' is required to reach B. The Haversine formula gives d and theta



d = R p (Eq. 8-1)

where R = radius of curvature of the earth = 6378140 m

p = 2 atan2( sqrt(a), sqrt(1-a) ) (Eq. 8-2)

a = Sin^2(dlat/2) + Cos(latA) Cos(latB) Sin^2(dlon/2) (Eq. 8-3)

dlat = latB - latA (Eq. 8-4)

dlon = lonB - lonA (Eq. 8-5)

theta = mod[ atan2( Sin(dlon) Cos(latB), Cos(latA) Sin(latB) - Sin(latA) Cos(latB) Cos(dlon) ), 2 pi ] (Eq. 8-6)
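
A compact C++ sketch of Eqs. 8-1 to 8-6, mirroring the appendix implementation; the function name and the radians-in, radians-out convention are assumptions made here for illustration.

#include <cmath>

const double PI = 3.14159265358979;
const double R_EARTH = 6378140.0;   // earth radius used in the text (m)

// latA/lonA: current position; latB/lonB: target way point (all in radians).
// Outputs d (meters) and theta (radians), following Eqs. 8-1 to 8-6.
void haversineDistanceBearing(double latA, double lonA, double latB, double lonB,
                              double& d, double& theta)
{
    double dlat = latB - latA;                                     // Eq. 8-4
    double dlon = lonB - lonA;                                     // Eq. 8-5
    double a = std::sin(dlat / 2) * std::sin(dlat / 2)
             + std::cos(latA) * std::cos(latB)
             * std::sin(dlon / 2) * std::sin(dlon / 2);            // Eq. 8-3
    double p = 2 * std::atan2(std::sqrt(a), std::sqrt(1 - a));     // Eq. 8-2
    d = R_EARTH * p;                                               // Eq. 8-1
    theta = std::fmod(std::atan2(std::sin(dlon) * std::cos(latB),
                  std::cos(latA) * std::sin(latB)
                - std::sin(latA) * std::cos(latB) * std::cos(dlon)),
                  2 * PI);                                         // Eq. 8-6
}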

theta is positive for directions east of the geographic north and increases anti-clockwise

from 0 to 2 pi in radians. The direction 'theta' is the heading from location A to location B. To

steer the vehicle, it is also necessary to determine the current heading of the vehicle with respect










linguistic variables was performed at a scale from 0 to 1. A set of if-then rules were developed

by experimentation. Each rule received a corresponding normalized membership value from the

input membership function and thereby the linguistic variable. Based on the linguistic variables

chosen from each z', the corresponding rules are fired. The rule set is shown in the Table 3-5.

Table 3-5. Fuzzy logic rule set for divergence correction

z'x         z'θ         qx          qθ
Negative    Negative    Zero        Zero
Negative    Positive    Zero        Positive
Positive    Negative    Positive    Zero
Positive    Positive    Positive    Positive



For example, consider the second row in Table 3-5, where z'x is negative and z'θ is

positive. This indicates that the error in position has reduced compared to the previous time step

but the vehicle is heading in the wrong direction; then the process noise for position, qx, is zero

and the process noise for heading, qθ, is positive. In the rule set, a single measurement value for

position, z'x, is given instead of the two different values from vision and ladar. This is because the

sensor supervisor described earlier picks one of the sensors for use in the filter when the other is

unreasonable. In the cases where one sensor is given higher priority, the position measurement of

that sensor alone is used in the rule set, and in the cases where both sensors are selected by the

supervisor, the position measurement of the ladar alone is used, as it has lower measurement noise.

This reduction was performed to reduce the complexity of the rule set.
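
Table 3-5 can be rendered as the kind of if-then statements the implementation chapter mentions. The sketch below uses assumed names and a simple sign test on the innovations; it is illustrative only, not the dissertation's rule engine.

// zx, ztheta: signed position and heading innovations; "positive" is the
// normalized process-noise increment applied when a rule calls for Positive.
struct NoiseUpdate { double qx; double qtheta; };

NoiseUpdate divergenceRules(double zx, double ztheta, double positive)
{
    NoiseUpdate q = {0.0, 0.0};
    if      (zx < 0 && ztheta < 0)  { q.qx = 0.0;      q.qtheta = 0.0; }      // row 1
    else if (zx < 0 && ztheta >= 0) { q.qx = 0.0;      q.qtheta = positive; } // row 2
    else if (zx >= 0 && ztheta < 0) { q.qx = positive; q.qtheta = 0.0; }      // row 3
    else                            { q.qx = positive; q.qtheta = positive; } // row 4
    return q;
}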










Figure 3-47. Images and their green intensity profiles across the center of the image.





In Figure 3-47, the plots show the green channel intensity profiles of non-shadow and

shadow images, respectively. In non-shadow images, there is low intensity on the left side

corresponding to the trees on the left, high intensities in the middle corresponding to the path and


again low intensities in the right corresponding to the trees. For the shadow image, there is high

intensity on the left corresponding to one side of the trees and half the path, and low intensities

on the right corresponding to half the path and the right-side trees. It should also be noted that in

the shadow images, the low intensities are mostly below 50 and in non-shadow images, the low

intensities are mostly below 100.

In this method, it is initially assumed that there are many shadows in the image. A

threshold value corresponding to this assumption, on the order of 50, is used and the image is

thresholded.
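
A minimal sketch of this initial thresholding step, assuming an interleaved RGB byte buffer and using the green channel with the threshold of about 50 mentioned above; the buffer layout and function name are assumptions made here for illustration.

#include <cstdint>
#include <vector>

// Returns a binary mask: 1 where the green value exceeds the threshold
// (bright, candidate path region), 0 otherwise (dark trees or shadow).
std::vector<uint8_t> thresholdGreen(const std::vector<uint8_t>& rgb,
                                    int width, int height, int threshold = 50)
{
    std::vector<uint8_t> mask(width * height, 0);
    for (int i = 0; i < width * height; ++i) {
        uint8_t green = rgb[3 * i + 1];   // R,G,B interleaved
        mask[i] = (green > threshold) ? 1 : 0;
    }
    return mask;
}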










computer. Therefore, the hardware switch should be turned to autonomous mode only when both

the high level and low level software are operational.

Pure Pursuit Steering

The algorithm for the Pure Pursuit steering was presented in Chapter 3. The code segment

for the software implementation is presented in Appendix E. The algorithm is operational as

soon as the low level software is started. The input to the algorithm is the lateral error in pixel

coordinates from the high level controller. The output of the algorithm is the required steering

angle. The software implementation is as follows:

1. The lateral error of the vehicle is obtained in pixel coordinates.

2. The error is then determined to correspond to either a left turn or a right turn.

3. The error in pixel coordinates is converted to distance coordinates. The look-ahead distance
is chosen as 6m. At 6m, a calibration was made that 180 pixels in the image corresponds to
2.43m of lateral distance. This calibration is used to convert error in pixels to meters.

4. The error is used to compute the required radius of curvature of the vehicle using the
equation 3-35.

5. The steering angle is computed from the radius of curvature using equation 3-38 and 3-39.

6. The steering angle is sent to the CAN controller using CAN commands.
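
The steps above can be sketched with the standard pure pursuit geometry. Since Eqs. 3-35, 3-38 and 3-39 are not reproduced on this page, the formulas below are the textbook forms and the wheelbase value is a placeholder; this is an illustration, not the dissertation's implementation.

#include <cmath>

const double LOOKAHEAD_M = 6.0;           // look-ahead distance (step 3)
const double PX_TO_M     = 2.43 / 180.0;  // 180 pixels correspond to 2.43 m at 6 m (step 3)
const double WHEELBASE_M = 2.0;           // placeholder vehicle wheelbase

// lateralErrorPx: signed lateral error from the high level controller (pixels).
// Returns the required steering angle in radians, with the sign of the error.
double purePursuitSteering(double lateralErrorPx)
{
    double x = lateralErrorPx * PX_TO_M;                       // pixels -> meters (step 3)
    if (std::fabs(x) < 1e-6) return 0.0;                       // already on the path
    double radius = (LOOKAHEAD_M * LOOKAHEAD_M) / (2.0 * x);   // pure pursuit arc radius (step 4)
    return std::atan(WHEELBASE_M / radius);                    // bicycle-model steering angle (step 5)
}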

Headland Turning

The development of headland turning algorithms is discussed in Chapter 3. The algorithms

are implemented in the PC in the Visual C++ environment. The important code segments are

presented in Appendix F. This software is implemented as functions called from the vision based

alleyway navigation function, when the headland approach is determined using vision. When

approaching the headland, first the sweeping ladar algorithm is called to determine the exact

location of the headland. The sweeping ladar algorithm is implemented in the function

treedladarprocess(). The steps implemented in the function are as follows:









CHAPTER 2
LITERATURE REVIEW

Agricultural Autonomous Navigation Applications

Vehicles have been used in agriculture for many decades. They have been used for

applications such as plowing, planting, harvesting and spraying. These operations require the

operator to spend long and tedious hours in the field. Modern technology has facilitated the

automation of several agricultural operations thereby reducing operator fatigue and improving

productivity. An important part of this automation process includes the use of autonomous

vehicles.

Research has been conducted in developing autonomous vehicles for agriculture for many

years. Some of these research efforts are mentioned below. Many of them were conducted

in research laboratories and have not yet fully evolved into commercial products. The reasons are

numerous, ranging from economics to insufficient computer processing power. However, the

advancements in computer processing power and world economics pertaining to agriculture in

recent years have given new hope to bringing autonomous vehicles from research laboratories to

commercial agricultural operations. Most of the research into developing autonomous vehicles in

agriculture is concentrated on vehicles for crop operations. Research on the development of

autonomous vehicles for orchards and citrus groves is scarce.

Guidance Vehicle

The autonomous vehicle can either be in the form of a guidance system added onto a

commercial agricultural vehicle or a vehicle developed exclusively to aid in autonomous or

robotic operations. Blackmore et al. (2002) have listed the following requirements of an

autonomous vehicle for agricultural applications

* Small in size
* Light weight










Subramanian, V., T. F. Burks, and S. Singh. 2005. Autonomous greenhouse sprayer vehicle
using machine vision and laser radar for steering control. Applied Engineering in
Agriculture 21(5): 935-943.

Subramanian, V., T. F. Burks, and A. A. Arroyo. 2006. Development of machine vision and laser
radar based autonomous vehicle guidance systems for citrus grove navigation. Computers
and Electronics in Agriculture 53(2): 130-143.

Takahashi, T., S. Zhang, and H. Fukuchi. 2002. Acquisition of 3-D information by binocular
stereo vision for vehicle navigation through an orchard. In: Proc. Automation Technology
for Off-Road Equipment, Chicago, Illinois. Paper 701P0509, pp. 337-346.

Tsubota, R., N. Noguchi, and A. Mizushima. 2004. Automatic guidance with a laser scanner for
a robot tractor in an orchard. In: Proc. Automation Technology for Off-Road Equipment,
Kyoto, Japan. Paper 701P1004.

Wu, H., M. Siegel, R. Stiefelhagen, and J. Yang. 2002. Sensor fusion using Dempster-Shafer
theory. In: Proc. IEEE Instrumentation and Measurement Conference, 1: 7-12.

Yekutieli, O., and F. G. Pegna. 2002. Automatic guidance of a tractor in a vineyard. In: Proc.
Automation Technology for Off-Road Equipment, Chicago, Illinois. Paper 701P0502, pp.
252-260.

Yokota, M., A. Mizushima, K. Ishii, and N. Noguchi. 2004. 3-D map generation by a robot
tractor equipped with a laser range finder. In: Proc. Automation Technology for Off-Road
Equipment, Kyoto, Japan. Paper 701P1004.

Zarchan, P., and H. Musoff (Eds.). 2005. Fundamentals of Kalman filtering: A practical
approach. Progress in Astronautics and Aeronautics, Vol. 208. Virginia, USA: AIAA.

Zhang, Q., J. F. Reid, and N. Noguchi. 2000. Automatic guidance control for agricultural tractor
using redundant sensors. SAE Transactions, Journal of Commercial Vehicles 108: 27-31.

Zhang, Q., S. Cetinkunt, T. Hwang, Pinsopon, M. A. Cobo, and R. G. Ingram. 2001. Use of
adaptive control algorithms for automatic calibration of electrohydraulic actuator control.
Applied Engineering in Agriculture 17(3).

Zhang, Q., and H. Qiu. 2004. Dynamic path search algorithm for tractor automatic navigation.
Transactions of the ASAE 47(2): 639-646.









or negative values only indicate whether the decision is vision or ladar. Only the absolute values

are used to update R. m is the area under the dashed line and dotted line for each. Therefore,

Ucrisp = (1)(1 .375)+(0)(0.45) / (0.45+0.975) = 0.75

This crisp value is the new value of covariance for vision in determining the path error.

Although the crisp value is positive, since the decision was between vision and both, the crisp

value is used to update vision only. The range of crisp values for the decision was chosen by

performing simulations and determining what values provide good results. Using the Fuzzy logic

supervisor described above, the suitable sensor at each location is chosen and the updates are

made in R.

Divergence detection and fuzzy logic correction

The use of a Kalman filter with fixed parameters has a drawback. Divergence of the

estimates is a problem which is sometimes encountered with a Kalman filter, wherein the filter

continually tries to fit a wrong process. Divergence occurs due to the inaccuracies in the

covariance values specified in the model. Divergence may be attributed to system modeling

errors, noise variances, ignored bias and computational round off errors (Fitzgerald, 1971).

Often, a white noise model is assumed for the process, while implementing the Kalman filter.

White noise models are not true models of the process for autonomous guidance applications. It

is often used to simplify the implementation. If a white noise model is assumed for the process,

divergence, if encountered, is difficult to correct. Mathematical methods such as altering the

rank of the covariance matrix can be used to reduce divergence. Abdelnour et al. (1993) used

fuzzy logic in detecting and correcting the divergence. Sasiadek and Wang (1999) used fuzzy

logic with an extended Kalman filter to tackle the problem of divergence for an autonomous

ground vehicle. They presented simulation results of their approach. The extended Kalman filter









Microcontroller

To measure the rate of steering turn of the vehicle, a high sampling rate of the encoder was

anticipated. Initially, the encoder was interfaced using a National Instruments data acquisition

board connected to the computer. When initial laboratory tests were conducted to measure the

output from the decoder, the encoder shaft was manually rotated and the pulses were read. It was

found that high sampling rates could not be achieved using this interface setup. It was also

anticipated that the operating speed of the Windows operating system may not be fast enough for

high speed control. Therefore, a Tern microcontroller was used for high speed, low level control operations. The

microcontroller used in this research was the 586 Engine controller board with a P50 expansion

board from TERN Inc. It is a C++ programmable controller board based on a 32-bit

100/133MHz AMD Elan SC520 microprocessor. Its important features are as follows: 32 PIOs,

7 Timers, 19 channel 12 bit ADC, 8 channel 12 bit DAC, 2 quadrature decoders, 8 opto-couplers,

14 high voltage IO, Serial port and Ethernet. The microcontroller was used for processing

encoder data and for controlling the servo valve for steering, by sending analog voltages. The

microcontroller was mounted in the tractor cabin below the PC in a closed box.
















Figure 3-16. Microcontroller










values from vision, ladar, yaw from the IMU and the speed from the speed sensor. The output of

the function is the fused error in pixel coordinates. The variables are intuitively named to be

similar to the variable names used in the algorithm development. The function is called from the

function OnPaint() when both vision and ladar or at least one of them is operational. The fuzzy

logic supervisor and divergence detector are implemented as a set of if-then statements identical to the

explanation in Chapter 3. The function is implemented as follows:

1. The variables are initialized. The measurement values for each time step are taken as input to
the measurement vector. The matrices are implemented as two dimensional vectors. The
vectors are implemented as arrays. If this function is called for the first time, the yaw angle is
taken as the initial heading angle of the vehicle. The process and measurement noise
covariance matrices are initialized.

2. The vision and ladar data are checked for the range. If the range is unreasonable, the
measurement noise covariance value is increased. This is performed using fuzzy logic if then
loops similar to the description in the Chapter 3.

3. The time update equations 3-17 and 3-18 are first executed to compute the state estimate and
the state estimate error covariance matrix.

4. The measurement update equation 3-20 is executed to compute the gain. The equations 3-21
and 3-22 are then executed to compute the posterior estimate of the state and the posterior
estimate of the state estimate error covariance matrix.

5. The fused error is obtained from the posterior estimate of the state. This error in pixel
coordinates is sent to the low level controller for steering.

6. The innovation vector is computed using equation 3-31. This is used as the input for
detecting divergence.

7. The process noise covariance matrix is updated if divergence is detected.

A large amount of memory is used for the matrices in this implementation. Therefore, the memory

is allocated before starting the fusion routine and released before exiting the program.
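
A deliberately simplified, scalar rendering of the predict/update structure described above is sketched below. The actual implementation uses a state vector and matrices (Eqs. 3-17 to 3-22), so this only illustrates the flow of the fusion step and is not the dissertation's code.

// One state: fused lateral error in pixel coordinates.
struct ScalarKF {
    double x = 0.0;   // state estimate (fused lateral error, pixels)
    double P = 1.0;   // estimate error covariance
    double Q = 0.1;   // process noise (adjusted when divergence is detected)

    // Time update: random-walk model, so the prediction keeps the previous estimate.
    void predict() { P += Q; }

    // Measurement update for one sensor with measurement z and noise covariance R.
    void update(double z, double R)
    {
        double K = P / (P + R);   // Kalman gain
        x += K * (z - x);         // posterior state estimate
        P *= (1.0 - K);           // posterior covariance
    }
};

// Usage per time step, with R values set by the fuzzy supervisor:
// kf.predict(); kf.update(visionError, Rvision); kf.update(ladarError, Rladar);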

Software Architecture for eGator Guidance

The hardware architecture of the guidance system for the eGator was presented in Chapter

3. The high level algorithms are implemented in a PC running Windows XP operating system.










i=i+ 2;
OutputDebug String("next coord\n");
}//go to next coordinate

if(i>2*n)//finished all

sprintf((char*) gpstext, "%d,%d\n",i,n);
OutputDebugString(gpstext);
OutputDebugString("GPS navigation completed\n");
estr4[0]='k'; // start of transmission data
Gator.Write((BYTE *)estr4,1);

//Center the steering before quitting
estr5[0]=100; //hi
Gator.Write((BYTE *)estr5,1);
estr5[0]=0; //lo
Gator.Write((BYTE *)estr5,1);

//end of transmission data
estr4[0]='m';
Gator.Write((BYTE *)estr4,1);

//quit navigation
estr4[0]='q';
Gator.Write((BYTE *)estr4,1);

else

dlat=numbers[i-1]-lati;
dlon=numbers[i]-longi;
a=sin((dlat/2)*pi/180)*sin((dlat/2)*pi/180)+cos(lati*pi/180)*cos(
    numbers[i-1]*pi/180)*sin((dlon/2)*pi/180)*sin((dlon/2)*pi/180);
c=2*atan2(sqrt(a), sqrt(1-a));

GPSdistance=6378140*c;
GPSbearing=fmod(atan2(sin(dlon*pi/180)*cos(numbers[i-
    1]*pi/180),cos(lati*pi/180)*sin(numbers[i-1]*pi/180)-
    sin(lati*pi/180)*cos(numbers[i-1]*pi/180)*cos(dlon*pi/180)), 2*pi);

// Shift bearing from (0 to pi & -pi to 0) to 0 to 2pi
GPSbearing=pi-GPSbearing;
// this shift helps in matching the bearings range to yaw range values from IMU

GPSbearingD=GPSbearing*180/pi; //convert gps bearing to degrees
//start steering










similar at around 2m. This is because the minimum distance occurred at the location when the

vehicle just exits the alleyway and begins to execute the headland maneuvers. For both the

maneuvers, the vehicle exited the alleyway at approximately 2m distance from the last tree

stump. The average time for executing the u-turn is 14 s and for the switch-back turn is 17 s at an

average speed of 1.8m/s. For practical comparisons, the times are similar for both the maneuvers.

The time taken to cover a longer distance by the wide u-turn is similar to the time taken in

reversing by the switch-back turn. The slightly larger time for the switch-back turn may be

attributed to the time taken by the vehicle in stopping and then reversing from rest. Visually, the

headland turning maneuvers were similar to human driving.

Conclusions

Headland turning maneuver algorithms were developed for an autonomous vehicle

navigating a citrus grove. Algorithms using machine vision and sweeping ladar were developed

to precisely detect the headland while the vehicle navigates the alleyways. When the headlands

were detected and reached by the autonomous vehicle, u-turn and switch-back turning maneuvers

were executed to navigate the grove headlands. The

performance of the headland detection algorithm and the execution of the turning maneuvers

were tested in a citrus grove. Experiments were conducted to test the performance of the turning

maneuvers. The average distance from the last alleyway tree, while executing the u-turn was

3.8 m and while executing the switch-back turn was 2.7 m. The switch-back turning maneuver was

able to execute the turn within a smaller distance than the u-turn. However, the average

time required to execute the u-turn was 14s and for executing the switch-back turn was 17s. With

the addition of the headland detection algorithms and the turning maneuvers to an autonomous

vehicle already capable of navigating the alleyways, the autonomous vehicle now possesses the

ability to navigate an entire grove.































To Mankind










include RS-232 serial communications, 12 Volt power, digital to analog converter, and

programmable look-up-tables for manipulation of the RGB image.

















Figure 3-10. Frame grabber board

A custom cable was attached to the video camera with the connector to interface with the

frame grabber. The software was written to communicate with the frame grabber and obtain

images to be processed to determine the path for the vehicle navigation.

Guidance Sensor: Laser Radar

The laser radar (ladar) system used in this study consisted of a Sick LMS-200 ladar sensor

(Sick AG, Waldkirch, Germany). The LMS-200 is a 180 degree one-dimensional sweeping laser

which can measure at 1.0/0.5/0.25 degree increments with a 10 mm range resolution and a

maximum range of up to 80 m. The LMS-200 can be operated at 38.4 KBaud using the RS232

protocol, giving a sweep refresh rate of approximately 7 Hz, or at 500 KBaud using the RS422

protocol, giving a sweep refresh rate of approximately 35 Hz. For operation at 500 KBaud, a

special high speed serial communication card was used. The ladar also requires a 24 V external

power supply to power the unit.










estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=201;
Gator.Write((BYTE *)estr7h,1);


else if(yaw<(headlandangle+70) && flag1==false) //regular

if(delayflag1==false)
{ Sleep(3000); delayflag1=true; }
OutputDebugString("reg rit 90\n");
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=201;
Gator.Write((BYTE *)estr7h,1);

else
{ flag1=true; } //90deg turn complete

//left turn-----------------------------------------
//full turn left until headlandangle-180
if(flag1==true) //extremes

if(flag2==false) //reverse

OutputDebugString("rev");
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=203;
Gator.Write((BYTE *)estr7h,1);
{ flag2=true; }

else //reverse over now turn

if(correction2!=0) //extreme

if(yaw<(headlandangle+160)) //160+200=360

OutputDebugString(" cor rit 180\n");
estr7h[0]=0;
Gator.Write((BYTE *)estr7h,1);
estr7h[0]=201;
Gator.Write((BYTE *)estr7h,1);

else //completed so exit to alleyway mode










3.1m/s and an average error of 1.5cm at 1.8m/s were observed in the tests. The maximum error

was not more than 4cm at both speeds. Tests were then conducted at the same speeds in a citrus

grove alleyway. The average error was less than 10cm for both speeds. The guidance system's

ability to navigate was verified in citrus grove alleyways and was found to perform well with the

vehicle remaining close to the center of the alleyway at all times. The developed fusion based

guidance system was found to be more reliable than individual sensor based guidance systems

for navigating citrus grove alleyways. The present system is able to reliably traverse through an

alleyway of a citrus grove. Additional work is necessary before the system can be ready for

production. Some of the future work would include adding the ability to navigate the headlands

and the ability to optimally navigate the entire grove.









if(fitter[left][i]==0 && fitter[left][i-1]!=0)
{ fitter[left][i]=fitter[left][i-1]; }
if(fitter[right][i]==400 && fitter[right][i-1]!=400)
{ fitter[right][i]=fitter[right][i-1]; }


// goto skipper;

GetBoundary(left); // Line fitting left boundary

GetBoundary(right); // Line fitting right boundary

for(int h=MyHeight/2-100; h<=MyHeight-100; h++)


x[left]=al[left]+bl[left]*h;
wall[left][h]=x[left]-50;

if(x[left]>0 && x[left]<400)

if(brick[left][0]==-1)

brick[left][0]=wall[left][h];

else

brick[left][1]=wall[left][h];



x[right]=al[right]+bl[right]*h;
wall[right][h]=x[right]+60;

if(x[right]>0 && x[right]<400)

if(brick[right][0]==-1)

brick[right][0]=wall[right][h];

else

brick[right][1]=wall[right][h];



if(wall[left][h]>=0 && wall[right][h]<=MyWidth)









AUTONOMOUS VEHICLE GUIDANCE
FOR CITRUS GROVE NAVIGATION


















By

VIJAY SUBRAMANIAN


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2008





















A B
Figure 3-74. Ladar and motor assembly for sweeping. A) view 1. B) view 2.

The motor was attached to the ladar body such that when the motor shaft rotated, the ladar

was oscillated up and down about the horizontal plane. The ladar was swept from an angle of

15deg. above the horizontal to 30deg. below the horizontal at a speed of approx. 15deg/s (Figure

3-75). The range of angles was chosen such that the ladar sweep covered only the tree locations.

Angles beyond 15 degrees above the horizontal resulted in the ladar sweeping over the trees and

the angles beyond 30 degrees below the horizontal resulted in the ladar sweeping the ground.

The angles were also limited by the range of the mechanism linking the motor to the ladar. The

speed of rotation was a limit of the servo motor.


















Figure 3-75. Illustration of the ladar sweep









Desired position = (Right side tree boundary + Left side tree boundary) / 2, for non-shadow images (Eq. 3-11)

Desired position = Right side tree boundary - Fixed distance, for images with shadow (Eq. 3-12)

Error = Desired position - Current position (measured in pixels) (Eq. 3-13)

Pixel distance to world distance conversion. It is necessary to determine the path errors

in real world distances by converting pixel distances to real world distances. The camera is

positioned at a fixed angle, and so the look-ahead distance is fixed. A meter ruler was placed in

front of the tractor in the camera's field of view. From the pixel length of the ruler seen in the

image, the pixel to world conversion was determined as

100 cm = 177 pixels

However, an object in the lower half of the image looks bigger than if it were in the upper

half of the image. A correction factor of 10 cm is necessary depending on where the distance is

measured in the image. This variation was taken into account in the algorithm.

Error in position = Error in pixels x 100/177 (cm) (Eq. 3-14)
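
A one-line helper corresponding to Eq. 3-14 follows. How the up-to-10 cm row-dependent correction is applied is not spelled out on this page, so the optional argument is only a placeholder for illustration.

// Convert a pixel error to centimeters using the calibration 100 cm = 177 pixels.
// rowCorrectionCm: placeholder for the correction that depends on where in the
// image the distance is measured (up to about 10 cm).
double pixelErrorToCm(double errorPx, double rowCorrectionCm = 0.0)
{
    return errorPx * 100.0 / 177.0 + rowCorrectionCm;   // Eq. 3-14
}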

The real distance error information is then passed to the control system implemented in the

microcontroller. The control system uses this error to steer the vehicle to the center of the path.

The flowchart of the overall vision algorithms is given in Figure 3-50.

The vision algorithm, that is, the process of getting an image and processing it to obtain the

path navigation error, operates at a rate of about 15 to 20 Hz.



















Figure 6-4. Fuzzy logic for sensor supervisor. A) Fuzzy logic architecture. B) Membership
functions.

The input membership function relates the input x to the linguistic variables reasonable,

unreasonable and zero (Figure 6-4 B). The meaning of the linguistic variables is literally whether

the distance from the tree row is reasonable or not. A zero membership for the ladar indicates the

presence of an obstacle in the path and, for vision, the end of the row. x maps to zero for 0

to 1 m, to reasonable for 1 m to 3 m, and to unreasonable for any value greater than 2 m. The ranges were

intuitively chosen to be the possible values. Normalization of the linguistic variables was

performed at a scale from 0 to 1. During the presence of an obstacle, the ladar takes precedence

till the obstacle is avoided. The rule set for the supervisor is shown in Table 6-1. Each rule

received a corresponding normalized membership value from the input membership function x.

Based on the linguistic variables chosen from each x, the corresponding rules are fired. From the

rules fired for a given instant, the output membership function is chosen. The membership value

was used to find the crisp value for the output. The output membership function decision is

shown in Figure 6-4 B, which relates the rules fired to the linguistic variables vision, ladar and

both. These variables represent the decision made as to whether vision or ladar is given higher

priority or if both should be given equal priority. Defuzzification was done using the center of

gravity method. The crisp values of the decision, is taken as the measurement noise covariance

value for that sensor. For example, if a decision of ladar is obtained with a crisp value of 1, cr,










The image size obtained from the video camera is 640 pixels by 480 pixels. The image was not resized. It is an RGB image with three values for each pixel, corresponding to the red, green and blue values. The image is mapped with the bottom left corner as the reference; this reference is based on the Windows bitmap standard adopted in the software. Figure 3-36 shows the image coordinates used in the algorithm.

Figure 3-36. Image Coordinates

The vision algorithm is required to work effectively in real time. The operating speed of the algorithm depends on the speed of the vehicle. For this project, only one PC with a 2.4 GHz processor was used to run all the high level algorithms. It was also anticipated that ladar based segmentation would be required to run in parallel with vision, and that in the future further algorithms for sensor fusion, headland turning, path planning and obstacle detection would also be implemented. Therefore, to run all these algorithms in parallel on a single processor and to have a real time vision algorithm, color based segmentation was selected as the method of segmentation, as it has been found to work well in other research for navigating tractors through crop rows. Color based segmentation is also relatively fast compared to more complex methods such as texture based classification.

Thresholding. Thresholding was the first process in the vision algorithm. No preprocessing such as filtering was applied to the base image, because preprocessing increased the processing time without offering any significant improvement in segmentation.
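The sketch below illustrates the kind of per-pixel colour test meant here; the specific channel combination and threshold value are assumptions for illustration, not the thresholds actually used in the dissertation.

// Minimal sketch of colour-based thresholding of a packed RGB buffer (640 x 480,
// bottom-left origin as described above) into a binary mask. The intensity test and
// threshold value are illustrative assumptions.
#include <vector>

std::vector<unsigned char> thresholdImage(const unsigned char* rgb,
                                          int width, int height,
                                          unsigned char threshold)
{
    std::vector<unsigned char> mask(width * height, 0);
    for (int i = 0; i < width * height; ++i)
    {
        unsigned char r = rgb[3 * i + 0];
        unsigned char g = rgb[3 * i + 1];
        unsigned char b = rgb[3 * i + 2];
        // Classify the pixel from its colour alone; no filtering is applied beforehand.
        int intensity = (r + g + b) / 3;
        mask[i] = (intensity > threshold) ? 255 : 0;   // 255 = boundary class (hay bale/tree)
    }
    return mask;
}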


















Figure 5-13. Performance of the machine vision guidance in the straight path. A) 1.8 m/s. B) 3.1 m/s. C) 4.4 m/s.

Figure 5-14 shows the performance of the laser radar guidance system at the three different speeds. At each speed, the PID control drives the error around zero. The average error at the slowest speed (1.6 cm) is almost half the error at the highest speed (3.1 cm). Laser radar is a more accurate measuring device than machine vision, resulting in lower tracking errors. The machine vision algorithm was run at 20 Hz, whereas the laser radar refresh rate was only about 6 Hz; at the higher speeds this limits the laser radar's performance, while the machine






vision guidance system is not severely affected. By using a high speed serial communication card (500 kBaud), the laser radar refresh rate can be increased, and therefore its performance at speeds of 4.4 m/s and higher can be improved. Nonetheless, the RMS error was only 3.5 cm at the fastest speed.












For the explanation of the ladar based path finding, consider the ladar mounted on the tractor and scanning the hay bale track in front of the tractor from 0 to 180 degrees at 0.5 degree increments. Figure 3-52 shows the radial distance as a function of the angle obtained from the ladar when scanning the test path with hay bale boundaries. The two major discontinuities, or depressions, in the plot correspond to the radial positions of the hay bales with respect to the ladar.

Error Calculation. To determine the exact location of the hay bales, a data window of length 4 distance measurements is moved across the distance data from 0 to 180 degrees. When all four positions in the window encounter a drop below a threshold value, it is decided that there is a hay bale corresponding to the left boundary. After finding one hay bale, the window is moved until it finds another hay bale, which corresponds to the right side boundary. The sliding window also eliminates any point noise. Since the ladar is placed at the center of the tractor, the distance of the tractor center from the hay bales can be calculated as

Distance = Radial distance at the hay bale × cos(Angle at that point) (Eq. 3-15)

Using the above formula, the error of the tractor position in the path is calculated as

Error = (Distance from right boundary − Distance from left boundary) / 2, in cm (Eq. 3-16)
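A sketch of the sliding-window boundary search and Eqs. 3-15 and 3-16 is given below; the names, window handling and threshold argument are illustrative assumptions rather than the dissertation's code.

// Sketch of the 4-sample sliding-window search over one ladar scan (0-180 deg at
// 0.5 deg increments, 361 radial readings). A boundary is declared when four
// consecutive readings drop below a threshold.
#include <cmath>
#include <cstddef>

const double DEG2RAD = 3.14159265358979323846 / 180.0;

// Returns the lateral distance (cm) from the ladar to a boundary found at or after
// 'start'; 'found' reports success and 'indexOut' where the search stopped.
double findBoundary(const double radial[], std::size_t n, std::size_t start,
                    double thresholdCm, bool& found, std::size_t& indexOut)
{
    for (std::size_t i = start; i + 4 <= n; ++i)
    {
        bool allBelow = true;
        for (std::size_t k = 0; k < 4; ++k)
            if (radial[i + k] >= thresholdCm) { allBelow = false; break; }
        if (allBelow)
        {
            found = true;
            indexOut = i;
            double angleDeg = 0.5 * i;                          // scan angle at the hit
            return radial[i] * std::cos(angleDeg * DEG2RAD);    // Eq. 3-15 (lateral distance)
        }
    }
    found = false;
    return 0.0;
}
// Eq. 3-16: error = (distance from right boundary - distance from left boundary) / 2,
// computed after the left boundary is found first and the search is continued for the right.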

As with the vision algorithm, the error information determined above is sent to the control

system implemented in the microcontroller using RS232. Using this error, the steering control

system tries to get the tractor back to the center of the path. The flowchart of the ladar based

guidance algorithm is shown in the Figure 3-53. The operating speed of the process of collecting

the ladar data and processing the data to find the path navigation error is about 7Hz. This speed

is limited by the RS-232 serial communication standard.









As mentioned earlier, the ladar refresh rate is approximately 38 Hz. From the motor sweep speed and the ladar refresh rate, the data available per degree are approximately

38 Hz / 15 deg/s ≈ 2.53 ≈ 2 full ladar scans per degree of sweep

Therefore, a ladar scan is obtained roughly every half a degree at this sweep rate of the motor. In this sweeping ladar algorithm, the motor is allowed to execute one sweep of the ladar from up to down. All the data obtained from the ladar during this period are stored in an array. Using the calculation above, the ladar scans are then matched with the 0.5 degree positions of the sweep angle. One sweep of the ladar by the motor at 15 deg/s takes approximately 5 s. Synchronizing the ladar with the sweep angle would consume a longer time; that is, if the motor were stepped in 0.5 degree increments and a full ladar scan were acquired at each angle, the time consumed to execute one full sweep would be much higher. The simple allocation of scans to sweep angles eliminates the need to synchronize the sweep angle positions with the ladar scans and thereby saves time. It was also experimentally observed that the error introduced by the 0.5 degree allocation of scans is not significant enough to warrant synchronization.
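The fixed allocation of scans to sweep angles might be sketched as follows; the start-angle handling and names are illustrative assumptions.

// Sketch of the scan-to-angle allocation: with about 2.5 scans per degree of sweep,
// each stored scan is simply assigned the next 0.5 deg position of the sweep, starting
// at +15 deg above the horizontal and moving downward.
double sweepAngleForScan(int scanIndex)
{
    const double startDeg = 15.0;        // sweep starts above the horizontal
    return startDeg - 0.5 * scanIndex;   // one scan allocated per 0.5 deg of sweep
}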

As mentioned earlier, the data obtained from the ladar are in polar coordinates with the ladar as the origin. For this algorithm, it is convenient to use data in Cartesian coordinates. Consider the coordinate system shown in Figure 3-76. Let P be a data point at a radius R from the origin, subtending angles m and n with the z and y axes respectively. To convert this to the Cartesian coordinate system, the following equations were used.

Px = −R cos(n) cos(m) (Eq. 3-41)
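The sketch below shows one conventional way of carrying out such a polar-to-Cartesian conversion for a swept-ladar sample; the axis convention and names are assumptions and may not match the dissertation's Eqs. 3-41 through 3-43 exactly.

// Generic conversion of one swept-ladar sample to Cartesian coordinates, assuming
// 'scanAngleDeg' is measured in the ladar's scan plane (0-180 deg) and 'tiltAngleDeg'
// is the sweep angle above (+) or below (-) the horizontal. Convention is illustrative.
#include <cmath>

struct Point3D { double x, y, z; };

Point3D toCartesian(double radiusCm, double scanAngleDeg, double tiltAngleDeg)
{
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    double scan = scanAngleDeg * kDegToRad;
    double tilt = tiltAngleDeg * kDegToRad;
    Point3D p;
    p.x = radiusCm * std::cos(tilt) * std::cos(scan);  // lateral offset from the ladar
    p.y = radiusCm * std::cos(tilt) * std::sin(scan);  // distance ahead of the ladar
    p.z = radiusCm * std::sin(tilt);                   // height relative to the ladar
    return p;
}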










* Exhibit long-term sensible behaviors
* Capable of receiving instructions and communicating information
* Capable of coordinating with other machines
* Capable of working collaboratively with other machines
* Behave in a safe manner, even when partial system failures occur
* Carry out a range of useful tasks

These requirements are also applicable to vehicles navigating citrus groves. While current

research has not progressed to the level of meeting all these criteria, there has been significant

progress. In addition to these criteria, the following requirements may also be expected

* Low cost
* Attractive for a farmer to adopt
* Off-road vehicle performance and reliability

An autonomous vehicle is expected to possess the following basic components to be

operational: vehicle platform, steering system, guidance system, sensors and a control system.

Guidance by Remote Control

Remote control or tele-operation has already been used in several branches of robotics.

Visual information of the operating environment is relayed to a remote location where the

operator controls the vehicle. Murakami et al. (2004) used a CCD camera, a gyroscope and a

GPS as the sensors and wireless IP protocol for communication, for tele-operation of an HST

drive agricultural vehicle. A joystick was used for tele-operation.










Figure 2-1. Remote controlled vehicle (Murakami et al., 2004). A) Teleoperated vehicle. B) User interface.






























Figure 3-53. Ladar algorithm flowchart

Serial communication between PC and microcontroller

RS 232 serial communication was used to send error information from the PC to the

microcontroller based on the calculations from vision and ladar algorithms. Communication was









The switch-back turn is implemented similarly to the u-turn function. The difference is that the initial turn in the opposite direction is not made, and after the vehicle has turned by 90 degrees with reference to the initial saved angle, the vehicle is reversed for a specified distance before the turn is continued. The 'Gator.Write' command sends the data to the low level controller using serial communication.

Open Field Navigation and Obstacle Detection

The development of the open field navigation and obstacle detection algorithms were

discussed in the Chapter 3. The algorithms are implemented in the PC in the Visual C++

environment. The open field navigation uses a GPS based navigation algorithm for steering the

vehicle whereas the obstacle detection algorithm is based on the ladar and is used to stop the

vehicle when an obstacle is present. The important sections of the code are presented in

Appendix G.

Open Field Navigation

In the GPS based navigation, the algorithm determines how much the vehicle has to be steered to reach a way point. The Pure Pursuit steering discussed earlier uses error information in pixel coordinates to compute the steering angle of the vehicle. Therefore, the GPS based navigation algorithm converts the required steering into error information and passes it on to the Pure Pursuit steering algorithm in the PC104 computer. The software is implemented as follows:

1. The way point file is opened and read to acquire the latitude and longitude coordinates of the
way points. The values are stored in an array. The file is then closed. This is implemented in
the function ReadGpsCoord().

2. To navigate the way points, the function GPSnavigate() is called. The function includes a loop which runs until all the way points have been reached.

3. If all the points have not been reached, the distance to the way point, 'GPSdistance', and the
required heading of the vehicle 'GPSbearing' are first calculated.










* If the window in a location consists of more than 5 data points, note the location of the
window

* The location indicates the presence of a tree or a part of a tree

If two consecutive tree-containing locations differ by more than 250 cm, the difference is due to the presence of the headland. It is then determined that the headland starts just after the last tree location prior to this gap. Based on the location of the trees in the alleyway, the exact location of the headland is determined. The algorithm and clustering ensure that the headland is found precisely, and the determination is accurate to less than 50 cm beyond the last tree in the alleyway. The result of clustering and headland determination is shown in Figure 7-15.














Figure 7-15. Tree clustering and headland determination
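The clustering and headland-gap test described above might be sketched as follows; the grid representation, counts and names are illustrative assumptions rather than the dissertation's code (an excerpt of the actual code appears in the appendix).

// Sketch of the clustering step: the data space is assumed to be a 2-D occupancy grid
// of 1 cm x 1 cm bins; a 50 cm x 50 cm window moved in 50 cm increments flags tree
// locations, and a gap of more than 250 cm between consecutive clusters marks the headland.
#include <cstddef>
#include <vector>

struct Cluster { int xCm; int yCm; };   // window location (cm) flagged as containing a tree

std::vector<Cluster> clusterTrees(const std::vector<std::vector<int> >& binCounts)
{
    std::vector<Cluster> clusters;
    const std::size_t win = 50;
    for (std::size_t y = 0; y + win <= binCounts.size(); y += win)
        for (std::size_t x = 0; x + win <= binCounts[y].size(); x += win)
        {
            int points = 0;
            for (std::size_t dy = 0; dy < win; ++dy)
                for (std::size_t dx = 0; dx < win; ++dx)
                    points += binCounts[y + dy][x + dx];
            if (points > 5)                              // more than 5 points => tree (or part of one)
            {
                Cluster c;
                c.xCm = static_cast<int>(x);
                c.yCm = static_cast<int>(y);
                clusters.push_back(c);
            }
        }
    return clusters;
}

// Headland test on clusters ordered along the travel direction (y).
bool headlandStart(const std::vector<Cluster>& alongRow, int& startYcm)
{
    for (std::size_t i = 1; i < alongRow.size(); ++i)
        if (alongRow[i].yCm - alongRow[i - 1].yCm > 250)
        {
            startYcm = alongRow[i - 1].yCm;              // headland starts after the last tree
            return true;
        }
    return false;
}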

Vision and ladar based headland determination

The vision based headland detection and the sweeping ladar based establishment of the headland were described above. As mentioned, the sweeping ladar process takes approximately 5 s to execute a sweep, so it is not practical to execute the sweep while navigating the entire row. Therefore, the sweep is started only when the vision based algorithm detects the headland; the sweeping ladar algorithm then takes over, and dead reckoning is no longer required. In further tests conducted in the grove using the above algorithms, it was also found that a stationary ladar mounted at an angle of 10 degrees to the horizontal is able to detect four trees, similar to the









It was decided that the tests could be carried out in an open area on the university campus. Hay bales were used to form an artificial boundary of a path and tests were conducted there. Hay bales were selected because it was expected that the tractor might not navigate very well during the initial stages and might run over any artificial boundary; the hay bales provided a compliant, yet three-dimensional, boundary that could be sensed by the vision and ladar sensors.

Computer vision algorithms had to be developed to segment the path from the surroundings. Two algorithms had to be developed, one for the citrus grove environment and one for testing with hay bales. As discussed earlier, the camera was mounted on top of the tractor at an angle of 45° with the horizontal. The camera was pointed just in front of the tractor hood, which aided in looking right in front of the tractor. The height of the camera gave a good field of view for the vision algorithms. The mounting of the camera and ladar is shown in Figure 3-34.


Figure 3-34. Camera and Ladar mounted on top of the tractor cabin










Experimental Procedures


Experiments in the Test Track

All previous experiments with the autonomous vehicle guidance system were conducted using the tractor as the vehicle. In this stage of the project, the guidance system was adapted for use with a John Deere eGator as the vehicle. The performance of the guidance system on the eGator had to be tested first. Therefore, it was decided that guidance performance tests would be conducted in a test track before testing in an actual grove. A test track was constructed in an open area near Lake Alice on the University of Florida campus. The track consisted of boundaries on either side of a path. The path boundaries were constructed using black flexible plastic cylindrical containers. The test track is shown in Figure 7-16, with the eGator positioned in the path. Gaps were incorporated in the path boundaries to mimic the gaps between citrus trees.














Figure 7-16. Vehicle in the test track

Although this test track does not ideally mimic grove conditions, it is a way to ensure reasonable path navigation ability of the guidance system on the vehicle prior to testing in the grove. If testing were conducted directly in a citrus grove and the guidance system failed, damage to trees could occur. Testing in test tracks ensures reasonable operation of the guidance system and minimizes the probability of accidents in the grove; the boundaries of the test path are also relatively inexpensive compared to trees. The choice of black objects for the boundary is to aid










Passino, K.M., and S. Yurkovich. 1998. Fuzzy Control. California, USA: Addison Wesley Longman, Inc.

Paul, A.S., and E.A. Wan. 2005. Dual Kalman filters for autonomous terrain aided navigation in unknown environments. In: Proc. IEEE International Joint Conference on Neural Networks, 5: 2784-2789.

Petrinec, K., Z. Kovacic, and A. Marozin. 2003. Simulator of multi-AGV robotic industrial environments. In: Proc. IEEE International Conference on Industrial Technology, Vol. 2: 979-983.

Qiu, H., Q. Zhang, J.F. Reid, and D. Wu. 2001. Modelling and simulation of an electrohydraulic steering system. International Journal of Vehicle Design, Vol. 26, pp. 161-174.

Rasmussen, C. 2002. Combining laser range, color, and texture cues for autonomous road following. In: Proc. IEEE International Conference on Robotics and Automation, Vol. 4: 4320-4325.

Reid, J.F. 2004. Mobile intelligent equipment for off-road environments. In: Proc. Automation Technology for Off-road Equipment, Kyoto, Japan.

Rouse, B., and S. Futch. 2004. Start Now to Design Citrus Groves for Mechanical Harvesting. HS974, IFAS Extension, University of Florida, Gainesville, FL.

Rovira-Mas, F., Q. Zhang, J.F. Reid, and J.D. Will. 2002. Machine vision row crop detection using blob analysis and the Hough transform. In: Proc. Automation Technology for Off-Road Equipment, Chicago, Illinois. Paper 701P0509, pp. 327-336.

Runkler, T., M. Sturm, and H. Hellendoorn. 1998. Model based sensor fusion with fuzzy clustering. In: Proc. IEEE International Conference on Fuzzy Systems, 2: 1377-1382.

Sasiadek, J.Z., and Q. Wang. 1999. Sensor fusion based on fuzzy Kalman filtering for autonomous robot vehicle. In: Proc. IEEE International Conference on Robotics and Automation, 4: 2970-2975.

Senoo, S., M. Minor, and S. Funabiki. 1992. Steering control of automated guided vehicle for steering energy saving by fuzzy reasoning. In: Proc. IEEE Industry Applications Society, Vol. 2, pp. 1712-1716.

Shin, B.S., S.H. Kim, and J.U. Park. 2002. Autonomous agricultural vehicle using overhead guide. In: Proc. Automation Technology for Off-Road Equipment, Chicago, Illinois. Paper 701P0509, pp. 261-269.

Stombaugh, T.S., E.R. Benson, and J.W. Hummel. 1999. Guidance control of agricultural vehicles at high field speeds. Transactions of the ASAE, Vol. 42(2).










used to perform this update for two parameters of Q, namely qx and qθR. The fuzzy logic system is shown in Figure 3-61.




Figure 3-61. Fuzzy logic implementation to correct divergence

The xc, xL and θc parameters of the innovation vector z' are the inputs to the fuzzy logic system. Among the various parameters used in the filter, these are the three whose process noise is less precisely known, as they depend entirely on the algorithms developed. The other two parameters, the heading from the IMU and the velocity, are obtained directly from the sensors, so their process noise is not expected to change significantly during operation. The outputs of the fuzzy logic system are the updates to Q, namely qx and qθR, which correspond to the path error and the required heading.




Figure 3-62. Input membership function

Figure 3-62 shows the input membership function. The input ranges from -15% to +15% of the change of the innovation vector from zero. These maximum and minimum values were chosen by trial and error in simulation. The inputs are converted to the linguistic variables negative and positive as shown, which indicate whether the input is negative or positive. Normalization of the











3-37 R, G, B color plots of hay bales (X axis: Intensity, range 0-255; Y axis: no. of pixels)

3-38 R, G, B color plots of grass path (X axis: Intensity, range 0-255; Y axis: no. of pixels)

3-39 Thresholded image

3-40 Cleaned image

3-41 Boundary isolated image

3-42 Lines fit to the boundaries

3-43 Machine vision algorithm flowchart

3-44 Typical images for navigation in citrus grove

3-45 Picture collection in the grove

3-46 R, G, B intensity plot for ground and trees in grove alleyway

3-47 Images and their green intensity profile across the center of the image

3-48 Thresholding. A) Shadow image...

3-49 Classified images without and with shadows

3-50 Machine vision algorithm flowchart

3-51 Calculating the required heading of the vehicle

3-52 Radial distance plotted against angle (180 degrees at 0.5 degree increments)

3-53 Ladar algorithm flowchart

3-54 Serial protocol

3-55 Illustration of Kalman filter operation

3-56 Vehicle in the path with the state vector variables

3-57 Fuzzy logic implementation

3-58 Membership function for linguistic variables

3-59 Crisp output membership function

3-60 Illustration of inference for the example










speeds. Stombaugh et al. (1999) have found that the location of a GPS receiver when used as a

position sensor is critical for calculating the phase in frequency response tests of the control

system. They found that mounting a GPS receiver above the front wheels gives more accurate

navigation, whereas most research results report mounting a GPS receiver on top of the cab.

GPS based guidance alone cannot be used for positioning in citrus applications, because it gives errors when the vehicle moves under tree canopies, which block the satellite signals to the GPS receiver. Moreover, a system using GPS for guidance requires that a predetermined path be given to the computer to follow, which requires initial mapping. Hence, when a GPS receiver is the sole positioning sensor, the vehicle cannot be used instantly in any grove; significant time has to be spent in mapping its path.

Machine Vision for Autonomous Navigation

Machine vision is the ability of a computer to "see." A machine-vision system employs one

or more video cameras for obtaining images for the computer to interpret. Machine vision is one

of the popular sensing technologies for robotics applications today. Machine vision technology

can be used to automatically guide a vehicle when the path to be traversed is visually

distinguishable from the background. Vision guidance has the advantage of using local features

to adjust the vehicle navigation course.

The vision system's performance in guiding the vehicle is comparable to highly accurate

laser radar (Burks et al., 2004). Typical applications in vehicle guidance include guiding tractors

for crop cultivation and guiding an autonomous vehicle in a greenhouse. This sensing system has

been found to work well for fully grown crops, but it has not performed well for crop harvesting

with low or sparse crops.










In the grove, alleyways were chosen for conducting the headland turning experiments. The

alleyways were chosen such that the last set of trees in the rows had good canopy which could be

detected using the ladar. In a chosen alleyway, the vehicle was started approximately in the

middle of the alleyway in autonomous mode. The type of turn, i.e. u-turn or switch back turn, to

be executed at the end of the alleyway and the direction of turn, i.e. right turn or left, were

specified in the program (Figure 7-19).

Figure 7-19. Headland turning maneuvers in the grove. A) U-turn. B) Switch-back turn.

The vehicle was allowed to navigate the rest of the alleyway using sensor fusion based steering,

detect the headland using the ladar, execute the specified turning maneuver in the headland using

the algorithm described above, get into the next row and start navigating the next row. The

vehicle was manually stopped after navigating a distance of approximately 20m in the new row.

The path that the vehicle took in the headland was marked using markers placed manually at equal angular intervals of 22.5 degrees, with the tree stump as the center. This resulted in a total of 9 markers representing the path taken by the vehicle in the headland. The marker locations are numbered in Figure 7-19. Two additional markers were used for the switch-back turn to mark the extreme








































Figure 3-49. Classified images without and with shadows

In Figure 3-49, the left image is a segmented non-shadow image and the image on the right is a segmented shadow image. Based on the positioning of the camera, it is observed that the lower half of the image contains information corresponding to a distance of 10 ft to 20 ft in front of the vehicle. It was decided that the useful information for keeping the vehicle in the center of the path is in this lower half of the image. Hence processing is restricted to the lower half of the image, which reduces the processing time considerably.

Error calculation. The camera is mounted at the center of the vehicle's chassis. The current lateral position of the vehicle therefore corresponds to the image center; that is, the image coordinate of the current position is image width / 2 = 320.










As this method is analogous to human driving, we can choose a predetermined look-ahead distance 'p'. For this project, the look-ahead distance was chosen to be 2.89 m from the front of the vehicle. For alleyway navigation, 'q' is the error of the vehicle from the path center at the look-ahead distance 'p'. For non-alleyway navigation, 'q' can be the required lateral movement of the vehicle at the look-ahead distance. The value of 'q' is obtained from the machine vision and ladar path segmentation algorithms. Using these values of 'p' and 'q', R can be calculated from the above equation. R is the required radius of curvature for the vehicle to reach the required point. In the CAN based steering system of the eGator, steering has to be specified as steering angles and not as a radius. Therefore, R has to be converted to a steering angle.
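The geometric relation referred to above is commonly written as R = (p^2 + q^2) / (2q); the sketch below assumes that standard Pure Pursuit form rather than quoting the dissertation's equation.

// Minimal sketch of the Pure Pursuit geometry under the standard relation
// R = (p^2 + q^2) / (2 q), where p is the look-ahead distance and q the lateral
// offset of the goal point at that distance (both in metres).
#include <cmath>
#include <limits>

double purePursuitRadius(double p, double q)
{
    if (std::fabs(q) < 1e-6)                        // goal straight ahead: drive straight
        return std::numeric_limits<double>::infinity();
    return (p * p + q * q) / (2.0 * q);             // signed: the sign of q sets the turn side
}

With p fixed at 2.89 m, q would come from the machine vision or ladar path error at the look-ahead distance.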

A calibration experiment was conducted to find a relationship between the radius of curvature and the steering angle. In the experiment, the eGator was taken to an open area. The steering angle was kept constant by sending CAN commands to the steering controller, and the vehicle was moved forward by manually stepping on the accelerator. For a constant steering angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner wheel radius at different locations, and the diameter of the circle was manually measured using a tape measure. The steering angle was varied to obtain different radii of curvature. The experiment is illustrated in Figure 7-2 and the experimental results are shown in Table 7-1.





Vehicle






Figure 7-2. Illustration of experiment for steering angle calibration with radius of curvature













Figure 6-7. Hay bale track profile. A) Dimensions. B) Photo.

Experiments were conducted to check the robustness and accuracy of the fused guidance system in the test path at 1.8 m/s and 3.1 m/s. The vehicle was manually navigated to the starting position at one end of the test path. Once positioned, the vehicle was started in autonomous control and allowed to navigate down the path to the other end, thereby navigating a distance of 53 m. Three repetitions were performed for each vehicle speed to ensure replication, resulting in a total of six trial runs. The path traveled was marked on the soil by a marking wheel attached to the rear of the tractor. The error measurements, along the vehicle centerline, were taken manually using a tape measure at regular intervals of 1 m. This interval was chosen to get enough samples to correctly estimate the path error; measurements at smaller intervals were redundant.

After achieving satisfactory performance in the test path, wherein the vehicle stayed close to the center of the path without hitting the hay bales, tests were conducted in orange grove alleyways at the Pine Acres experimental station of the University of Florida. The average alleyway path width was approximately 3.5 m and the length of the alleyway was approximately 110 m, with about 30 trees on either side. A typical alleyway is shown in Figure 6-8, where the approximate tree height is 1.5 m to 2 m. Figure 6-9 shows the field of view of the camera and the ladar when navigating the citrus grove alleyway.









CHAPTER 1
INTRODUCTION

This dissertation is about the development of an autonomous vehicle guidance system and methods for navigating citrus groves. The need for automation in citrus groves is first introduced. The objectives of this work in addressing that need are then specified. An overview of research done by others in similar areas is given. The details of the autonomous vehicle guidance system and algorithm development are discussed. Finally, the results of the research are presented and areas of further improvement are indicated.

Florida Citrus

Citrus is a genus of flowering plants that commonly includes orange, tangerine, grapefruit, lime, lemon and their many varieties. Florida, where this work was done, supplies more than 80% of the citrus in the USA, and citrus is today a $9 billion industry (Hodges et al., 2001). There are also other major citrus producers in the world, such as Brazil, Mexico, Israel and Italy, and a large number of people in the world consume citrus; the market for citrus is therefore huge. Citrus area per farm in Florida was around 0.5 sq. km in 1997 (Hodges et al., 2001), and citrus is currently the state's largest agricultural product. Florida is also the world's leading producer of grapefruit and is the second largest producer of oranges in the world, following Brazil. Ninety-five percent of all Florida oranges are squeezed into orange juice (Brown, 2002). The total economic impact of the citrus industry on the state of Florida exceeds $8 billion annually.

Automation and Robotics in Citrus Groves

Over the years, large scale farms have been replacing the smaller ones. More people are moving to other occupations, so one farm owner replaces several owners, which leads to larger fields to plant and harvest. Laborers often work for several weeks in a farm, so facilities have to









adaptable to the grove conditions. Filtering of the ladar data was required due to uneven tree canopy.

The experiments using independent machine vision and ladar for navigating the grove alleyways provided insights into the capability of these sensors. Machine vision had a long field of view, and navigation was not as accurate as with the ladar. At the end of the alleyways, machine vision capability was lost due to the longer field of view beyond the alleyway. Ladar had a shorter and more accurate field of view, but was inaccurate for short durations in the absence of a tree in the alleyway. To utilize the advantages of both sensors, fusion algorithms were developed to fuse the information from these two sensors. A Kalman filter was chosen as the primary fusion method. Fuzzy logic was used to select the appropriate sensor at the end of the alleyway and when missing trees are encountered. An inertial measurement unit and a speed sensor were added to aid the fusion process. Experiments were conducted in test tracks to ensure proper operation of the guidance system, and then in citrus grove alleyways. Fusion based guidance was found to perform better than individual sensor based guidance, and navigation along the center of the path was more accurate.

From the experiments using the tractor, it was found that transporting the tractor to the grove for testing was expensive. To further the abilities of the vehicle, speed control, reversing and start-stop capability were important. A golf cart type vehicle, the eGator, was donated to the research by John Deere. The eGator was an electric vehicle suited for easy control of the steering and speed and for adding start-stop capability. It was also convenient to transport. However, documentation on the vehicle was minimal, and the vehicle was reverse engineered to be used for autonomous operation. The entire guidance system was adapted for the










From the previous sections, it is known that the image width used for vision is 400 pixels. To maintain the communication protocol with the PC104 computer algorithm, the range of -30 degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates:

Pixel value = 400 × (steering angle / 60) (Eq. 8-7)

This pixel value of the required steering is sent to the Pure Pursuit steering control.
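Read together with the stated mapping of -30 to +30 degrees onto 0 to 400 pixels, Eq. 8-7 implies an offset of 200 pixels at the centre of the range; the sketch below assumes that reading, and the clamping is an editorial addition.

// Sketch of the angle-to-pixel mapping assumed from the stated range (-30 deg .. +30 deg
// mapped onto 0 .. 400 pixels). The 200-pixel centre offset is an editorial assumption
// consistent with that range, not a quotation of Eq. 8-7.
int steeringAngleToPixel(double steeringDeg)
{
    if (steeringDeg > 30.0)  steeringDeg = 30.0;    // clamp to the mapped range
    if (steeringDeg < -30.0) steeringDeg = -30.0;
    return static_cast<int>(400.0 * steeringDeg / 60.0 + 200.0);
}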

As described in the algorithm above, when suitable steering is executed, the distance between the vehicle and the way point reduces and the vehicle reaches the way point. The accuracy of the DGPS receiver is 12 inches. During initial development of the algorithm, the vehicle was determined to have reached the way point when the distance between the vehicle and the way point was zero. It was observed that uncertainties in the accuracy of the receiver, terrain variation and small inaccuracies in steering often result in the vehicle reaching a point very close to the way point, of the order of centimeters away from it. In such cases, if the algorithm requires that the distance be zero, then to translate the vehicle by a few centimeters to the way point, the vehicle has to execute circles or spirals to get to the way point. This is illustrated in Figure 8-3.










Figure 8-3. Path taken for translation of the vehicle to the way point









caveat with the vision based headland detection is that the headland is not always composed

entirely of trees or open space. A sample image is shown in the Figure 7-9.


















Figure 7-9. Sample image where the vision based headland detection can fail

In the above image, a single tree is present in the headland along with an artificial object. In such a case, the vision based algorithm fails to precisely detect the headland. The presence of random objects makes the vision algorithm less accurate at determining the distance to the headland. In the previously described method, the vision based headland detection with dead reckoning helped the vehicle reach the last tree of the alleyway. With the failure modes of the vision based detection, the estimate of the headland may not be enough to reach the last tree, or may take the vehicle beyond the last tree. To overcome this problem, a ladar based algorithm is described as follows.

Precisely establishing end of the row using sweeping ladar

The currently used mount angle of the ladar is such that it is able to detect only the closest

tree to the vehicle. It was proposed that if the ladar is rotated through a range of angles, many

more trees could be detected in front of the vehicle. The precise distance measuring capability of

the ladar would help in establishing the end of the row. In this method, a digital servo motor










Figure 3-7. Servo valve showing the main parts (Source: Sauer Danfoss PVG 32 valve documentation)










// Appendix excerpt, reconstructed from a badly garbled extraction. Brace placement and
// the lines marked "assumed" are editorial reconstructions, not verbatim source.
// meanlocationx/meanlocationy hold the clustered tree locations (cm), 500 is the lateral
// centre of the data space and 'turn' the detected headland location.
if (i > 1)
{
    // A gap of 250 cm or more between consecutive clusters marks the headland
    if ((meanlocationy[i] - meanlocationy[i-1]) >= 250)
    {
        turn = meanlocationy[i-1] + 51;
        break;
    }
}

leftmax = 500; rightmax = 500;
while (meanlocationy[i] + 50 <= j)                     // clusters within the current band
{
    if (meanlocationx[i] + 50 < 500)                   // left side tree
    {
        if (leftmax < 500 && leftmax < meanlocationx[i] + 50)   // condition assumed
        { leftmax = meanlocationx[i] + 50; }
        else if (leftmax == 500)
        { leftmax = meanlocationx[i] + 50; }
    }
    else if (rightmax > 500 && meanlocationx[i] < rightmax && meanlocationx[i] > 500)  // right side
    { rightmax = meanlocationx[i]; }
    else if (rightmax == 500 && meanlocationx[i] > 500)
    { rightmax = meanlocationx[i]; }

    if (turn == 0 && i < m)                            // condition assumed
    { i = i + 1; }                                     // body assumed: advance to the next cluster
    else
    { break; }
}

if (leftmax == 500)
{ leftmax = 0; }
if (rightmax == 500)
{ rightmax = 1000; }
if (k > 0)
{ break; }

if (leftmax == 0 && rightmax == 1000)                  // no trees found on either side
{
    k = k + 1;
    if (j == 250)
    { turn = j - 245; }
}
else if (i == m)










// Appendix excerpt, reconstructed from a garbled extraction. Brace placement and the
// condition marked "assumed" are editorial reconstructions. y_dist[] holds the frontal
// distance (cm) per scan angle and obstgrnd the ground distance measured at start-up.

// initialize ground distance to ladar (initialization body omitted in this excerpt)
if (obstgrnd == 0)
{
}

for (int i = 220; i > 140; i--)                        // scan the central angular range
{
    if (y_dist[i] < obstgrnd - 50)                     // condition assumed: large drop below ground
    {
        // infer found obstacle
        obstacle_sighted = true;

        // Check if the inference is true:
        // check if 5 successive points agree with the inference
        for (int k = 1; k <= 5; k++)
        {
            if (y_dist[i - k] > obstgrnd - 50)
            {
                obstacle_sighted = false;              // disagree
                break;
            }
        }

        // If obstacle found, stop checking left
        if (obstacle_sighted == true)                  // decided
        {
            obstacleflag = true;
            estr4[0] = 'k';                            // start of transmission data
            Gator.Write((BYTE *)estr4, 1);
            estr4[0] = 0;
            Gator.Write((BYTE *)estr4, 1);
            estr4[0] = 204;
            Gator.Write((BYTE *)estr4, 1);
            estr4[0] = 'm';                            // end of transmission data
            Gator.Write((BYTE *)estr4, 1);
            break;
        }
    }
}
































Figure 3-48. Thresholding. A) Shadow image. B) Non shadow image. C) Thresholded shadow image. D) Thresholded non-shadow image.

Figure 3-48 shows how a shadow image and a non-shadow image are segmented using a low threshold. After thresholding, the number of classified pixels and their locations in the image are calculated. If the assumption that it is a shadow image was true, many pixels are classified as trees in one half of the image, corresponding to the shadow image profile. In this way, trees on one side have been found. The algorithm then proceeds as with the hay bales, but finds trees only on one side, and the vision algorithm tries to keep the vehicle at a fixed distance of half the aisle width from the side that has been identified as trees.

If the assumption was not true, then not many pixels are classified as trees on one side.

Then it is concluded that there is not much shadow in the image and a new threshold value,

higher than the previous value, is used. Now the trees on the two sides are found leaving out the

path (Figure 3-49). The algorithm follows the rest of the procedure as with hay bales. The

algorithm finds the path center between the trees on either side.
































Figure 3-32. Simulated step response with the initial tuning parameters (10 mph, Kp = 1.08, Kd = 1.25)

* The parameters were further tuned and an integral gain was added to remove the steady state error.

* The controller was tuned for zero steady state error and low overshoot. A large settling time of the order of 20 sec was acceptable for the vehicle because, if the controller were tuned for a small settling time, the steering became very sensitive to very small errors and caused the wheels to chatter.

The optimum PID gains were found to be:

Proportional Gain, Kp = 1

Derivative Gain, Kd = 4

Integral Gain, Ki = 0.09

When the control system was implemented in the tractor, the same parameters determined

for the simulation were used to control the vehicle and no significant further tuning was required.
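A minimal discrete-time sketch of a PID steering law with the gains above is given below; the sample period, output limits and names are illustrative assumptions and do not represent the microcontroller implementation.

// Discrete PID sketch using the gains above (Kp = 1, Kd = 4, Ki = 0.09). Output limits
// are illustrative assumptions.
struct PidState { double integral; double prevError; };

double pidSteer(PidState& s, double errorCm, double dtSec)
{
    const double Kp = 1.0, Kd = 4.0, Ki = 0.09;
    if (dtSec <= 0.0) return 0.0;                     // guard against an invalid sample period
    s.integral += errorCm * dtSec;
    double derivative = (errorCm - s.prevError) / dtSec;
    s.prevError = errorCm;
    double u = Kp * errorCm + Ki * s.integral + Kd * derivative;   // steering command
    if (u > 100.0)  u = 100.0;                        // assumed actuator limits
    if (u < -100.0) u = -100.0;
    return u;
}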

























Figure 3-39. Thresholded image

Cleaning or filtering the image. As observed, the thresholded image has a lot of noise: many isolated pixels on the path are classified as hay bale because of the higher intensity of some regions without grass. This isolated pixel noise needs to be cleaned. For this, a 20 pixel by 20 pixel window is created and moved through the image. In each window, the number of white pixels is counted. If more than 75% of the pixels falling inside the window are classified as white, then the window is determined to fall on a hay bale and the remaining black pixels in that window are changed to white. If less than 75% of the pixels falling in the window are white, then the pixels in the window are determined to be path and all the white pixels are changed to black. This process is performed for the entire image. A cleaned image is shown below; the cleaning process has removed the noise and formed clearly defined hay bale objects and path.












Figure 3-40. Cleaned image
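The 20 pixel by 20 pixel cleaning pass described above might be sketched as follows; the buffer layout and names are illustrative assumptions.

// Sketch of the block-cleaning pass: each 20 x 20 window is forced to all-white (hay
// bale) or all-black (path) depending on whether more than 75% of its pixels are white.
void cleanMask(unsigned char* mask, int width, int height)
{
    const int win = 20;
    for (int y0 = 0; y0 + win <= height; y0 += win)
        for (int x0 = 0; x0 + win <= width; x0 += win)
        {
            int whiteCount = 0;
            for (int y = y0; y < y0 + win; ++y)
                for (int x = x0; x < x0 + win; ++x)
                    if (mask[y * width + x] == 255) ++whiteCount;

            unsigned char fill = (whiteCount > (win * win * 3) / 4) ? 255 : 0;
            for (int y = y0; y < y0 + win; ++y)
                for (int x = x0; x < x0 + win; ++x)
                    mask[y * width + x] = fill;
        }
}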









and the information from the sensors was collected for 30 seconds. For the camera, the vision

algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error and the heading

angle determined was recorded. For the ladar, the ladar path segmenting algorithm (Subramanian

et al., 2006) was run for 30 seconds and the path error measurements were recorded. For the

IMU, the sensor was run for 30 seconds and the heading angle measured by the sensor was

recorded. For the speed sensor, the vehicle was run at 3 different constant speeds and the speed

sensor measurement was collected for 30 seconds. The measurement noise covariance matrix,

denoted by R, was determined from these recorded measurements and was found to be

R = diag( σ²xc, σ²xL, σ²θc, σ²θIMU, σ²v ) = diag( 1.07, 0.15, 0.0017, 0.0001, σ²v )



where the variance of the error in lateral position of the vehicle measured by the vision algorithm, denoted by σ²xc, was determined from the vision algorithm measurements; the variance of the error in lateral position measured by the ladar, denoted by σ²xL, was determined from the ladar algorithm measurements; the variance of the error in heading, denoted by σ²θc, was determined from the vision algorithm; the variance of the error in heading from the IMU, denoted by σ²θIMU, was determined from the IMU measurements; and the variance of the error in velocity, denoted by σ²v, was determined from the speed sensor measurements. xc and θc are measurements from the same camera; however, the vision algorithm uses different regions of the image to estimate them. xc, xL, θIMU and v are measurements from different sensors operating independently of each other. The measurements were therefore assumed to be statistically independent of each other, and hence the off-diagonal covariance values are zero. The












Figure 3-78. 3-dimensional ladar data proj ected on the horizontal plane

It should be noted that multiple points at a location with different z values are overlapped in Figure 3-78. The data space consists of 1 cm × 1 cm bins. To locate the trees, a clustering algorithm is used. The clustering algorithm is as follows:

* Scan the data space using a 50cm x 50cm window in 50cm increments

* If the window in a location consists of more than 5 data points, note the location of the
window

* The location indicates the presence of a tree or a part of a tree

If two consecutive tree-containing locations differ by more than 250 cm, the difference is due to the presence of the headland. It is then determined that the headland starts just after the last tree location prior to this gap. Based on the location of the trees in the alleyway, the exact location of the headland is determined. The algorithm and clustering ensure that the headland is













were described earlier. The camera is mounted looking down at an angle of 10 degrees with the

horizontal and the ladar is mounted looking down at an angle of 15 degrees with the horizontal.

The camera, ladar and the IMU are interfaced with the 2.4GHz CPU running Windows operating

system. This is the same computer that was used on the tractor. All high level algorithms for

processing guidance sensor information and guidance algorithms are implemented in this

shoebox computer. The speed control and the steering control are implemented in the PC 104

computer described earlier. The steering control system used in the autonomous steering of the

tractor was the PID control, in which lateral position error of the vehicle was given as input to

the controller and steering was suitably controlled to minimize the error. In alleyway navigation,

the specification of the lateral position of the vehicle in the path can be described as error of the

vehicle from the path center. For headland navigation, there is no specified path that the vehicle

must take; therefore it is not practical to specify a path error in all conditions. Also, for the development of classical controllers such as PID, a mathematical model of the system is useful for tuning the controller parameters. For the tractor, this model was experimentally determined; developing a similar model for the eGator would be time consuming. Therefore, a new type of steering controller that does not require an error specification or a mathematical model was explored: Pure Pursuit steering.

Pure Pursuit Steering

The Pure Pursuit steering method has been used for many years in the Carnegie Mellon NavLab autonomous vehicle projects and has been reported to be robust. The method is simple, intuitive and easy to implement. It is analogous to human driving: a driver looks a certain distance ahead of the vehicle and steers such that the vehicle will reach a point at that look-ahead distance. Pure Pursuit is a method for geometrically calculating the arc necessary for getting a vehicle onto a










row, from the ladar, can range from 0.5 m to a maximum of 5 m. The ladar is 0.5 m from the vehicle's end, so a tree limb is not expected to be closer than 0.5 m during normal operation. Distances longer than 5 m correspond to trees located in adjacent rows. Therefore, the filter was implemented such that any data beyond 5 m and any data less than 0.5 m were removed. After filtering, a sample plot of the data in Cartesian coordinates is shown in Figure 7-13.


Figure 7-13. Swept ladar data in Cartesian coordinates

It is observed from the figure that the data corresponding to different trees are clustered

together. It is also observed that only four consecutive trees in front of the vehicle can be

detected with confidence. Although the range of the ladar is 80 m, the trees further away contribute too few data points to be clearly detected.









Exactly reaching the way point is not practical in the physical system. Therefore, based on trial and error during tests, the way point is considered reached when the vehicle comes within 50 cm of it.

Obstacle Detection

The obstacle detection capability should allow the vehicle to detect obstacles such as

humans, animals and location variable obj ects such as vehicles, while navigating open fields. For

the scope of this dissertation, a simple obstacle detection method using ladar was developed. It is

assumed that the way points for GPS based navigation were chosen such that the vehicle would

not be expected to get tree limbs or fixed obj ects such as thin poles in its path.

For the development of the algorithm for detecting obstacles, the first step was to analyze the data from the ladar while navigating the open field. As mentioned earlier, the ladar scans a plane from 0 degrees to 180 degrees and obtains the radial distance to objects in front of the ladar. The radial data are converted to Cartesian coordinates using the geometrical formulas

Frontal distance = radial distance × sin(scan angle) (Eq. 8-8)

Lateral distance = radial distance × cos(scan angle) (Eq. 8-9)

The same formulas were also used in the algorithm for ladar based grove alleyway navigation. Figure 8-4 shows the plot of the frontal distance obtained from the ladar against the scan angle when there is no obstacle in front of the vehicle.

From the chart, it is seen that the closest distance in front of the vehicle, at the 90 degree scan angle, corresponds to the ground directly in front of the vehicle. The ground corresponds to a frontal distance of approximately 6 m from the ladar. For scan angles away from the central angle of 90 degrees, the distances correspond to ground locations away from the center of the vehicle.





















Figure 2-4. Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002)

Wireless/Sensor Networks

Sensor networks are a relatively new development for vehicle positioning. A set of sensors is located at several places along the path, and they communicate with the vehicle's computer to decide the path to be traveled. This requires permanent positioning of sensors at regular intervals in the field. This is similar to using a beacon or landmark, but it is not advisable in an orchard, since the foliage may block the field of view. Wireless LAN has been tried as a method of communication between the vehicle and a base station to get feedback on position accuracy (Nagasaka et al., 2002). Microwave systems can perform the same job effectively (Matsuo et al., 2002), but they require frequent calibration. Wireless LAN could be used more effectively for real time path planning: an operator sitting in the office could send information about the next alleyways to be traversed.

Guidance Using Inertial Sensors

An inertial measurement unit, or IMU, usually consists of six inertial sensors: three linear accelerometers and three rate gyros. These sensors combine to give the vehicle's pitch, roll and yaw. Often a magnetic compass is also included in the system to indicate direction. A geomagnetic direction sensor (GDS) was used as the direction tracker when GPS was not available. GDS has given errors due to the high magnetic field present around the vehicle (Mizushima et al.,








BIOGRAPHICAL SKETCH

Vijay Subramanian graduated high school from Kendriya Vidyalaya in 1998. He received

his Bachelor of Engineering degree in electrical and electronics engineering from Annamalai

University in 2002. He received the Master of Science degree in electrical and computer

engineering in 2004 and the Master of Science degree in agricultural and biological engineering,

specializing in robotics, in 2005, both from the University of Florida.









noise, denoted by u(k), is assumed to be Gaussian with zero mean, with standard deviations estimated experimentally for the different sensors.

Filter gain

The filter gain G is the factor used to minimize the a posteriori error covariance P:

G(k) = P(k+1) H^T ( H P(k+1) H^T + R )^-1 (Eq. 6-4)

where R = measurement noise covariance matrix
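A sketch of the gain computation in Eq. 6-4 is given below using the Eigen linear algebra library; Eigen is an editorial choice for brevity and is not the matrix code used in the dissertation.

// Sketch of Eq. 6-4: P is the a priori error covariance, H the observation matrix and
// R the measurement noise covariance determined below.
#include <Eigen/Dense>

Eigen::MatrixXd kalmanGain(const Eigen::MatrixXd& P,
                           const Eigen::MatrixXd& H,
                           const Eigen::MatrixXd& R)
{
    Eigen::MatrixXd S = H * P * H.transpose() + R;   // innovation covariance
    return P * H.transpose() * S.inverse();          // G(k), Eq. 6-4
}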

To determine R, the vehicle was set stationary in the middle of the path, each sensor

(except the speed sensor) mounted on the vehicle was turned on independently and the

information from the sensors was collected for 30 seconds. For the camera, the vision algorithm

(Subramanian et al., 2006) was run for 30 seconds and the path error and the heading angle

determined was recorded. For the ladar, the ladar path segmenting algorithm (Subramanian et al.,

2006) was run for 30 seconds and the path error measurements were recorded. For the IMU, the

sensor was run for 30 seconds and the heading angle measured by the sensor was recorded. For

the speed sensor, the vehicle was run at 3 different constant speeds and the speed sensor

measurement was collected for 30 seconds. The measurement noise covariance matrix, denoted

by R, was determined from these recorded measurements and was found to be

R = diag( σ²xc, σ²xL, σ²θc, σ²θIMU, σ²v ) = diag( 1.07, 0.15, 0.0017, 0.0001, σ²v )


where the variance of the error in lateral position of the vehicle measured by the vision algorithm, denoted by σ²xc, was determined from the vision algorithm measurements; the variance of the error in lateral position measured by the ladar, denoted by σ²xL, was determined from the ladar algorithm measurements; the variance





* 0.23 kg, 2 inch diameter

* Maximum speed 3000 rpm at high load

* Output: two square waves in quadrature



Figure 3-13. Output from the encoder shows two waves in quadrature

As shown in Figure 3-13, the direction of rotation can be determined by measuring the phase shift between the two square waves, and the pulse count gives the amount of rotation.



















Figure 3-14. Encoder mounted on the tractor axle







































the frontal distance drops to approximately 450 cm, compared with the approximately 6 m ground distance in the previous scan shown above. This large change in the data corresponds to the presence of an obstacle in front of the vehicle; the obstacle intercepts the laser before it reaches the ground, producing a smaller distance (Figure 3-85).


(Frontal distance (cm) plotted against scan angle (deg.).)

Figure 3-84. Ladar scan data with obstacle present in front of the vehicle




Figure 3-85. Illustration of the obstacle detection using ladar


The obstacle detection algorithm is based on the analysis of the data described above. The algorithm is as follows:

* Prior to operation in autonomous mode, when the vehicle is stationary, obtain the ladar scan radial distance corresponding to 90 degrees. This distance is taken as the ground distance from the ladar.












Results and Discussion

GPS Based Navigation

From the experiments, the distance of the center of the vehicle's width from the way point was reported as the error in reaching the way point. The distances were averaged over the three repetitions and over all the way points and reported as the average error for the GPS based navigation experiment. The results are provided in Table 8-1.

Table 8-1. Error (cm) of the vehicle for GPS based navigation experiments

              Way point
Trial      1    2    3    4    5    6    7    8
  1       31   55   39   21   38   18   46   12
  2       45   43   15    6   17   23   54   26
  3       27   34   24   43   40   31   35   29

From the results, it can be seen that the error is distributed among all the way points; the errors cannot be attributed entirely to the navigation of any particular way point or to terrain change near one way point. The average error over all way points was 31.75 cm. As mentioned earlier, the DGPS receiver is accurate to 12 inches, or approximately 30 cm, as specified by the manufacturer. Also, the GPS based navigation algorithm considers a way point as reached if the vehicle is less than 50 cm away from it. With the accuracy of the receiver, errors below 30 cm cannot be expected for all test runs. Considering the accuracy of the DGPS receiver and the allowable error in the algorithm, the average error of 31.75 cm indicates good accuracy. However, it should be noted that the navigation accuracy depends on several factors, such as the number of satellite signals being received by the DGPS receiver, errors in DGPS measurements due to motion of the vehicle, errors due to uneven terrain and











The segmentation algorithm originally used for path determination, and used here for headland navigation, segments the entire image into trees and pathway. When the segmentation algorithm is run on the above images, the resulting images are as shown below.


























Figure 3-70. Segmented images

It is observed from the segmented images that when the vehicle is close to the end of the path, the upper half of the segmented image is predominantly either white or black, depending on whether trees are absent or present in the headland. When the vehicle is in the middle of the alleyway, the upper half of the image is both black and white, without either color being predominant. This difference in the upper half of the image between close-to-headland cases and middle-of-alleyway cases is used in the vision algorithm.

The headland determination vision algorithm uses two fixed boxes, large/green and

small/red as shown in the Figure 3-71.
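A simplified sketch of this kind of headland test, counting the dominant class inside the upper half of the segmented image, is given below; the region used and the dominance thresholds are illustrative assumptions rather than the values used with the boxes of Figure 3-71.

// Simplified headland test: if one class (tree = 255 here) strongly dominates the upper
// half of the segmented image, the row end is assumed to be in view. The 80% / 20%
// thresholds are illustrative assumptions.
bool upperHalfIndicatesHeadland(const unsigned char* mask, int width, int height)
{
    int white = 0, total = 0;
    for (int y = height / 2; y < height; ++y)        // upper half (bottom-left origin)
        for (int x = 0; x < width; ++x, ++total)
            if (mask[y * width + x] == 255) ++white;

    double frac = static_cast<double>(white) / total;
    return (frac > 0.8 || frac < 0.2);               // predominantly one class
}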










Table 7-1. Calibration of radius of curvature of the vehicle with steering angle

                        Steering Angle (Degrees)
                     10      15      20      25      30
Radius (m)   Left    10.6    9.9     7.46    5.8     4.4
             Right   7.9     6.4     5.33    4.6     3.8




Experiments were conducted for steering angles from 10 degrees to 30 degrees at 5 degree intervals. The lowest angle chosen was 10 degrees because smaller angles had much larger radii that caused the vehicle to travel out of the experimental area. The maximum angle of 30 degrees is the physical limit of the vehicle's steering system. Experiments were conducted for both left and right turns. The experimental values are plotted in the chart shown below.


The linear fits to the measured data were y = -0.33x + 14.232 for left turns and y = -0.2x + 9.606 for right turns, where y is the radius of curvature (m) and x is the steering angle (degrees).

Figure 7-3. Calibration curve between radius of curvature of the vehicle and the steering angle
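Inverting these fitted lines gives the steering angle required for a desired radius of curvature. The sketch below uses the left-turn fit R = -0.33x + 14.232 and the right-turn fit R = -0.2x + 9.606 from Figure 7-3 and clamps the result to the 10 to 30 degree range covered by the calibration; the function name and the clamping are illustrative.

    #include <algorithm>

    // Sketch: steering angle (degrees) for a desired turning radius (m),
    // obtained by inverting the linear calibration fits of Figure 7-3.
    double SteeringAngleFromRadius(double radiusM, bool leftTurn)
    {
        // Left:  R = -0.33*angle + 14.232  ->  angle = (14.232 - R) / 0.33
        // Right: R = -0.20*angle +  9.606  ->  angle = ( 9.606 - R) / 0.20
        double angle = leftTurn ? (14.232 - radiusM) / 0.33
                                : ( 9.606 - radiusM) / 0.20;

        // The calibration covers only 10-30 degrees, so clamp to that range.
        return std::min(30.0, std::max(10.0, angle));
    }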










et al., 2005) was used to perform this for the two parameters qx and qθR. The fuzzy logic architecture and membership functions are shown in Figure 6-5.

Figure 6-5. Fuzzy logic for correcting divergence. A) Fuzzy logic architecture: the innovation z'(xC, xL, θC) passes through a fuzzifier, an inference mechanism and a defuzzifier. B) Membership functions for updating Q

The xC, xL and θC components of the innovation vector z' are the inputs to the fuzzy logic system. The range of the inputs is from -15% to 15%. These values were chosen by trial and error through simulation. The linguistic variables were normalized on a scale from 0 to 1. Based on the above argument for correction, a set of if-then rules was developed by experimentation. Each rule received a corresponding normalized membership value from the input membership function for z'. Based on the linguistic variables chosen for each z', the corresponding rules are fired. The rule set is shown in Table 6-2. For example, consider the second row of the table, where z'x is negative and z'θC is positive; this indicates that the error in position has reduced compared to the previous time step but the vehicle is heading in the wrong direction, so the process noise for position qx is zero and the process noise for heading qθR is positive. In the rule set, a single measurement value for position, z'x, is given instead of










presence of trees in the vicinity blocking the signal. Some of the low errors, such as 6 cm, or some of the high errors, such as 54 cm, may be attributed to the direction of motion of the vehicle as it heads toward the subsequent way point, or to the DGPS accuracy at a particular instant of time. Therefore, the individual errors and average error are only an indication of the validity of using the DGPS receiver to navigate the open fields and should not be taken as a measure of the accuracy of the guidance system. The errors also indicate the amount of permissible error that should be allowed for the vehicle to navigate the open fields. While navigating fields between citrus groves, the way points should be chosen to allow sufficient error; if higher accuracy is required, ladar or vision based navigation could be used to augment the guidance performance.

Obstacle Detection

From the experiments conducted for the obstacle detection, it was observed that the

guidance system was able to detect both the obstacles during every test run. Both the obstacles

were detected at a distance of approximately 5m in front of the vehicle. The mounting angle of

the ladar was such that the ladar scan could detect the ground at approximately 6m. Therefore,

when the obstacles crossed the ladar scan prior to the ground, they could be detected. It was also

observed that the vehicle remained stopped until the obstacles were removed. These observations

confirm the ability of the obstacle detection algorithm to detect obstacles of the size of humans or boxes. It is expected that animals such as cows or goats would also be detected because of their size. However, the obstacle detection system is not expected to detect tall grasses or thin poles, due to the algorithm specifications. Extensive tests using various types of obstacles are required before the ability of the system to detect different types of obstacles can be confirmed. As the obstacle detection algorithm is based only on ladar, it is uncertain what kinds of objects the laser beam can reflect off of. A more robust obstacle detection system could include vision in combination with the ladar. However, the addition of vision could cost significant processing time.












Behavior-based control ..............................................................................................44
Optimal control ..........................................................................................................44
Non-linear control ......................................................................................................45
Sensor Fusion Applications in Navigation ................................................................45
Pure Pursuit Steering Control ....................................................................................47
Headland Turning Applications .................................................................................49
Path Planning ..............................................................................................................50
Current Commercial Systems ....................................................................................51

3 EXPERIMENTAL METHODS .................................................................................52

Sensors, Control and Hydraulic Architecture of the Tractor .....................................52
Guidance System Hardware Architecture ..................................................................52
Vehicle ........................................................................................................................53
Autonomous Steering .................................................................................................55
Servo Valve ................................................................................................................58
Guidance Sensor: Camera ..........................................................................................61
Frame Grabber Board .................................................................................................62
Guidance Sensor: Laser Radar ...................................................................................63
Encoder .......................................................................................................................64
Computer ....................................................................................................................66
Microcontroller ...........................................................................................................67
Amplifier Circuit ........................................................................................................68
DGPS Receiver ...........................................................................................................69
Power Supply ..............................................................................................................69
Inertial Measurement Unit .........................................................................................70
Speed Sensor ...............................................................................................................71
Experimental Approaches and Theoretical Development .........................................71
Vehicle Dynamics, Calibration and Steering Control of Tractor ..............................71
Valve calibration .........................................................................................................71
Encoder calibration .....................................................................................................73
Vehicle dynamics ........................................................................................................74
Control system ............................................................................................................79
Vision and Ladar Independent Steering .....................................................................83
Vision for path with hay bales ....................................................................................85
Algorithm flowchart ...................................................................................................91
Vision for citrus grove ................................................................................................92
Laser radar (ladar) for path detection .......................................................................101
Serial communication between PC and microcontroller .........................................104
Operating speed of the guidance system ..................................................................105
Sensor Fusion Based Autonomous Steering ............................................................106
Performance of independent sensor based autonomous steering ............................106
Need for sensor fusion based autonomous steering .................................................107
Kalman filter for autonomous guidance ...................................................................108
Reliability factor of primary guidance sensors in the Kalman filter and fuzzy
    logic sensor supervisor ........................................................................................115
Divergence detection and fuzzy logic correction ....................................................122










1. The ladar data is converted from 3D polar coordinates to 3D rectangular coordinates using
equations 3.41, 3.42 and 3.43 (a sketch of steps 1-3 follows this list).

2. The relevant data is cropped to eliminate unreasonable distances in the x, y, and z directions.

3. To reduce computation time, the data in the 3D Cartesian space is projected onto the 2D
Cartesian plane.

4. The data in 2D is treated similarly to an image to determine the locations of the trees.

5. From the locations of the trees, the headland location is precisely determined.
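A sketch of steps 1 through 3 for a single ladar return is given below. The exact conversion equations (3.41 through 3.43) are not reproduced here; a standard spherical-to-Cartesian conversion is assumed, with the scan angle measured in the ladar plane and the tilt angle taken from the sweep mechanism, and the cropping limits are illustrative.

    #include <cmath>
    #include <vector>

    struct Point2D { double x, y; };

    // Sketch: convert one sweeping-ladar return from polar form to Cartesian
    // coordinates, crop unreasonable distances, and project onto the 2D plane.
    void AddProjectedPoint(double range, double scanAngleRad, double tiltAngleRad,
                           std::vector<Point2D>& groundPlane)
    {
        const double x = range * std::cos(tiltAngleRad) * std::cos(scanAngleRad);
        const double y = range * std::cos(tiltAngleRad) * std::sin(scanAngleRad);
        const double z = range * std::sin(tiltAngleRad);

        // Crop data outside a reasonable working volume (limits illustrative).
        if (std::fabs(x) > 10.0 || y < 0.0 || y > 20.0 || z > 4.0)
            return;

        // Project onto the 2D Cartesian plane by dropping the height coordinate.
        groundPlane.push_back({x, y});
    }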

When the vehicle is at the end of the alleyway, the turning functions are called. These are implemented in two functions, HeadlandU() for the u-turn and Headland3pt() for the switch-back turn. The choice of whether a u-turn or a switch-back turn is performed is specified in the program by the user. The input to the two functions is the yaw angle from the IMU and the output is the steering command to the low level controller, specified as an error in pixel coordinates. The implementation of the turns is based on the algorithm development presented in Chapter 3. The u-turn is implemented as follows:

1. The machine vision algorithm is turned off during the headland turn. Initialization of
variables is performed. The angle at which the vehicle exits the alleyway is saved. The turn
will be performed such that the vehicle turn is referenced to this saved angle.

2. Corrections are then calculated. This can be explained with an example. Note that the IMU
outputs yaw angles with a range of 0 to 360 degrees in the clockwise direction. If the vehicle
exits the alleyway at an angle of, say, 30 degrees and an anti-clockwise u-turn is to be
performed, the final output angle from the IMU after the turn would be 210 degrees, whereas
the mathematical calculation in software would be 30 - 180, or -150 degrees. To avoid this
difference due to the cyclic nature of the data, the appropriate corrections are calculated (see
the sketch after this list).

3. If a left u-turn is required, as soon as the vehicle exits the alleyway, a right turn is made until
the vehicle turns by a specified angle. Note that the vehicle also moves forward while
turning. This turn in the opposite direction is made in order to execute a wide u-turn.

4. Then, the left turn command is given. The yaw angle is continuously monitored. When the
yaw angle is close to 180 degrees with respect to the saved exit angle, control is switched to
the vision and ladar based alleyway navigation functions. The alleyway navigation functions
make the vehicle enter the alleyway and continue navigating the next alleyway.
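The correction of step 2 amounts to computing a wrapped target yaw; the function below is an illustrative reconstruction of that calculation and not the code used on the vehicle. The IMU convention of clockwise-increasing yaw in the range 0 to 360 degrees is taken from the description above.

    #include <cmath>

    // Sketch: target IMU yaw after a 180-degree U-turn, handling the 0-360 degree
    // wrap-around described in step 2. For example, exiting the alleyway at 30
    // degrees and turning anti-clockwise gives a target of 210 degrees rather
    // than the raw value of 30 - 180 = -150 degrees.
    double TargetYawAfterUturn(double exitYawDeg, bool antiClockwise)
    {
        // Yaw increases clockwise, so an anti-clockwise turn reduces the yaw.
        double target = antiClockwise ? exitYawDeg - 180.0 : exitYawDeg + 180.0;

        target = std::fmod(target, 360.0);   // wrap back into [0, 360)
        if (target < 0.0)
            target += 360.0;
        return target;
    }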










Table 6-3. Performance measures obtained from the experiments conducted in test track




Figure 6-10. Path navigation error of the vehicle in the test track. A) 1.8m/s. B) 3.1m/s.

Performance measures obtained from the experiments conducted in the citrus grove alleyway are shown in Table 6-4 and the error plot is shown in Figure 6-11. The average error at the two speeds is less than 10 cm and the maximum error is less than 22 cm for an average










degree of steering angle, and the range +31 degree to +90 degree of the vehicle turn is mapped to a single value of +30 degrees of steering angle (Figure 3-81).
















Figure 3-81. Angle ranges of steering and vehicle turn

From the previous chapters, it is known that the image width used for vision is 400

pixels. To maintain the communication protocol with the PC104 computer algorithm, the range

of -30 degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates.

Pixel value = 400 × (steering angle / 60)                                    (Eq. 3-50)

This pixel value of the required steering is sent to the Pure Pursuit steering control.
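A sketch of this mapping is given below. It assumes that the center of the steering range (0 degrees) corresponds to the center pixel (200) of the 0 to 400 pixel range; Eq. 3-50 expresses the same 60-degree-to-400-pixel scaling, and the offset convention used in the actual protocol may differ.

    // Sketch: map a steering angle in [-30, +30] degrees onto the 0-400 pixel
    // range expected by the PC104 communication protocol.
    int SteeringAngleToPixel(double steeringAngleDeg)
    {
        if (steeringAngleDeg < -30.0) steeringAngleDeg = -30.0;
        if (steeringAngleDeg >  30.0) steeringAngleDeg =  30.0;

        // -30 deg -> pixel 0, 0 deg -> pixel 200, +30 deg -> pixel 400 (assumed).
        return static_cast<int>(400.0 * (steeringAngleDeg + 30.0) / 60.0);
    }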

As described in the algorithm above, when suitable steering is executed, the distance between the vehicle and the way point reduces and the vehicle reaches the way point. It is known that the accuracy of the DGPS receiver is 12 inches. During initial development of the algorithm, the vehicle was considered to have reached the way point when the distance between the vehicle and the way point was zero. It was observed that uncertainties in the accuracy of the receiver, terrain variation and small inaccuracies in steering often result in the vehicle reaching a point very close to the way point, on the order of centimeters away, rather than exactly at it.
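The way point reached test can be sketched as follows, using a local equirectangular approximation for the distance between two latitude/longitude positions and the 50 cm threshold adopted for the algorithm; the function names and the Earth radius constant are illustrative assumptions.

    #include <cmath>

    // Sketch: approximate ground distance (m) between the vehicle position and a
    // way point, both given in decimal degrees; adequate over a few metres.
    double DistanceToWaypointM(double latDeg, double lonDeg,
                               double wpLatDeg, double wpLonDeg)
    {
        const double kEarthRadiusM = 6371000.0;
        const double kDegToRad = 3.14159265358979323846 / 180.0;

        const double dLat = (wpLatDeg - latDeg) * kDegToRad;
        const double dLon = (wpLonDeg - lonDeg) * kDegToRad * std::cos(latDeg * kDegToRad);
        return kEarthRadiusM * std::sqrt(dLat * dLat + dLon * dLon);
    }

    // The way point is considered reached when the vehicle is within 50 cm of it.
    bool WaypointReached(double latDeg, double lonDeg, double wpLatDeg, double wpLonDeg)
    {
        return DistanceToWaypointM(latDeg, lonDeg, wpLatDeg, wpLonDeg) < 0.5;
    }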










the two different values, z'xC and z'xL. This is because the sensor supervisor described earlier picks one of the sensors for use in the filter when the other is unreasonable. In the cases where one sensor is given higher priority, the position measurement of that sensor alone is used in the rule set, and in the cases where both sensors are selected by the supervisor, the position measurement of the ladar alone is used, as it has lower measurement noise. This was done to reduce the complexity of the rule set.

Table 6-2. Fuzzy logic rule set for divergence correction

z'x          z'θC         qx           qθR
Negative     Negative     Zero         Zero
Negative     Positive     Zero         Positive
Positive     Negative     Positive     Zero
Positive     Positive     Positive     Positive

From the rules fired at a given instant, the output membership function is chosen. The membership value is used to find the crisp value of the output. Defuzzification was done using the center of gravity method (Passino and Yurkovich, 1998). The crisp values of the process noise covariance are then updated in the process noise covariance matrix Q. The range of crisp values was chosen by trial and error, by performing simulations and determining which values provide good results.
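A minimal sketch of the rule firing and center-of-gravity defuzzification for the Table 6-2 rule set is given below. The inputs are assumed to be normalized to the range -1 to 1, and the triangular membership functions and crisp output levels are illustrative placeholders rather than the values tuned in the simulations.

    #include <algorithm>

    struct QUpdate { double qx, qThetaR; };

    // Illustrative memberships for normalized inputs in [-1, 1].
    static double Negative(double z) { return std::max(0.0, std::min(1.0, -z)); }
    static double Positive(double z) { return std::max(0.0, std::min(1.0,  z)); }

    // Sketch: fire the four rules of Table 6-2 and defuzzify the process noise
    // updates qx and qThetaR by the center of gravity method.
    QUpdate UpdateProcessNoise(double zPrimeX, double zPrimeTheta)
    {
        const double ZERO = 0.0, POS = 0.5;    // illustrative crisp output levels

        // Rule strengths, rows of Table 6-2: (z'x, z'thetaC) -> (qx, qThetaR)
        const double w1 = std::min(Negative(zPrimeX), Negative(zPrimeTheta)); // Zero,     Zero
        const double w2 = std::min(Negative(zPrimeX), Positive(zPrimeTheta)); // Zero,     Positive
        const double w3 = std::min(Positive(zPrimeX), Negative(zPrimeTheta)); // Positive, Zero
        const double w4 = std::min(Positive(zPrimeX), Positive(zPrimeTheta)); // Positive, Positive

        const double wSum = w1 + w2 + w3 + w4;
        if (wSum <= 0.0)
            return {ZERO, ZERO};

        // Center-of-gravity defuzzification over the fired rules.
        QUpdate q;
        q.qx      = (w1 * ZERO + w2 * ZERO + w3 * POS  + w4 * POS) / wSum;
        q.qThetaR = (w1 * ZERO + w2 * POS  + w3 * ZERO + w4 * POS) / wSum;
        return q;
    }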

For the parameters qθC and qv, a linear relationship was followed such that qθC and qv were increased in proportion to the innovation components z' of θC and v, respectively. The speed of the vehicle obtained from the speed sensor was used to scale the output to the steering control. Higher speed requires more turn and vice versa. The transfer function between the required lateral displacement of the vehicle, to reduce the error to zero as determined from the fusion algorithm, and the steering angle is given by equation (6), and the gains were determined experimentally for speeds of 1.8 m/s, 3.1 m/s and 4.4 m/s (Subramanian et al., 2006).










example, with a forward facing camera, branches closer to the vehicle than the camera's field of view are not accounted for in the vision based guidance, and navigating locations near the end of the row is not feasible with the vision system configuration, as the last trees go out of the field of view of the camera. The ladar mount configuration used in this research, with a wider field of view and close range, is useful for guidance in these situations. However, in a given alleyway, it is not uncommon to find a few trees absent. In such cases, ladar based guidance gives erroneous paths.

These two guidance sensors provide complementary information for guidance. This paper

presents the research further undertaken to fuse the information from these sensors to make the

composite information from them more robust and make the overall guidance system more

reliable. The development of the Kalman filter for fusion is first discussed followed by a

description of the fuzzy logic system to augment the filter. Finally the experiments conducted to

test the performance of the fused guidance system are described.

Machine vision is a useful sensing methodology for guidance. One particular

implementation has been used in this research (Subramanian et al., 2006). The method is easily

adaptable from a custom designed test track to the citrus grove environment. Its long range aids

in faster decision making before a hard turn is to be performed or for faster obstacle detection

and is quite robust to minor variations in the path boundaries. However, it lacks the centimeter-level accuracy of laser radar in detecting the distance to close-range path boundaries or obstacles. The ladar's accuracy is useful in keeping the vehicle accurately positioned in the path. However, that same accuracy makes the ladar information noisy if there are minor variations in the path boundaries. This was a drawback in adapting the ladar based guidance to the citrus grove environment, where the foliage caused large variations in the path boundary data obtained from the ladar.










algorithm starts sending the path error information to the microcontroller as soon as either vision

or ladar algorithm is started up in the dialog box menu. The low level control program in the

microcontroller has to be started using the custom C++ compiler provided with the

microcontroller. As soon as the low level controller is turned on, it starts to poll for error from

the PC and send steer commands to the electro-hydraulic valve. The voltage line between the

microcontroller and the electro-hydraulic valve also has a hardware switch. It should be noted

that the tractor is switched to autonomous mode only when all the required algorithms have been

started up. Stopping the programs is also performed using the dialog box.

PID Based Steering Control

The development of the PID based steering control system was discussed in Chapter 3. The

steering control was implemented in the Tern Microcontroller. The software used was the custom

C++ development environment provided with the microcontroller. The microcontroller was

interfaced with the PC through RS-232 serial communication. The input received by the PID

controller was the lateral position error of the vehicle in the path. The output of the controller is an analog voltage generated by the analog output of the microcontroller board. This voltage in turn is sent to the electro-hydraulic valve through the amplifier. The important sections of this code

are presented in Appendix A. The flow of the program is as follows:

1. The microcontroller is first initialized. Then, initialization is performed for the data, the
decoder, the serial communication from the PC, the analog output card, and the PID control
variables. The steering is also centered initially.

2. An infinite while loop is started. The exit from the while loop is based on receiving serial
command from the PC.

3. The initial decoder value is obtained corresponding to the center of the steering.

4. Serial data is obtained from the high level vision or ladar algorithms running in the PC. The
data is obtained in the form of four consecutive characters along with a start character.
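A minimal sketch of the PID update performed inside this loop is given below; the gains, the voltage limits and the structure names are placeholders for illustration and are not the values used on the Tern microcontroller.

    // Sketch: one PID update of the steering command. The error is the lateral
    // position error received from the PC (pixel coordinates, centered at zero);
    // the output is a voltage command for the electro-hydraulic valve.
    struct PidState { double integral; double prevError; };   // initialize to {0.0, 0.0}

    double PidSteeringVoltage(double errorPixels, double dtSeconds, PidState& s)
    {
        const double Kp = 0.01, Ki = 0.001, Kd = 0.005;      // placeholder gains
        const double vCenter = 2.5, vMin = 0.0, vMax = 5.0;  // placeholder voltage range

        s.integral += errorPixels * dtSeconds;               // dtSeconds > 0 assumed
        const double derivative = (errorPixels - s.prevError) / dtSeconds;
        s.prevError = errorPixels;

        double v = vCenter + Kp * errorPixels + Ki * s.integral + Kd * derivative;
        if (v < vMin) v = vMin;                              // clamp to valve range
        if (v > vMax) v = vMax;
        return v;
    }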





















Figure 6-6. Simulation result of fusion of error obtained from vision and ladar algorithms

Experimental Procedure

Typical citrus grove alleyways are practically straight rows. It was felt that testing the

vehicle on curved paths at different speeds would be a more rigorous test of the guidance

system's performance than directly testing in straight grove alleyways. This test represents the

ability of the guidance system to navigate more difficult conditions than the ones likely to be

encountered in the grove. For this purpose, a flexible wall track was constructed using hay bales

to form an S-shaped test path. This shape provides both straight and curved conditions. The path

width varied from 3 m to 4.5 m and the total path length was approximately 53 m. A gap of approximately 1 m was left between successive hay bales. The gaps mimic missing trees in a citrus grove alleyway. The profile of the hay bale track is shown in Figure 6-7. The hay bales provide a physically measurable barrier, which aids in operating the ladar guidance system, and a color contrast with the grass, which is useful for vision based guidance. In addition, when testing a large tractor, only minor losses would occur if the tractor were to run over the hay bale barrier.









            tree_left_sighted = true;            // candidate tree point found on the left

            // Check if the inference is true:
            // check if 5 successive points agree with the inference
            for (int k = 1; k <= 5; k++)
            {
                if (y_dist[i - k] > grnd - 50)
                {
                    tree_left_sighted = false;   // disagree
                    break;
                }
            }

            // If tree found, stop checking left
            if (tree_left_sighted == true)       // decided
            {
                left_wall_position = i;
                break;
            }
        }
    }

    // Scan for right wall
    grnd = y_dist[180]; tree_right_sighted = false; right_wall_position = 180;

    for (int i = 180; i < 361 - 10; i++)
    {
        // A return noticeably shorter than the ground distance suggests a tree;
        // the -20 threshold mirrors the confirmation check below.
        if (y_dist[i] < grnd - 20)               // seems to have found tree
        {
            tree_right_sighted = true;

            // Check if the inference is true:
            // check if 5 successive points agree with the inference
            for (int k = 1; k <= 5; k++)
            {
                if (y_dist[i + k] > grnd - 20)
                {
                    tree_right_sighted = false;  // disagree
                    break;
                }
            }

            // If tree found, stop checking right








automatic agriculture using autonomous off-road vehicles. The use of robots is more common today than ever before, and robots are no longer used exclusively by the heavy production industries. The agricultural industry is behind other industries in using robots because the jobs involved in agriculture are not straightforward, and many repetitive tasks are not exactly the same every time. In most cases, many factors have to be considered (e.g., the size and color of the fruit to be picked) before the commencement of a task. Robotics is usually associated with the

fruit to be picked) before the commencement of a task. Robotics is usually associated with the

manufacturing industry. However certain similarities between the manufacturing industry and

agriculture make robotics a good option for agriculture. These factors include decreasing labor,

foreign competition, rising costs, hazardous environment for the laborer and the need to invest in

technology to improve productivity. On the other hand, there are quite a few differences that

make robotics difficult to be used in agriculture. These factors include the variability in the

environment, seasonal and changing crops. Robotics is currently being tested in several areas of

agriculture, for example, picking fruits in orchards, spraying crops in greenhouses, animal

milking and shearing, planting and harvesting crops, food packaging, disease detection, mapping,

etc. Fruit picking robots and sheep shearing robots are designed to replace difficult human labor.










Figure 1-1. Agricultural Robots. A) Fruit picking robot (University of Florida). B) Sheep
shearing robot (University of Western Australia).

In the 2001-2002 season, about 6000 ha of Florida's citrus groves were mechanically

harvested (Brown, 2002). The operations performed in a citrus grove include preparing the soil,









        fitter[left][h] = 0;                       // clear left fitting
        start = 0; finish = 0;

        for (int j = MyWidth / 2; j >= 0; j--)     // middle to left end
        {
            u = ((h * MyWidth) + j) * 3;

            // if you see white
            if (start == 0 && *(ProcDdb + u + blu) == 255)
                start = j;

            // if you see black after seeing white
            if (start != 0 && *(ProcDdb + u + blu) == 0 && finish == 0)
            {
                finish = j + 1;
                fitter[left][h] = finish;
                if (left_startx == -1)
                { left_startx = finish; left_starty = h; }
                if (left_endx != -1)
                { left_endx = -1; }
            }
        }

        if (left_starty != -1 && fitter[left][h] == 0 && left_endx == -1)
        { left_endx = fitter[left][h - 1]; left_endy = h - 1; }
    }

    //----Right side wall---------------------------------
    for (int h = MyHeight / 2 - 100; h <= MyHeight - 100; h++)
    {
        fitter[right][h] = MyWidth;                // clear right fitting
        start = 0; finish = MyWidth;

        for (int j = MyWidth / 2; j < MyWidth; j++)   // middle to right end
        {
            u = ((h * MyWidth) + j) * 3;

            // if you see white
            if (start == 0 && *(ProcDdb + u + blu) == 255)
                start = j;

            // if you see black after seeing white
            if (start != 0 && *(ProcDdb + u + blu) == 0 && finish == MyWidth)











8-3  Path taken for translation of the vehicle to the way point ....................................252

8-4  Ladar data when obstacle is not present in front of the vehicle ............................254

8-5  Ladar scan data with obstacle present in front of the vehicle ...............................254

8-6  Illustration of the obstacle detection using ladar ..................................................255

8-7  Flowchart of the obstacle detection algorithm ......................................................25

8-8  Waypoint location and vehicle's path for the GPS based navigation ....................25










the vehicle. For alleyway navigation, the error of the vehicle from the path center is 'q' at the look-ahead distance 'p'. For non-alleyway navigation, 'q' can be the required lateral movement of the vehicle at the look-ahead distance. This value of 'q' is obtained from the machine vision and ladar algorithms for path segmentation. Using these values of 'p' and 'q', R can be calculated from the above equation. R is the required radius of curvature for the vehicle to reach the required point. In the CAN based steering system of the eGator, steering has to be specified as steering angles and not as a radius. Therefore, R has to be converted to a steering angle.
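A sketch of the radius computation is given below, assuming the standard pure pursuit relation R = (p^2 + q^2) / (2q), which matches the equation referred to above; the function name and the handling of a near-zero lateral offset are illustrative.

    #include <cmath>

    // Sketch: required radius of curvature from the look-ahead distance p and the
    // lateral offset q at that distance. The sign of q selects the turn direction;
    // conversion of R to a steering angle then uses the calibration of Table 3-6.
    double PurePursuitRadius(double lookAheadP, double lateralQ)
    {
        if (std::fabs(lateralQ) < 1e-6)
            return 1e9;   // essentially straight ahead: treat as a very large radius
        return (lookAheadP * lookAheadP + lateralQ * lateralQ) / (2.0 * lateralQ);
    }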

A calibration experiment was conducted to find a relationship between the radius of

curvature and the steering angle. In the experiment, the eGator was taken to an open area.

Steering angle was kept constant by sending CAN commands to the steering controller. The

vehicle was moved forward by manually stepping on the accelerator. For a constant steering

angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner

wheel radius at different locations. The diameter of the circle was manually measured using a

tape measure. The steering angle was varied to obtain different radii of curvature. An illustration of the experiment is shown in Figure 3-66 and the experimental results are shown in Table 3-6.











Figure 3-66. Illustration of experiment for steering angle calibration with radius of curvature




































Time update (predict):
1) x(k+1) = A x(k)
2) P(k+1) = A P(k) A^T + Q


of the error in heading, denoted by σθC, was determined from the vision algorithm; the variance of the error in heading, denoted by σθIMU, was determined from the IMU measurements; and the variance of the error in velocity, denoted by σv, was determined from the speed sensor

algorithm uses different regions of the image to estimate them. XC, XL, B IMU and v are

measurements from different sensors operating independent of each other. These measurements

were assumed to be statistically independent. Therefore, the different sensor measurements were

assumed to be statistically independent of each other. Hence the covariance values are zero. The

variance of the velocity measurements was zero. This can be attributed to the low resolution

(1mph) of the speed sensor.

Measurement update:
1) G(k) = P(k+1) H^T (H P(k) H^T + R)^-1
2) x(k+1) = x(k) + G(k) (z(k) - H x(k))
3) P(k+1) = (I - G(k) H) P(k+1)

The filter starts from initial values of x(k), P(k) and Q.

Figure 6-3. Kalman filter operation

Figure 6-3 shows the operation of the Kalman filter. The filter estimates the process state x at some time k+1 and estimates the covariance P of the error in the estimate as shown in the time update block. The filter then obtains the feedback from the measurement z. Using the filter gain

G and the measurement z, it updates the state x and the error covariance P as shown in the

measurement update block. This process is repeated as new measurements come in and the error

in estimation is continuously reduced.









eGator. The Pure Pursuit steering control algorithm was adopted for steering the eGator. The method does not require knowledge of the vehicle dynamics. It was also intuitive to develop.

After navigating the alleyway of a citrus grove, the vehicle should be able to enter

subsequent alleyways by turning at the headlands. This would provide the vehicle with the

ability to completely navigate the grove. Algorithms were developed to turn the vehicle at the

headlands of the grove. The algorithm used machine vision for detecting the approach of the

headland and sweeping ladar to accurately determine the location of the headland. U-turn and

switch-back turning algorithms were developed to turn at the headlands. Experiments were conducted to verify the ability to turn at the headlands. The switch-back turn is capable of turning within a smaller area but took slightly more time to execute. The headland turning ability

allowed the vehicle to completely navigate a grove block.

Often large citrus groves are separated into smaller grove blocks. An autonomous vehicle

capable of moving from one grove block to another would be an added capability. The area

between the grove blocks is expected to be an open field or roadway. A GPS based open field navigation algorithm was developed. The algorithm used latitude and longitude coordinates of way points to navigate the vehicle. In open fields, obstacles are expected; therefore, an obstacle detection algorithm using ladar was developed. The vehicle possesses the ability to stop in the presence of obstacles. Experiments were conducted to verify the ability of the vehicle to navigate open fields using way points and to test the obstacle detection capability. Open field navigation

together with obstacle detection enables the vehicle to traverse the area between one grove block

to another. The vehicle now possesses the ability to completely navigate a citrus grove.









            headlandflag = false;
            pathend = false;
            estr7h[0] = 100;
            Gator.Write((BYTE *)estr7h, 1);
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h, 1);
        }
        else   // regular
        {
            if (yaw > (headlandangle - 150))
            {
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h, 1);
                estr7h[0] = 202;
                Gator.Write((BYTE *)estr7h, 1);
            }
            else
            {
                headlandflag = false;
                pathend = false;
                estr7h[0] = 100;
                Gator.Write((BYTE *)estr7h, 1);
                estr7h[0] = 0;
                Gator.Write((BYTE *)estr7h, 1);
            }
        }   // rev over

        estr6h[0] = 'm';   // end of transmission data
        Gator.Write((BYTE *)estr6h, 1);
    }
    else if (headlandmodel == 2)   // right turn
    {
        estr6h[0] = 'k';   // start of transmission data
        Gator.Write((BYTE *)estr6h, 1);

        // right turn ---------------------------------------------
        // turn right until yaw exceeds the exit yaw by 90 degrees
        if (correction1 != 0 && flag1 == false)   // extremes
        {
            if (delayflag1 == false)
            { Sleep(3000); delayflag1 = true; }
            OutputDebugString("cor rit 90\n");
            if (yaw < (correction1 - 20) || yaw > 270)
            {








As directly observed from the simulations, the reduction of noise also significantly played

a role in the improvement of performance. This is in agreement with what is reported in Han et

al. (2002), that use of Kalman filter reduced the noise and improved tracking performance for a

DGPS based parallel tracking. Kobayashi et al. (1998) report high accuracy by using a fuzzy

logic Kalman filter based fusion method using DGPS, in accordance with the results reported in

this paper. Sasladek and Wong (1999) used fuzzy logic for tuning the Kalman filter with DGPS

receiver as the primary sensor and report that the simulation results showed prevention of

divergence and improved accuracy in position and velocity. Comparatively, this research used

vision and ladar as sensors and prevented divergence using fuzzy logic for tuning. In addition,

fuzzy logic was also used for picking the more accurate sensor and improved performance was

experimentally verified for a citrus grove application. Compared to much of the similar research reported in the literature, this research has significant improvements and differences. Whereas many studies use a DGPS receiver as the primary sensor for their application, this research uses vision and ladar as the primary sensors for guidance as well as for fusion. Secondly, this research uses fuzzy logic for tuning or divergence correction as well as for adaptively choosing which sensor is to be relied upon for fusion at different locations in the path, whereas most previous research uses fuzzy logic exclusively for tuning or divergence detection. This method allows switching between the sensors that are useful in specific areas. Further, previous research in agriculture has mainly been for crop applications, whereas this research is for citrus grove operations.









where Dc is the current distance and de is the change in the distance measured by the

encoder.

Also, when the headland is detected using the machine vision algorithm, the priority of ladar in the sensor fusion process is increased and the priority of vision is reduced to zero. This is because, close to the headland, the image from the camera primarily consists of headland objects, a consequence of the camera's long field of view. The vision algorithm development was based on segmenting trees and path; close to the headlands, this segmentation is not feasible as the alleyway trees are out of the camera's field of view. On the other hand, the ladar, with its short field of view, is able to detect the last few trees in the alleyway and perform the navigation. Therefore, navigation is switched entirely to ladar.
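This hand-over can be sketched as an update of the measurement noise covariance, as shown below; the matrix indices and the covariance values are illustrative assumptions rather than the values used in the actual filter.

    // Sketch: when the vision algorithm flags the headland, make the vision
    // measurement look unreliable to the Kalman filter so that ladar takes over.
    void SwitchPriorityToLadar(double R[5][5], bool headlandDetected)
    {
        const double kUnreliable    = 1.0e6;   // very large noise: measurement effectively ignored
        const double kNominalVision = 1.0;     // placeholder nominal value

        // Assume R[0][0] is the vision lateral-position term of the measurement
        // noise covariance matrix.
        R[0][0] = headlandDetected ? kUnreliable : kNominalVision;
    }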

Precisely establishing the end of the row

As mentioned earlier, the vision based headland detection gives an estimate of where the

headland is located and the dead reckoning allows the vehicle to get close to the headland. A

midsize citrus tree with some leafy canopy has a width of a minimum of 2m. The estimated

distance using vision followed by dead reckoning allows the vehicle to get to a position such that

the last tree is to the side of the vehicle. Once the dead reckoning is completed and the vehicle

has reached the estimated distance near the last tree, the vehicle is moved further forward while

the ladar data is monitored. This progress is continued until the ladar algorithm does not detect

any trees on either side. When this happens, the vehicle is assumed to have crossed the last tree.

This ladar monitoring ensures that the vehicle crosses the last tree of the alleyway.

Drawback of the vision based headland detection

In the vision based headland approach detection algorithm described earlier, the

assumption was that the headland consists either of uniform headland trees or open space. The










    for (int c = 0; c < 5; c++)
        mt.m_Value[r][c] = mH.m_Value[c][r];       // mt = transpose of the measurement matrix H

    // m_temp2 = m_temp1 * mt;  (asymmetric matrix multiplication done element-wise)
    for (int r = 0; r < 5; r++)
        for (int c = 0; c < 5; c++)
        {
            m_temp2.m_Value[r][c] = 0;
            for (int i = 0; i < 3; i++)
                m_temp2.m_Value[r][c] += m_temp1.m_Value[r][i] * mt.m_Value[i][c];
        }

    m_temp2 = m_temp2 + mR;                        // H P H' + R
    m_temp2 = m_temp2 ^ -1;                        // matrix inverse (operator of the matrix class)

    // m_temp3 = mPe * mt;
    for (int r = 0; r < 3; r++)
        for (int c = 0; c < 5; c++)
        {
            m_temp3.m_Value[r][c] = 0;
            for (int i = 0; i < 3; i++)
                m_temp3.m_Value[r][c] += mPe.m_Value[r][i] * mt.m_Value[i][c];
        }

    mK = m_temp3 * m_temp2;                        // Kalman gain

    // X = Xe + K*(Z - H*Xe);
    m_temp4 = mH * mXe;
    m_temp4 = mZ - m_temp4;
    m_temp5 = mK * m_temp4;
    mX = mXe + m_temp5;

    // P = (I - K*H)*Pe;
    m_temp = mK * mH;
    m_temp = mI - m_temp;
    mP = m_temp * mPe;

    ErrorF = -mX.m_Value[0][0];                    // new fusion based error

    // Innovation vector: z' = z - H x  (identifier normalized here)
    m_temp6 = mH * mX;
    mZinnov = mZ - m_temp6;

    // Update process noise
    mQ[0][0] = f_x;
    mQ[2][2] = f_thetaR;









            k = k + 1;
            turn = meanlocationy[m] + 55;
            break;
        }
    }

    // turn
    if (turn == 0)
    { turn = 1; }

    // delete large arrays
    delete[] location;
    delete[] meanlocationx;
    delete[] meanlocationy;
}

// U-turn at headland
// Input: yaw angle
// Output: steering angle in pixel coordinates
void HeadlandU()
{
    // headland algorithm is taking over from vision
    if (MchnVsnAct)
    { MchnVsnAct = false; }

    if (headlandangle == 400)
    {
        OutputDebugString("headland\n");
        headlandangle = yaw;
        correction1 = 0;
        correction2 = 0;
        flag1 = false;
        flag2 = false;
        delayflag = false;

        // correcting headlandangle for anomalies
        // due to the yaw sensor having a range of 0 to 360,
        // causing addition and subtraction problems
        // left turn
        if (headlandmodel == 1 && headlandangle >= 335)
        { correction1 = (headlandangle + 25) - 360; }
        else if (headlandmodel == 1 && headlandangle < 200)
        { correction1 = 360 + (headlandangle - 200); }
        // right turns
        else if (headlandmodel == 2 && headlandangle < 15)
        { correction1 = 360 - (15 - headlandangle); }









            Gator.Write((BYTE *)estr7h, 1);
        }
        else   // u-turn completed
        {      // get out to alleyway mode
            OutputDebugString("cor exit");
            pathend = false;
            headlandflag = false;
            estr7h[0] = 100;
            Gator.Write((BYTE *)estr7h, 1);
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h, 1);
        }
    }
    else   // regular
    {
        //OutputDebugString("regular left\n");
        if (yaw > (headlandangle + 20) && flag1 == false)
        {
            OutputDebugString("l\n");
            flag1 = true;
        }
        else if (flag1 == true && yaw > (headlandangle - 150))
        {
            OutputDebugString("reg left\n");
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h, 1);
            estr7h[0] = 202;
            Gator.Write((BYTE *)estr7h, 1);
        }
        else if (yaw < (headlandangle - 150))   // u-turn completed
        {   // get out to alleyway mode
            OutputDebugString("reg exit");
            headlandflag = false;
            pathend = false;
            estr7h[0] = 100;
            Gator.Write((BYTE *)estr7h, 1);
            estr7h[0] = 0;
            Gator.Write((BYTE *)estr7h, 1);
            flag1 = false;
        }
    }

    estr6h[0] = 'm';   // end of transmission data
    Gator.Write((BYTE *)estr6h, 1);










variance of the velocity measurements was zero. This can be attributed to the low resolution

(1mph) of the speed sensor. Using the above parameters, it is possible to implement the

measurement update equations.

Reliability factor of primary guidance sensors in the Kalman filter and fuzzy logic sensor
supervisor

Reliability of the sensors. A problem in using a simple Kalman filter in our research is

that the reliability of the information from the sensors depends on the type of path in the grove.

As discussed previously, ladar is accurate at short range for the given mounting arrangement and

machine vision is good at providing overall heading of the vehicle. By experimentation in a

citrus grove, the following observations were made: tree foliage is highly variable, trees can be

absent on either or both sides of the path and some trees can be small enough for ladar to not

recognize them as trees. Therefore, if a white noise model is assumed for the measurements, the

reliability factor of a sensor in the Kalman filter has to be constantly updated. In such a variable

path condition, it is not feasible to have constant noise values for the ladar and vision, i.e., a constant measurement noise covariance matrix R. The covariance matrix R tells the Kalman filter how trustworthy the sensor measurements are. If the covariance value of a measurement is low, the filter believes the measurements to be accurate or reliable, and if it is high, the filter believes the measurements to be erroneous. This type of update can be used to inform the filter about the nature of each sensor at different locations in the grove. For example, when the ladar does not recognize trees on either side, guidance is more reliable when the measurement noise for the ladar is increased to a high value, so that vision takes over as the reliable guidance sensor. Similarly, when the vehicle gets to the end of a tree row, vision is not useful anymore, as the citrus trees are absent from the field of view of the camera, and so ladar can be made the sole guidance sensor to cross the last tree by increasing the measurement noise










capability was included. The system requires the obstacle to be moved from the path for the vehicle to continue navigation. In autonomous operation, it cannot be expected that obstacles such as animals will be moved out of the path. Obstacle avoidance ability could be incorporated to navigate around the obstacle. The avoidance algorithm may be required to account for moving obstacles such as moving animals.

The planting patterns of trees differ from grove to grove, and the headlands also differ. For example, one particular grove where the autonomous vehicle was tested had deep pits at the headlands for drainage. The headland turning algorithms developed in this work were not capable of maneuvering around the pits, and the navigation failed. Therefore, all of the varying grove conditions need to be captured to improve the algorithms. Rigorous testing in several groves is also required to ensure reliability.



























Figure 2-2. Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002)

Overhead Guidance

In this method, an overhead guide or a track guides the vehicle using a mechanical link.

Shin et al. (2002) used overhead rails to direct a vehicle through an orchard. The vehicle was

track type and was used for spraying applications.

The vehicle's performance was acceptable on straight and curved paths, but the vehicle over-steered at higher speeds. This system was only feasible at low speeds of about 0.4 m/s. Also, the installation cost of overhead rails is very high. For a grower adopting a new technology, the high cost of constructing large structures is very discouraging. Hence such a system is not suitable for this project.




Figure 2-3. Overhead Guidance (Shin et al., 2002)









Conclusions

A GPS based navigation method was developed to navigate the open fields between grove blocks. The method used way points to navigate the vehicle over short distances. A DGPS receiver was used to collect the way point coordinates. The way points were provided as latitude and longitude coordinates. The coordinates were used to steer the vehicle to each of the way points. Experiments were conducted to test the performance of the open field navigation system. The vehicle was able to reach all of the way points with an average error of 31.75 cm from the way point location. This accuracy was comparable to the accuracy of the DGPS receiver. An obstacle detection algorithm was developed to detect obstacles while navigating open fields. The algorithm used the ladar for detecting obstacles. Tests were conducted to verify the obstacle detection ability. The algorithm was successfully able to detect human and box obstacles. It is also expected to detect obstacles of comparable size, such as cows and similarly sized animals.





AUTONOMOUS VEHICLE GUIDANCE FOR CITRUS GROVE NAVIGATION By VIJAY SUBRAMANIAN A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY UNIVERSITY OF FLORIDA 2008 1

PAGE 2

2008 Vijay Subramanian 2

PAGE 3

To Mankind 3

PAGE 4

ACKNOWLEDGMENTS I thank my advisor and dissertation chair Dr. Thomas Burks for helping me successfully complete all my graduate studies at the University of Florida. His wisdom and advice has helped me sail smoothly through the graduate studies at UF. I am thankful to Dr. Arroyo, Dr. Lee, Dr. Dixon, Dr. Slaton and Dr. Moskow for being a part of my dissertation committee. Their ideas and insights have been valuable. I am grateful to Dr. Arroyo for bringing robotics from the books to my hands through his courses. I thank Greg Pugh for his help in building the electronic boards and in writing the device drivers. But for him, this work would have taken a lot more time to complete. I thank Mike Zingaro for building the pretty mounts on the vehicle and patiently helping collect data in the field during the summer months in Florida. I thank the graduate committee of the Department of Agricultural and Biological Engineering for providing the flexibility in taking courses from other departments. This has been very helpful for me in taking many courses that I like in other departments. I am thankful to the graduate students and alumni of this department for sharing their insights into getting through graduate research. I thank all the people who have influenced this work. 4

PAGE 5

TABLE OF CONTENTS page ACKNOWLEDGMENTS...............................................................................................................4 LIST OF TABLES.........................................................................................................................10 LIST OF FIGURES.......................................................................................................................11 ABSTRACT...................................................................................................................................18 CHAPTER 1 INTRODUCTION..................................................................................................................20 Florida Citrus..........................................................................................................................20 Automation and Robotics in Citrus Groves............................................................................20 Autonomous Vehicles for Citrus Groves................................................................................23 Objectives...............................................................................................................................25 2 LITERATURE REVIEW.......................................................................................................26 Agricultural Autonomous Navigation Applications...............................................................26 Guidance Vehicle............................................................................................................26 Guidance by Remote Control..........................................................................................27 Direct Guidance...............................................................................................................28 Guidance Using Arm.......................................................................................................28 Overhead Guidance.........................................................................................................29 Guidance on Tracks and Leader Cables..........................................................................30 Distance Metering and Triangulation Based Guidance...................................................30 Guidance by Dead Reckoning.........................................................................................31 Guidance Using Ultrasonic Sensors................................................................................31 Wireless/Sensor Networks...............................................................................................32 Guidance Using Inertial Sensors.....................................................................................32 GPS Based Guidance Systems........................................................................................33 Machine Vision for Autonomous Navigation.........................................................................34 Laser Radar (Ladar) for Autonomous Navigation..................................................................37 Intelligence for Autonomous Navigation...............................................................................38 Vehicle Dynamics Modeling and Steering Control................................................................39 Vehicle Modeling............................................................................................................39 Steering 
Control...............................................................................................................41 PLC control..............................................................................................................42 ON/OFF control.......................................................................................................42 PID control...............................................................................................................42 Adaptive control.......................................................................................................43 Fuzzy logic...............................................................................................................43 Neural networks.......................................................................................................44 5

PAGE 6

Behavior-based control............................................................................................44 Optimal control........................................................................................................44 Non-linear control....................................................................................................45 Sensor Fusion Applications in Navigation.............................................................................45 Pure Pursuit Steering Control.................................................................................................47 Headland Turning Applications..............................................................................................49 Path Planning..........................................................................................................................50 Current Commercial Systems.................................................................................................51 3 EXPERIMENTAL METHODS.............................................................................................52 Sensors, Control and Hydraulic Architecture of the Tractor..................................................52 Guidance System Hardware Architecture.......................................................................52 Vehicle.............................................................................................................................53 Autonomous Steering......................................................................................................55 Servo Valve.....................................................................................................................58 Guidance Sensor: Camera...............................................................................................61 Frame Grabber Board......................................................................................................62 Guidance Sensor: Laser Radar........................................................................................63 Encoder............................................................................................................................64 Computer.........................................................................................................................66 Microcontroller................................................................................................................67 Amplifier Circuit.............................................................................................................68 DGPS Receiver................................................................................................................69 Power Supply...................................................................................................................69 Inertial Measurement Unit...............................................................................................70 Speed Sensor...................................................................................................................71 Experimental Approaches and Theoretical Development......................................................71 Vehicle Dynamics, Calibration and Steering Control of Tractor....................................71 Valve calibration......................................................................................................71 Encoder 
calibration..................................................................................................73 Vehicle dynamics.....................................................................................................74 Control system..........................................................................................................79 Vision and Ladar Independent Steering..........................................................................83 Vision for path with hay bales..................................................................................85 Algorithm flowchart.................................................................................................91 Vision for citrus grove..............................................................................................92 Laser radar (ladar) for path detection.....................................................................101 Serial communication between PC and microcontroller........................................104 Operating speed of the guidance system................................................................105 Sensor Fusion Based Autonomous Steering..................................................................106 Performance of independent sensor based autonomous steering...........................106 Need for sensor fusion based autonomous steering...............................................107 Kalman filter for autonomous guidance.................................................................108 Reliability factor of primary guidance sensors in the Kalman filter and fuzzy logic sensor supervisor.......................................................................................115 Divergence detection and fuzzy logic correction...................................................122 6

PAGE 7

Sample simulation result of the sensor fusion method...........................................126 Headland Turning and Navigation................................................................................128 Pure pursuit steering...............................................................................................129 Headland turning maneuvers..................................................................................134 Open Field Navigation and Obstacle Detection............................................................147 Open field navigation.............................................................................................148 Obstacle detection..................................................................................................154 4 SOFTWARE DEVELOPMENT..........................................................................................159 Introduction...........................................................................................................................159 Software Architecture for Tractor Guidance........................................................................159 PID Based Steering Control..................................................................................................161 Vision and Ladar Based Path Segmentation.........................................................................162 Sensor Fusion Based Navigation..........................................................................................164 Software Architecture for eGator Guidance.........................................................................165 Pure Pursuit Steering............................................................................................................167 Headland Turning.................................................................................................................167 Open Field Navigation and Obstacle Detection...................................................................169 Open Field Navigation..................................................................................................169 Obstacle Detection.........................................................................................................170 5 DEVELOPMENT OF MACHINE VISION AND LASER RADAR BASED AUTONOMOUS VEHICLE GUIDANCE SYSTEMS FOR CITRUS GROVE NAVIGATION.....................................................................................................................171 Introduction...........................................................................................................................171 Materials and Methods.........................................................................................................174 Experimental Procedure........................................................................................................183 Results and Discussion.........................................................................................................186 Conclusions...........................................................................................................................191 6 SENSOR FUSION USING FUZZY LOGIC ENHANCED KALMAN FILTER FOR AUTONOMOUS VEHICLE GUIDANCE..........................................................................192 Introduction...........................................................................................................................192 Materials and 
Methods.........................................................................................................196 Kalman Filter.................................................................................................................198 State transition model.............................................................................................198 Measurement model...............................................................................................200 Filter gain...............................................................................................................201 Reliability Factor of Primary Guidance Sensors in the Kalman Filter and Fuzzy Logic Sensor Supervisor............................................................................................203 Divergence Detection and Fuzzy Logic Correction......................................................205 Simulation......................................................................................................................208 Experimental Procedure........................................................................................................209 Results and Discussion.........................................................................................................212 7


Conclusions...........................................................................................................................216 7 HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR.................................................................................................................................218 Introduction...........................................................................................................................218 Materials and Methods.........................................................................................................218 Pure Pursuit Steering.....................................................................................................219 Headland Turning Maneuvers.......................................................................................224 Determining the approach of the headland.............................................................224 Precisely establishing the end of the row...............................................................228 Drawback of the vision based headland detection.................................................228 Precisely establishing end of the row using sweeping ladar..................................229 Vision and ladar based headland determination.....................................................235 Navigating the headland and entering next row.....................................................236 Experimental Procedures......................................................................................................238 Experiments in the Test Track.......................................................................................238 Experiments in the Citrus Grove...................................................................................240 Results and Discussion.........................................................................................................242 Conclusions...........................................................................................................................245 8 OPEN FIELD NAVIGATION AND OBSTACLE DETECTION......................................246 Introduction...........................................................................................................................246 Materials and Methods.........................................................................................................247 Open Field Navigation..................................................................................................247 Obstacle Detection.........................................................................................................253 Experimental Methods..........................................................................................................257 GPS Based Navigation..................................................................................................257 Obstacle Detection.........................................................................................................259 Results and Discussion.........................................................................................................260 GPS Based Navigation..................................................................................................260 Obstacle Detection.........................................................................................................261 
Conclusions...........................................................................................................................262 9 RESEARCH SYNTHESIS AND OVERVIEW...................................................................263 10 CONCLUSION AND FUTURE WORK.............................................................................267 Conclusions...........................................................................................................................267 Future Work..........................................................................................................................268 APPENDIX A CODE SEGMENT FOR PID BASED STEERING CONTROL.........................................270 B CODE SEGMENT FOR VISION BASED PATH SEGMENTATION..............................276 8


C CODE SEGMENT FOR LADAR BASED PATH NAVIGATION....................................284 D CODE SEGMENT FOR SENSOR FUSION BASED ALGORITHMS..............................287 E CODE SEGMENT FOR PURE PURSUIT STEERING......................................................290 F CODE SEGMENT FOR HEADLAND TURNING.............................................................291 G CODE SEGMENT FOR OPEN FIELD NAVIGATION AND OBSTACLE DETECTION........................................................................................................................304 LIST OF REFERENCES.............................................................................................................309 BIOGRAPHICAL SKETCH.......................................................................................................314 9


LIST OF TABLES Table page 3-1 Specifications of the electro-hydraulic valve.....................................................................59 3-2 Open loop frequency response tests at various speeds and frequencies............................75 3-3 Performance criteria of the controller (Simulated results).................................................83 3-4 Fuzzy logic rule set for sensor supervisor........................................................................119 3-5 Fuzzy logic rule set for divergence correction.................................................................125 3-6 Calibration of radius of curvature of vehicle with steering angle....................................132 5-1 Performance measures of the vehicles guidance system obtained from the experiments conducted in the straight test path...............................................................187 5-2 Performance measures of the vehicles guidance system obtained from the experiments conducted in the curved test path at 3.1m/s.................................................190 6-1 Fuzzy logic rule set for sensor supervisor........................................................................205 6-2 Fuzzy logic rule set for divergence correction.................................................................207 6-4 Performance measures obtained from the experiments conducted in grove alleyway....215 7-1 Calibration of radius of curvature of vehicle with steering angle....................................222 7-2 Performance measures of the eGator based guidance system for path navigation..........242 7-3 U-turn performance measures..........................................................................................244 7-4 Switch back turn performance measures.........................................................................244 8-1 Error of the vehicle for GPS based navigation experiments............................................260 10


LIST OF FIGURES Figure page 1-1 Agricultural Robots............................................................................................................22 1-2 Typical citrus grove alleyway............................................................................................24 2-1 Remote controlled vehicle (Murakami et al., 2004)..........................................................27 2-2 Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002)............29 2-3 Overhead Guidance (Shin et al., 2002)..............................................................................29 2-4 Greenhouse sprayer with ultrasonic sensor (Singh and Burks. 2002)...............................32 2-5 Machine vision guidance example (Benson et al., 2001)..................................................35 2-6 Machine vision guidance example (Rovira-Mas et al., 2002)...........................................36 2-7 Ladar Mounted on top of the tractor (Yokota et al., 2004)................................................38 2-8 UAV using pure pursuit (Enomoto et al., 2007)................................................................48 2-9 Switch back turning implemented using spline functions (Noguchi et al., 2001).............50 3-1 Architecture of the tractor guidance system......................................................................53 3-2 John Deere 6410 tractor used for autonomous vehicle guidance development.................54 3-3 Hydrostatic steering mechanism of the tractor..................................................................55 3-4 Electro-hydraulic circuit for autonomous steering............................................................57 3-5 Rear end of tractor showing the supply to the servo valve................................................57 3-6 View under the cab showing the servo valve circuit.........................................................58 3-7 Servo valve showing the main parts (Source: Sauer Danfoss PVG 32 valve documentation)..................................................................................................................60 3-8 Video Camera, one of the primary path finding sensors...................................................61 3-9 Camera mounted on top of the tractor cab.........................................................................62 3-10 Frame grabber board..........................................................................................................63 3-11 Laser radar and its mount...................................................................................................64 11


3-12 Stegmann HD20 Encoder..................................................................................................64 3-13 Output from the encoder shows two waves in quadrature.................................................65 3-14 Encoder mounted on the tractor axle.................................................................................65 3-15 Instruments mounted in the tractor cabin...........................................................................66 3-16 Microcontroller..................................................................................................................67 3-17 Amplifier Circuit................................................................................................................68 3-18 DGPS receiver mounted on top of the tractor cab.............................................................69 3-19 Inverter mounted in the tractor cabin.................................................................................70 3-20 Inertial Measurement Unit used in this research................................................................70 3-21 Voltage and time required to turn 100 degrees..................................................................72 3-22 Encoder calibration............................................................................................................73 3-23 Vehicle position plotted in ArcView (0.2Hz, 10mph).......................................................75 3-24 Measuring gain from the position plot...............................................................................76 3-25 Open loop frequency response for 4 mph, 7 mph and 10 mph..........................................76 3-26 Theoretical model for the three speeds..............................................................................78 3-27 Simulink model for the vehicle dynamics.........................................................................78 3-28 Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz)...............79 3-29 Guidance control block diagram........................................................................................79 3-30 Simulink model of the vehicle control system...................................................................80 3-31 Simulated sustained oscillation with a gain of 1.8 for the 10 mph model.........................81 3-32 Simulated step response with the initial tuning parameters (10mph, Kp=1.08, Kd = 1.25)...................................................................................................................................82 3-33 Simulated step response with the final tuned parameters (10mph)...................................83 3-34 Camera and Ladar mounted on top of the tractor cabin.....................................................84 3-35 Typical images seen while navigating through hay bales..................................................85 3-36 Image Coordinates.............................................................................................................86 12


3-37 R, G, B color plots of hay bales (X axis: Intensity-Range: 0-255, Y axis: no. of pixels).................................................................................................................................87 3-38 R, G, B color plots of grass path (X axis: Intensity-Range: 0-255, Y axis: no. of pixels).................................................................................................................................87 3-39 Thresholded image.............................................................................................................88 3-40 Cleaned image....................................................................................................................88 3-41 Boundary Isolated image...................................................................................................89 3-42 Lines fit to the boundaries..................................................................................................90 3-43 Machine vision algorithm flowchart..................................................................................92 3-44 Typical images for navigation in citrus grove...................................................................93 3-45 Picture collection in the grove...........................................................................................94 3-46 R, G ,B intensity plot for ground and trees in grove alleyway..........................................95 3-47 Images and their green intensity profile across the center of the image............................96 3-48 Thresholding. A)Shadow image........................................................................................97 3-49 Classified images without and with shadows....................................................................98 3-50 Machine vision algorithm flowchart................................................................................100 3-51 Calculating the required heading of the vehicle..............................................................101 3-52 Radial distance plotted against angle (180 degree at 0.5 degree increments).................102 3-53 Ladar algorithm flowchart...............................................................................................104 3-54 Serial Protocol..................................................................................................................105 3-55 Illustration of Kalman filter operation.............................................................................108 3-56 Vehicle in the path with the state vector variables..........................................................110 3-57 Fuzzy logic implementation.............................................................................................116 3-58 Membership function for linguistic variables..................................................................117 3-59 Crisp output membership function...................................................................................120 3-60 Illustration of inference for the example..........................................................................121 13


3-61 Fuzzy logic implementation to correct divergence..........................................................124 3-62 Input membership function..............................................................................................124 3-63 Output membership function...........................................................................................126 3-64 Simulation result of fusion of error obtained from vision and ladar algorithms.............127 3-65 Pure Pursuit steering method...........................................................................................130 3-66 Illustration of experiment for steering angle calibration with radius of curvature..........131 3-67 Line fitting for calibrating radius of curvature of vehicle with steering angle................132 3-68 Flowchart of Pure Pursuit steering algorithm..................................................................133 3-69 Image obtained from the camera when the vehicle is navigating the alleyway...............134 3-70 Segmented images...........................................................................................................135 3-71 Vision algorithm example for headland detection...........................................................136 3-72 Use of boxes in the segmented images............................................................................137 3-73 Image where the vision based headland detection can fail..............................................139 3-74 Ladar and motor assembly for sweeping.........................................................................140 3-75 Illustration of the ladar sweep..........................................................................................140 3-77 Swept ladar data in Cartesian coordinates.......................................................................143 3-78 3-dimensional ladar data projected on the horizontal plane............................................144 3-79 Tree clustering and headland determination....................................................................145 3-80 Illustration of vehicle navigating way points...................................................................150 3-81 Angle ranges of steering and vehicle turn........................................................................153 3-82 Path taken for translation of the vehicle to the way point................................................154 3-83 Ladar data when obstacle is not present in front of the vehicle.......................................155 3-84 Ladar scan data with obstacle present in front of the vehicle..........................................156 3-85 Illustration of the obstacle detection using ladar.............................................................156 3-86 Flowchart of the obstacle detection algorithm.................................................................158 14


4-1 Software architecture of the tractor guidance system......................................................160 4-2 Dialog box for high level controls...................................................................................160 4-3 Software architecture of the eGator guidance system......................................................166 5-1 Electro-hydraulic retrofit of the tractor for automatic guidance......................................175 5-2 Guidance system architecture of the vehicle...................................................................176 5-3 Camera and ladar mounted on top of the tractor cab.......................................................176 5-4 Machine vision algorithm flowchart................................................................................178 5-5 Machine vision results for citrus grove alleyway............................................................179 5-6 Ladar algorithm flowchart...............................................................................................180 5-7 Radial distance measured by the laser radar in the hay bale path....................................181 5-8 Open loop theoretical and actual frequency response of the sinusoidal vehicle dynamics test (Phase = -180 deg)....................................................................................182 5-9 Simulated vehicle control system block diagram............................................................183 5-10 Simulated 1m step response of the vehicle at 3.3 m/s.....................................................183 5-11 Guidance system test path................................................................................................184 5-12 Vehicle in the citrus grove alleyway................................................................................186 5-13 Performance of the machine vision guidance in the straight path...................................188 5-14 Performance of the laser radar guidance in the straight path...........................................189 5-15 Performance in the curved path at 3.1 m/s.......................................................................190 6-1 Fusion based guidance system architecture.....................................................................197 6-2 Vehicle in the path with the state vector variables..........................................................198 6-3 Kalman filter operation....................................................................................................202 6-4 Fuzzy logic for sensor supervisor....................................................................................204 6-5 Fuzzy logic for correcting divergence.............................................................................206 6-6 Simulation result of fusion of error obtained from vision and ladar algorithms.............209 6-7 Hay bale track profile.......................................................................................................210 15


6-8 Citrus grove alleyways.....................................................................................................211 6-9 Field of View (FOV) of camera and ladar.......................................................................211 6-10 Path navigation error of the vehicle in the test track.......................................................214 6-11 Path navigation error of the vehicle in the grove alleyway.............................................216 7-1 Pure Pursuit steering method...........................................................................................220 7-2 Illustration of experiment for steering angle calibration with radius of curvature..........221 7-3 Calibration curve between radiuses of curvature of vehicle with the steering angle.......222 7-4 Flowchart of Pure Pursuit steering algorithm..................................................................223 7-5 Images obtained from the camera when the vehicle is navigating the alleyway.............224 7-6 Segmented images...........................................................................................................225 7-7 Vision algorithm example for headland detection...........................................................226 7-8 Use of boxes in the segmented images............................................................................227 7-9 Sample image where the vision based headland detection can fail.................................229 7-10 Ladar and motor assembly for sweeping.........................................................................230 7-11 Illustration of the ladar sweep..........................................................................................231 7-12 Coordinate systems..........................................................................................................232 7-13 Swept ladar data in Cartesian coordinates.......................................................................233 7-14 3-dimensional ladar data projected on the horizontal plane............................................234 7-15 Tree clustering and headland determination....................................................................235 7-16 Vehicle in the test track....................................................................................................238 7-18 Alleyway of the citrus grove where experiments were conducted..................................240 7-19 Headland turning maneuvers in the grove.......................................................................241 7-20 Headland turning maneuver performance measures........................................................243 8-1 Illustration of vehicle navigating way points...................................................................249 8-2 Angle ranges of steering and vehicle turn........................................................................251 16


8-3 Path taken for translation of the vehicle to the way point................................................252 8-4 Ladar data when obstacle is not present in front of the vehicle.......................................254 8-5 Ladar scan data with obstacle present in front of the vehicle..........................................254 8-6 Illustration of the obstacle detection using ladar.............................................................255 8-7 Flowchart of the obstacle detection algorithm.................................................................256 8-8 Waypoint location and vehicles path for the GPS based navigation..............................257 17


Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy AUTONOMOUS VEHICLE GUIDANCE FOR CITRUS GROVE NAVIGATION By Vijay Subramanian August 2008 Chair: Thomas F Burks Major: Agricultural and Biological Engineering Automation and robotics is becoming an important part of agricultural operations in the 21st century. Exploration is currently under way to use robotics in citrus groves. An important part of robotics in citrus groves is the use of autonomous vehicles. The development of guidance systems for an autonomous vehicle operating in citrus groves is presented in this dissertation. The guidance system development is based on two different vehicles, a commercial Tractor and a golf cart type vehicle, eGator. Initial developments were based on the Tractor and the later developments were based on the eGator. The guidance system development process consisted of choosing sensors for implementing the guidance system, the mounting of the sensors and developing hardware and software interface for the sensors with the computer, the development of algorithms for processing the information from the sensors to determine the path for the vehicle, the development of steering control system to steer the vehicle in the desired path; the development of the software for interfacing the sensors and implementing the algorithms, and conducting experiments to validate the operation of the autonomous vehicle in citrus groves. The primary sensors used for determining the path for the vehicle were, video camera and laser radar. The important methods discussed in this dissertation are the determination of the 18


vehicle dynamics of the tractor; the development of PID based steering control system for steering the tractor machine vision, machine vision and laser radar based path segmentation algorithms for navigating the vehicle through grove alleyways, sensor fusion based path determination by fusing the information from machine vision and laser radar, Pure Pursuit method of steering the eGator, algorithms for turning the vehicle at the headlands of the citrus grove using sweeping ladar, GPS based navigation of open fields between grove blocks and laser radar based obstacle detection. The ability of the autonomous vehicle to navigate the paths was experimentally verified in custom designed test tracks and in citrus groves. The development of the autonomous vehicle guidance system showed machine vision and ladar based systems as promising and accurate methods for navigating citrus groves. The autonomous vehicle possesses the ability to navigate the citrus grove alleyways in the middle of the path, turn around at the headlands and navigate subsequent alleyways, navigate open fields between grove blocks, and detect and stop in the presence of obstacles. 19


CHAPTER 1
INTRODUCTION
This dissertation is about the development of an autonomous vehicle guidance system and methods for navigating citrus groves. The need for automation in citrus groves is first introduced. The objectives of this work in addressing that need are then specified. A brief review of related research by others is given. The details of the autonomous vehicle guidance system and algorithm development are discussed. Finally, the results of the research are presented and the areas of further improvement are indicated.
Florida Citrus
Citrus is a genus of flowering plants. In everyday usage, it includes orange, tangerine, grapefruit, lime, lemon and their many varieties. Florida, where this work was done, supplies more than 80% of the citrus in the USA and is today a $9 billion industry (Hodges et al., 2001). There are other major citrus producers in the world, such as Brazil, Mexico, Israel and Italy. For all these producers, there are large numbers of people in the world who consume citrus; the market for citrus is therefore huge. Citrus area per farm in Florida was around 0.5 sq. km in 1997 (Hodges et al., 2001), and citrus is currently the state's largest agricultural product. Florida is also the world's leading producer of grapefruit and the second largest producer of oranges in the world, following Brazil. Ninety-five percent of all Florida oranges are squeezed into orange juice (Brown, 2002). The total economic impact of the citrus industry on the state of Florida exceeds $8 billion annually.
Automation and Robotics in Citrus Groves
Over the years, large scale farms have been replacing smaller ones. More people are moving to other occupations, so one farm owner replaces several owners. This leads to larger fields to plant and harvest. Laborers often work for several weeks on a farm, so facilities have to be provided for them to live near the orchard. This requires a considerable investment on the part of the farm owner. In Florida, non-U.S. citizens are often employed for harvesting citrus, many of whom face immigration problems. Therefore labor is scarce and decreasing. If the trend continues, fewer people will be available to operate in the field. With large farms replacing several smaller ones, the work that is required of the operator is also increased in scale. Longer duration of operation is necessary to cover huge orchards, and with longer duration of operation the operator's workload grows and consistency in precision is sacrificed. The amount of work per person is also expected to increase. It is also known that long working hours can be detrimental to the laborer's health. For applications such as spraying, there is a risk of the operator being exposed to the sprayed chemicals, which would affect the operator's health. Therefore there is a great need for automation of agricultural practices. A general definition of automation, according to the Oxford English Dictionary, is the use of electronic or mechanical devices to replace human labor. For those wary of automation, it should be quickly pointed out that automation is not intended to render people unemployed. Automation is mostly for doing work that is not enjoyable, is dangerous, or is difficult for humans; for example, the automatic processing and packaging machines that operate in large factories. There are as many varying definitions of robotics as there are researchers in the field. A reasonable definition would be an automation possessing enough intelligence to change its behavior in accordance with some level of change in its environment. The level of change is generally not defined. Examples range from the well known walking humanoids to the Mars rovers. Today robotics technology provides intelligent systems that have the potential to be used in many agricultural applications. Robotic systems are precise and can be used repeatedly. The state of the art technology can provide machines with precision control and intelligence for


automatic agriculture using autonomous off-road vehicles. The use of robots is more common today than ever before, and they are no longer used exclusively by the heavy production industries. The agricultural industry is behind other industries in using robots because the jobs involved in agriculture are not straightforward, and many repetitive tasks are not exactly the same every time. In most cases, many factors have to be considered (e.g., the size and color of the fruit to be picked) before the commencement of a task. Robotics is usually associated with the manufacturing industry. However, certain similarities between the manufacturing industry and agriculture make robotics a good option for agriculture. These factors include decreasing labor, foreign competition, rising costs, a hazardous environment for the laborer and the need to invest in technology to improve productivity. On the other hand, there are quite a few differences that make robotics difficult to use in agriculture. These factors include the variability in the environment and seasonal, changing crops. Robotics is currently being tested in several areas of agriculture, for example, picking fruits in orchards, spraying crops in greenhouses, animal milking and shearing, planting and harvesting crops, food packaging, disease detection, mapping, etc. Fruit picking robots and sheep shearing robots are designed to replace difficult human labor.
Figure 1-1. Agricultural Robots. A) Fruit picking robot (University of Florida). B) Sheep shearing robot (University of Western Australia).
In the 2001-2002 season, about 6000 ha of Florida's citrus groves were mechanically harvested (Brown, 2002). The operations performed in a citrus grove include preparing the soil,


planting the trees, irrigation, spraying fertilizers and pesticides, monitoring the grove and trees, and harvesting the fruits. Autonomous Vehicles for Citrus Groves A major achievement in agricultural robotics would be a vehicle capable of moving through the entire field on its own. The human operator would only be required to supervise the autonomous operation. Since these vehicles can work continuously, they are very efficient. These vehicles could open a new market in agriculture. A project on an autonomous vehicle not only benefits agriculture, but also other areas, such as military operations, rescue operations and use in hazardous environments. Since these vehicles are expected to work in challenging environments independent of the operator, they have to rely on their sensors and navigation system. Sensor electronics and computing facilities must be integrated with the mechanical and hydraulic parts. The present technology in sensors and navigational equipments is advanced enough to realize a vehicle capable of autonomous operation. The economic viability has improved significantly over the last decade. Fully autonomous technologies are often seen as being too expensive to justify their use compared to conventional machine systems. This suggests that automation technology should be integrated into conventional machines. The increasing availability of low cost sensors is encouraging. Over the course of time, the fully automated technology could be expected to slowly replace the existing semi-automatic equipment. Tractors are the workhorses of the modern farm. By automating these machines, the productivity can be increased, safety can be improved, and costs could be reduced for many agricultural operations. An article in the Institute of Food and Agricultural Sciences (IFAS) extension website at the University of Florida reports that mechanical harvesting is the future for the citrus industry (Rouse and Futch, 2005). In citrus 23


groves, trees are planted in rows with alleys between the rows. The usual alley width is about 2.1 to 2.4 m, and tree heights vary from 4.5 m to 6 m depending on their age (Brown, 2002).
Figure 1-2. Typical citrus grove alleyway
This research is aimed at developing an autonomous vehicle capable of maneuvering through the alleyways of citrus groves, turning around after reaching the headlands and navigating other alleyways, and finally moving from one grove block to another. Such a vehicle is expected to be used for robotic applications in citrus groves, such as acting as a guidance vehicle for robotic harvesting carrying several robotic arms, as a scout vehicle to look for diseased trees, as a guidance vehicle for spraying nutrients and pesticides or for mowing the grass in the alleyways, and as a carrier of sensors for operations such as disease detection. The research aims at adding sensors to an existing commercially used vehicle and modifying the vehicle's steering so that it can be operated autonomously. Integrating the technology into existing machines is expected to make adoption more attractive to a grower. There are indications that operating at night may also provide benefits. For example, spraying and harvesting operations could be completed at night. However, development of a system operable at night requires research beyond the scope of this dissertation. In the literature, there has been much research on crop harvesting vehicles, but citrus harvesting is a relatively unexplored area in vehicle guidance. To date, there has been minimal success in developing commercial autonomous navigation systems for citrus grove applications.


Objectives
The primary objective of this work was to develop an autonomous vehicle guidance system for navigating citrus groves. The following sub-objectives were also identified:
Develop machine vision based path segmentation method for navigating the grove alleyways
Develop laser radar (ladar) based path segmentation method for navigating the grove alleyways
Retrofit the steering system of the vehicle for computer control
Interface the sensors and computers with a commercial vehicle for making the vehicle autonomous
Develop steering control algorithm for steering the vehicle and develop a mathematical model of the vehicle if required
Develop sensor fusion method to fuse the information from different sensors to improve reliability in navigation
Develop headland detection algorithm and implement headland turning maneuvers to navigate the grove headlands after navigating each row
Develop obstacle detection and halting ability using laser radar and DGPS based open field navigation method to move between citrus grove blocks


CHAPTER 2
LITERATURE REVIEW
Agricultural Autonomous Navigation Applications
Vehicles have been used in agriculture for many decades. They have been used for applications such as plowing, planting, harvesting and spraying. These operations require the operator to spend long and tedious hours in the field. Modern technology has facilitated the automation of several agricultural operations, thereby reducing operator fatigue and improving productivity. An important part of this automation process includes the use of autonomous vehicles. Research has been conducted in developing autonomous vehicles for agriculture for many years. Some of these research efforts are mentioned below. Many of them were conducted in research laboratories and have not yet fully evolved into commercial products. The reasons are numerous, ranging from economics to insufficient computer processing power. However, the advancements in computer processing power and world economics pertaining to agriculture in recent years have given new hope for bringing autonomous vehicles from research laboratories to commercial agricultural operations. Most of the research into developing autonomous vehicles in agriculture is concentrated on vehicles for crop operations. Research on the development of autonomous vehicles for orchards and citrus groves has been comparatively scarce.
Guidance Vehicle
The autonomous vehicle can either be in the form of a guidance system added on to a commercial agricultural vehicle or a vehicle developed exclusively to aid in autonomous or robotic operations. Blackmore et al. (2002) have listed the following requirements of an autonomous vehicle for agricultural applications:
Small in size
Light weight
Exhibit long-term sensible behaviors
Capable of receiving instructions and communicating information
Capable of coordinating with other machines
Capable of working collaboratively with other machines
Behave in a safe manner, even when partial system failures occur
Carry out a range of useful tasks
These requirements are also applicable to vehicles navigating citrus groves. While current research has not progressed to the level of meeting all these criteria, there has been significant progress. In addition to these criteria, the following requirements may also be expected:
Low cost
Attractive for a farmer to adopt
Off-road vehicle performance and reliability
An autonomous vehicle is expected to possess the following basic components to be operational: vehicle platform, steering system, guidance system, sensors and a control system.
Guidance by Remote Control
Remote control or tele-operation has already been used in several branches of robotics. Visual information of the operating environment is relayed to a remote location where the operator controls the vehicle. Murakami et al. (2004) used a CCD camera, a gyroscope and a GPS as the sensors and wireless IP protocol for communication, for tele-operation of an HST drive agricultural vehicle. A joystick was used for tele-operation.
Figure 2-1. Remote controlled vehicle (Murakami et al., 2004). A) Teleoperated vehicle. B) User interface.


The major advantage of using a tele-operated vehicle would be the comfortable environment of operation and safety. Another minor advantage compared to the direct guidance would be that this is relatively simple and cheap. A major challenge in using tele-operated vehicles is the time delay in communication. For vehicle guidance, no significant advantages are expected. There is neither much reduction in labor cost nor significant reduction in the amount of work that a driver is required to do. There is also limited possibility of automation. Direct Guidance In this method, the vehicle possesses the ability to control its path. This has the potential for automation and is also economically justifiable on large fields. But the technology is complex. The lack of awareness about the technology and safety concerns as compared to remote control is a cause of apprehension to a farmer. However, present day technology and demonstrations of safe control of the guidance could overcome these limitations. Guidance Using Arm Guidance by mechanical contact has been widely used by earlier researchers in agriculture. This method is simple and straightforward. Often, a mechanical arm senses the crop boundary and the vehicle is moved at a distance from the crop. However if a bare region is encountered in the contact environment, the control is lost. Multiple mechanical contacts have been used to overcome this limitation with moderate success. A major concern in using this method is that there is great potential for the contact element to damage the crop. Yekutieli and Pegna. (2002) used a sensor arm to guide a crawler tractor through a vineyard. The arm sensed the rows and determined the position of the tractor. The arm did not cause any damage to the vines. This method was suitable for a controlled environment like a vineyard where the rows were exactly parallel to each other. 28


Figure 2-2. Sensor arm configurations used to guide a crawler tractor (Yekuteli et al., 2002) Overhead Guidance In this method, an overhead guide or a track guides the vehicle using a mechanical link. Shin et al. (2002) used overhead rails to direct a vehicle through an orchard. The vehicle was track type and was used for spraying applications. The vehicles performance was acceptable in straight and curved paths, but the vehicle over-steered at higher speeds. This system was only feasible for low speeds of about 0.4 m/s. Also the installation cost of overhead rails is very high. For a grower to adopt a new technology, the high cost of constructing large structures is very discouraging. Hence such a system is not suitable for this project. Figure 2-3. Overhead Guidance (Shin et al., 2002) 29


Guidance on Tracks and Leader Cables This type of guidance is widely used in industrial transportation. It was used in very early researches in agricultural vehicle guidance. In the system of leader cables, the cables carry an AC signal. The coils on the vehicle base detect magnetic fields when the vehicle moves over the cable. The system works very well in orchards. The drawback is the permanent installation of large structures. This also increases the initial cost and these structures require maintenance over time. There is also little flexibility in the crop grown. Fertilizer application and irrigation require care so that they do not damage the tracks. The drawbacks of the mechanical contact methods and the advancement of sensor technology have drawn researchers to the non-contact methods. Distance Metering and Triangulation Based Guidance In this method, the sensors operate based on signals received from sources installed in the environment for guidance purpose. This popular method places markers throughout the field that a range detector can identify. One method employs a photoelectric sensor for the detection of light from reflectors and measured angles between reflectors. The position of the vehicle equipped with the photoelectric sensor is calculated by triangulation. The reflectors are placed in the environment at regular intervals for detection. Another method involves using a laser beam, which the vehicle follows. A laser beam hits a sensor mounted on the vehicle. The beam has to be constantly moved to guide the vehicle. The above methods all require the expensive installation of structures on the field. They also assume that the crops are grown in some order and that they do not block the field of view of the sensor. Rapid development of sensor technology makes it possible to guide vehicles using information from state of the art sensors that require no modification of the environment. These are the popular guidance techniques in the first decade of the 21st century. 30
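To make the triangulation idea above concrete, the sketch below (an illustration only; the function name, the two-reflector setup and the use of absolute bearings are assumptions, not details taken from any of the cited systems) intersects the bearing rays from the vehicle to two reflectors at known field positions:

    import numpy as np

    def triangulate_position(reflector1, reflector2, bearing1, bearing2):
        # reflector1, reflector2: known reflector positions (x, y) in meters
        # bearing1, bearing2: absolute bearings (radians) measured from the vehicle
        l1 = np.asarray(reflector1, dtype=float)
        l2 = np.asarray(reflector2, dtype=float)
        u1 = np.array([np.cos(bearing1), np.sin(bearing1)])
        u2 = np.array([np.cos(bearing2), np.sin(bearing2)])
        # The vehicle position p satisfies p + r1*u1 = l1 and p + r2*u2 = l2,
        # so r1*u1 - r2*u2 = l1 - l2; solve this 2x2 system for the ranges.
        r1, _r2 = np.linalg.solve(np.column_stack((u1, -u2)), l1 - l2)
        return l1 - r1 * u1

The linear solve fails when the two bearings are nearly parallel, which is one reason such systems place reflectors at regular intervals around the field.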


Guidance by Dead Reckoning
Dead reckoning is the process of estimating the position of a vehicle by advancing from a known position using course, speed, time and distance traveled. This technology is often used in military and marine navigation. The sensors often used are wheel encoders, which measure the rotation of the wheels. Knowledge of the circumference of the wheel then gives the distance traveled. Rotary encoders are being used to detect the wheel position for position location (Kodagoda et al.; Nagasaka et al., 2002). Dead reckoning, a widely used method for positioning unmanned vehicles, has not been a very viable option for agricultural applications due to the uneven soil conditions, which facilitate slipping. Often, in wet weather there is considerable slip in the tractor wheels. The round-off errors also accumulate, causing significantly incorrect distance measurements. These two factors introduce errors in calculating distance. Hence, using an encoder as an odometer for finding distance has not been reliable.
Guidance Using Ultrasonic Sensors
Ultrasonic sensors send out a high frequency sound wave. The reflected wave is received at the transducer, and the distance is calculated from the time of flight. Ultrasonic sensors have been very popular for sensing distance and are able to measure tree canopies while the vehicle travels at a speed of 1.8 m/s with an accuracy of 1 to 3 cm (Iida and Burks, 2002). The accuracy was found to be higher at lower speeds. These sensors require the target to be perpendicular to the sensor for the waves to be reflected back properly (Burks et al., 2004). However, when used in groves, the environment is unstructured and such surfaces cannot be expected. Hence their use is limited.
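As a minimal sketch of the encoder-based dead reckoning just described (the encoder resolution, wheel circumference and function name are assumed values for illustration, not parameters of the vehicles used in this work), a single position update could look like this:

    import math

    TICKS_PER_REV = 360        # assumed encoder resolution, ticks per wheel revolution
    WHEEL_CIRCUMFERENCE = 2.0  # assumed wheel circumference, meters

    def dead_reckon_step(x, y, heading, delta_ticks):
        # Distance traveled follows from the tick count and the wheel circumference.
        distance = (delta_ticks / TICKS_PER_REV) * WHEEL_CIRCUMFERENCE
        # Advance the previously known position along the current heading (radians).
        return x + distance * math.cos(heading), y + distance * math.sin(heading)

Any wheel slip corrupts the distance term, and because each update builds on the previous one the error accumulates, which is exactly the limitation noted above for wet or uneven grove soil.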


Figure 2-4. Greenhouse sprayer with ultrasonic sensor (Singh and Burks, 2002)
Wireless/Sensor Networks
Sensor networks are a relatively new development for vehicle positioning. A set of sensors is located at several places along the path, and they communicate with the vehicle's computer to decide the path to be traveled. This requires permanent positioning of sensors at regular intervals in the field. This is similar to using a beacon or landmark, but is not advisable in an orchard, since the foliage may block the field of view. Wireless LAN has been tried as a method of communication between the vehicle and a base station to get feedback on position accuracy (Nagasaka et al., 2002). Microwave systems can perform the same job effectively (Matsuo et al., 2002), but they require frequent calibration. Wireless LAN could be used more effectively for real time path planning. An operator sitting in the office could send information about the next alleyways to be traversed.
Guidance Using Inertial Sensors
An inertial measurement unit, or IMU, usually consists of six inertial sensors: three linear accelerometers and three rate gyros. These sensors combine to give the vehicle's pitch, roll and yaw. Often a magnetic compass is also included in the system to locate the direction. The geomagnetic direction sensor (GDS) was the direction tracker when GPS was not available. GDS has given errors due to the high magnetic field present around the vehicle (Mizushima et al.,


2002). Therefore proper shielding is necessary. Mizushima et al. (2002) used an adaptive line enhancer (ALE), which is an adaptive filter method, to eliminate noise in GDS. Gyros have been widely used for inclination measurements (Mizushima et al., 2002). Fiber optic gyro (FOG) has been reported to have given the best performance among the different types of gyros for positioning (Nagasaka et al., 2002). IMUs are widely being used for vehicle guidance research at present. With the combination of RTK GPS and FOG, accuracies of up to 5cm have been achieved (Noguchi et al., 2002). Accelerometers and inclinometers (Matsuo et al., 2002) have also been tried with fairly positive results. At present gyros and inclinometers are available together as inertial measurement units (IMU) for pitch, roll and yaw and linear velocity measurements. Inertial measurements are very useful for navigating the vehicle in uneven grounds, as encountered in groves. GPS Based Guidance Systems Guidance using GPS has been the preferred method of many researchers in agriculture for guiding autonomous vehicles on crop rows. Some of the commercial developments have also concentrated on using this method. Global Positioning System (GPS) receivers locate four or more of the GPS satellites and calculate the distance to each, using this information to deduce their own location. GPS, in combination with inertial navigation systems, have been promising positioning sensors. Both Real Time Kinematic (RTK) GPS and Differential GPS (DGPS) have been satisfactory, allowing real time measurement. However there is a tradeoff between accuracy and cost in the selection of DGPS and RTK GPS receivers, with the latter being more accurate but also more expensive. Nagasaka et al. (2002), Benson et al. (2001) and Noguchi et al. (2002) have found that RTK GPS receivers give very accurate results. Accuracies of up to 7 cm with DGPS (Iida and Burks. 2002) and up to 2 cm with RTK GPS receivers have been reported. More accuracy has been observed when the vehicle is stationary or is at slower speeds than at higher 33


speeds. Stombaugh et al. (1999) have found that the location of a GPS receiver when used as a position sensor is critical for calculating the phase in frequency response tests of the control system. They found that mounting a GPS receiver above the front wheels gives more accurate navigation, whereas most research results report mounting a GPS receiver on top of the cab. GPS based guidance alone cannot be used for positioning in citrus applications, as it could give errors when the vehicle moves under tree canopies, which block the satellite signals to the GPS receiver. In groves, trees could block the satellite signals. Moreover, a system using GPS for guidance requires that a predetermined path be given to the computer for it to follow. This requires initial mapping. Hence, when a GPS receiver is the sole sensor for positioning, it cannot be used instantly in any grove. Significant time has to be spent in mapping its path. Machine Vision for Autonomous Navigation Machine vision is the ability of a computer to "see." A machine-vision system employs one or more video cameras for obtaining images for the computer to interpret. Machine vision is one of the popular sensing technologies for robotics applications today. Machine vision technology can be used to automatically guide a vehicle when the path to be traversed is visually distinguishable from the background. Vision guidance has the advantage of using local features to adjust the vehicle navigation course. The vision systems performance in guiding the vehicle is comparable to highly accurate laser radar (Burks et al., 2004). Typical applications in vehicle guidance include guiding tractors for crop cultivation and guiding an autonomous vehicle in a greenhouse. This sensing system has been found to work well for fully grown crops, but it has not performed well for crop harvesting with low or sparse crops. 34


Figure 2-5. Machine vision guidance example (Benson et al., 2001). A) Camera mounted on top of the combine cab. B) Crop image. C) Processed image.
The vision system's reliability is reduced by low lighting and by the presence of dust and fog. Benson et al. (2001) overcame this problem by using artificial lighting, but they experienced a vehicle shadow problem. They could achieve speeds of up to 1.3 m/s with an accuracy of 0.6 cm. Positioning of the camera is also an important factor in the performance of the vision system. Mounting the camera on the top front of the tractor cab, pointing downwards, is an optimal position for many agricultural applications. NIR filters can be used to process images from a CCD camera (Nishwaki et al., 2002). This sometimes gives better information depending on the application. Vision involves many complicated algorithms for image processing and recognition. It therefore increases the amount of computation required, resulting in dedicated processors being used. However, as faster computers are developed, this concern will no longer prevent cameras from being used for sensing.
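As a rough illustration of the kind of color-based segmentation step such vision guidance systems rely on (a sketch only, assuming the vegetation appears greener than the path; the threshold value and image layout are not taken from any of the cited systems):

    import numpy as np

    def segment_path(rgb_image, green_ratio_threshold=0.4):
        # rgb_image: H x W x 3 array; normalize green by the channel sum so the
        # classification is less sensitive to overall scene brightness.
        rgb = rgb_image.astype(np.float32)
        green_ratio = rgb[..., 1] / (rgb.sum(axis=-1) + 1e-6)
        return green_ratio < green_ratio_threshold  # True where the pixel looks like open path

Normalizing by the channel sum rather than thresholding raw green intensity is one simple way of coping with the changing outdoor lighting discussed above; more robust systems add artificial lighting, NIR filtering, or shadow-compensating color indices as described in this section.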


Figure 2-6. Machine vision guidance example (Rovira-Mas et al., 2002). A) Camera mounted on the front of the tractor for detecting crop rows. B) Thresholded image. C) After Hough transform.
With computer vision, there is always the need for some physical feature or color difference for the vision system to be able to sense effectively. A number of image processing techniques have been investigated to find the guidance course for different applications. Han et al. (2002) developed a row segmentation algorithm based on k-means clustering to segment crop rows. This information was used to guide the vehicle. Stereo vision has been used for navigating through an orchard (Takahashi et al., 2002). This method used two cameras to obtain the stereo information and required considerable time for processing the images. Vision algorithms based on the Hough transform and blob analysis have been used for row crop guidance (Rovira-Mas et al., 2002) with positive results. Gerrish et al. (1997) used the ratio of the individual primary colors to the sum of the primary colors to segment images and eliminate shadow problems with success. Often hue, saturation and intensity values are used for thresholding (Morimoto et al., 2002), which has worked well in several applications. Computer vision has been used in several


researches. In practice, many problems exist in the real environment in which light conditions change often and contrasts shift, as is often the case in agriculture. Laser Radar (Ladar) for Autonomous Navigation The operating principle of non-contact laser radar is based on time-of-flight measurement. The ladar calculates the distance to the object using the time of flight of pulsed light, i.e. the length of time between sending and receiving the beam of light. An extremely short pulse of light (infrared laser beam) is transmitted towards an object. Part of the light is then reflected back to the unit, a fraction of a second later. A rotating mirror deflects the pulsed light beam to many points in a semi-circle. The precise direction is given by an angular sensor on the mirror in the ladar. A large number of coordinates measured in this manner are put together to form a model of the surrounding area's contours. A planar scanning ladar gives information of one plane. This can be overcome by rotating the laser source to get a 3 dimensional view. Laser range sensors are very accurate in measuring distance, with accuracy as high as 1 cm over a distance of 80 m. This makes them suitable for guidance applications. Laser radar has been used for ranging and obstacle avoidance. It has higher resolution than ultrasonic sensors, so it is more accurate and requires less computation than vision. However its performance degrades with dust and rain, like vision, and it is more expensive than an ultrasonic sensor. Carmer and Peterson. (1996) have discussed the use of laser radar (ladar) for various applications in robotics. Autonomous vehicle navigation was discussed as a promising application for ladar, due to its ability to accurately measure position. Gordon and Holmes. (1998) developed a custom built ladar system to continuously monitor a moving vehicles position. The systems performance was not very reliable due to underdeveloped technology at that time. Current technologies have advanced to the point that ladar systems are readily available, although they are relatively expensive compared to other alternatives. Ahamed et al. 37


Ahamed et al. (2004) used a ladar to develop a positioning method using reflectors for infield road navigation. They tested differently shaped reflectors to determine the positioning accuracy. A ladar can serve the dual purpose of mapping the tree canopies while simultaneously being used for navigation. This could be used for grove mapping applications, for example to guide a harvesting robot arm.

Figure 2-7. Ladar mounted on top of the tractor (Yokota et al., 2004)

Yokota et al. (2004) mounted a ladar on top of a tractor, generated local maps of the surroundings and integrated them with GPS data. Ladar has also been used for navigating a small vehicle through an orchard (Tsubota et al., 2004); the ladar-based guidance system was found to be more stable than one using GPS.

Intelligence for Autonomous Navigation

Artificial Intelligence (AI) could be useful in solving complex tasks in which the solution is ambiguous or large computations are necessary. A popular use of AI in robotics is heuristic search, often used in path planning. Heuristics, or empirical rules, can be used to shorten the search, as the number of candidate solutions can be very large. To reach an optimal level of autonomous operation, the robot should know how to deal with uncertainties in the environment. Robots should be capable of changing their behaviors independently, and artificial intelligence can immensely aid in this endeavor.


A knowledge-based system could be used to assimilate mission information with internal models to supervise navigation (Speigle et al., 1995). A multiple-layer path planner combined with a guidance system using artificial intelligence is presented in Ortiz (1993), where a combination of global and local navigation schemes is discussed. Voronoi diagrams were used in that research for navigating different terrain features. Global navigation was used to establish a path from the source to a destination based on fixed objects in the path and other moving vehicles. Local navigation was used to ward off immediate obstacles and did not take into account the location of the destination. This approach is similar to the one used by a human driver, and such guidance is expected to perform well.

Vehicle Dynamics Modeling and Steering Control

Steering control is a major factor for accurate guidance. The controller gets the error information from the sensors and computes the signal to be given to the steering actuator, thereby steering the vehicle back to the correct position in the path.

Vehicle Modeling

Adequate modeling of the vehicle is often required for efficient control. A model of the vehicle dynamics or steering gives the option of conducting simulations of the vehicle while developing or tuning the steering controllers. Sophisticated models covering more details of the vehicle's behavior enable software simulations to be developed for studying the vehicle's behavior under different terrain changes. Models can also help in understanding the stability of the control system being developed. The most common model for describing vehicles is the bicycle model. It is given by the equation


\frac{Y(s)}{\delta(s)} = \frac{C_{sf}V\left[V(I_{zz}+mL_{1}d_{s})s^{2}+C_{sr}L(L_{2}+d_{s})s+C_{sr}LV\right]}{s^{2}\left[mI_{zz}V^{2}s^{2}+V\left(I_{zz}(C_{sf}+C_{sr})+m(C_{sf}L_{1}^{2}+C_{sr}L_{2}^{2})\right)s+C_{sf}C_{sr}L^{2}-mV^{2}(C_{sf}L_{1}-C_{sr}L_{2})\right]} (Eq. 2-1)

where
Csf = Cornering stiffness of front wheels
Csr = Cornering stiffness of rear wheels
d = Distance of travel
ds = Distance between position sensor and center of gravity
Izz = Yaw moment of inertia
L = Wheel base
L1 = Distance from center of gravity to front axle, along vehicle axis
L2 = Distance from center of gravity to rear axle
m = Vehicle mass
V = Vehicle forward velocity
y, Y(s) = Lateral position of vehicle
δ(s) = Front wheel steering angle.

This is a single-input single-output (SISO) model with steering angle as the input and lateral position as the output. The model takes into account the tire slip effects. The assumptions made are that the forward speed is constant and that the lateral tire forces are proportional to the side slip angles. Alleyne et al. (1997) found that the bicycle model is easier to implement because it is a linear system; however, the model fails at higher speeds due to the higher lateral accelerations involved. Stombaugh et al. (1999) showed that a classical model-based controller could guide a two-wheel-drive tractor to within 16 cm of the desired path at high speeds, and that


a double-integrator transfer function can adequately model the lateral deviation of the vehicle with respect to steering angle. They also found the dead band of the steering valve to be crucial for control system design. Zhang et al. (2004) used the bicycle model to model the dynamics of a tractor. Kise et al. (2002) modified the bicycle model to account for nonlinearities, including the slip and cornering forces, thereby obtaining a non-linear equation; the vehicle was then guided using an optimal controller. Qiu et al. (1999) modeled an electro-hydraulic steering system using single-rod double-acting steering cylinders and other components for a Case-IH Magnum tractor. The model was developed for a wheel-type agricultural vehicle, and computer simulations and field tests validated the model. To sidestep the complexity of modeling the vehicle and then designing the control system, some researchers have used non-model-based techniques, for example neural networks and non-model-based PID control systems. Neural networks have also been used for modeling the vehicle dynamics (Nijhuis et al., 1992). A neural network has the advantage of not requiring complex equations for the model, and it also handles the non-linearities. However, designing traditional control systems is not straightforward for such a model.

Steering Control

A steering controller is used for controlling the vehicle steering. The input to the controller can take various forms, such as the lateral error of the vehicle in the path, the position of the vehicle at some future instance, the required steering angle or the curvature the vehicle has to follow. The steering controller takes the input and turns the steering to achieve the desired result while continuously monitoring the steering changes. Several types of steering controllers have been used in the literature by various researchers.


PLC control

A PLC (Programmable Logic Controller) is a highly reliable special-purpose computer widely used in industrial monitoring and control applications. PLCs typically have proprietary programming and networking protocols, and special-purpose digital and analog I/O ports. A PLC uses programmed logic instructions to control banks of inputs and outputs which interface timed switch actuation to external electro-mechanical devices. PLCs are very fast at processing discrete signals (like a switch condition). A PLC has been used for steering control (Nagasaka et al., 2002), controlling all the actuators in the vehicle; the PLC control loop took 2 ms to execute. However, PLCs have not received as much attention as the other control systems.

ON/OFF control

As the name implies, the control signals are turned on and off to achieve the required control. Yekutieli et al. (2002) experimented with an ON/OFF control to guide a crawler tractor through a vineyard. They found this control to act fast, but also observed a lot of overshoot. A much smoother control could be achieved with more sophisticated control systems.

PID control

The PID (Proportional Integral Derivative) control algorithm is used for the control of almost all loops in the process industries, and is also the basis for many advanced control algorithms and strategies. PID has given satisfactory performance in tractor guidance. Feedforward plus PID control worked well for guiding a tractor through crop rows (Kodagoda et al., 2002). Some time delay has been observed when using PID; however, overall, PID control has performed better than proportional and PI control systems. PID control has been used for guiding a grain harvesting tractor (Benson et al., 2003). In that research, PID was used to calculate the actuator command signal based on the heading offset, and the performance of the controller was comparable to that of manual steering.


Adaptive control

Nonlinearity is observed in electro-hydraulic steering valves, and PID control is unable to handle such nonlinearities effectively. Tsubota et al. (2004) compared the experimental results from a fuzzy adaptive controller and a neural network-based adaptive controller to those from a classical PID controller in terms of velocity tracking. Experiments were conducted on a test bed, and it was found that both adaptive control algorithms compensated for the non-linearity effectively.

Fuzzy logic

Fuzzy logic attempts to create artificial boundaries for analog states, which makes them discrete enough for a digital computer to process. The boundaries are made fuzzy so that the controller does not appear discontinuous. The processing stage usually consists of a set of logic rules defined by the designer. Fuzzy control has been used to guide a tractor to follow crop rows (Benson et al., 2001), with results comparable to PID. Senoo et al. (1992) pointed out that a fuzzy controller could achieve better tracking performance than a PI controller; they used a vehicle that tracked a magnetic tape on the floor. Fuzzy control has wider adaptability to all kinds of inputs. Qiu et al. (1999) verified that fuzzy steering control provided fast and accurate steering rate control on a tractor. Kodagoda et al. (2002) designed fuzzy PD and fuzzy PI controllers to navigate an electric golf cart. These controllers were compared with traditional PID controllers and were found to be insensitive to load fluctuations. They determined that fuzzy control was better than PID for longitudinal control; PID was also found to have large chatter and high saturation. A combination of fuzzy and PID control holds a lot of promise (Burks et al., 2004). While designing these controllers, the designer has to anticipate all the situations that the vehicle might encounter. This can be achieved most of the time with careful design and the experience of the designer.


However, if the transition between different scenarios is not properly accounted for, the controller could be discontinuous.

Neural networks

An Artificial Neural Network (ANN) is an information processing paradigm inspired by the way biological nervous systems, such as the brain, process information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example, and they can learn adaptively through the way they are trained. Nijhuis et al. (1992) successfully used neural networks for collision avoidance; their vehicle used an infrared range sensor to detect obstacles. However, neural networks have the inherent disadvantage of learning only what the driver does, and hence they are not robust. To work well, they have to be trained for every possible situation that will be encountered.

Behavior-based control

Behavior-based control is a newer development which has been successfully used in small mobile robots. It works on the basis of selecting the most appropriate next action in a particular situation. This type of control is distributed among a set of behaviors, in which each behavior takes care of one aspect of control, and sensors trigger these behaviors. There is an inherent necessity to coordinate the different behaviors for proper operation. Behavior-based systems, in combination with a real-time control system (RTCS), are expected to do well in vehicle guidance.

Optimal control

Kise et al. (2002) developed an optimal controller to guide a tractor. The tractor used an RTK GPS and an IMU for sensing position. They compared the optimal control with PI control; the optimal controller could perform high-speed guidance more precisely than the PI controller, and the performance was acceptable in 90-degree turns as well.


Non-linear control

A new field of non-linear control is emerging, particularly for controlling mobile robots. Earlier, researchers used linearized models of non-linear phenomena. Now, with the availability of fast processors, numerical solutions of non-linear equations can be computed quickly. Therefore, non-linear control systems are being developed for improved control. However, more research is needed before they are widely adopted.

Sensor Fusion Applications in Navigation

Sensor fusion is a major component of sensor integration, merging multiple inputs into a common representation. A multi-sensor environment generates a large amount of data at different resolutions. These data are often corrupted by a variety of noise sources, which vary continually because of changes in the environment. Sensor fusion extracts meaningful information from these data to make the optimal decision for navigation. Several methods, such as Dempster-Shafer theory and map learning methods, have been used for sensor fusion. Kalman filters have been ideal candidates for sensor fusion and have been widely used. Several methods for fusion have been reported in the literature. These include, but are not limited to, neural networks (Davis and Stentz, 1995; Rasmussen, 2002), variations of the Kalman filter (Paul and Wan, 2005), statistical methods (Wu et al., 2002), behavior-based methods (Gage and Murphy, 2000), voting, fuzzy logic (Runkler et al., 1998) and combinations of these (Mobus and Kolbe, 2004). The process of fusion might be performed at various levels ranging from the sensor level to the decision level (Klein, 1999). Rasmussen (2002) used a neural network for combining image and laser data to segment roadways from background objects. He fused features such as color, texture and laser range in various combinations using a neural network to test various fusion combinations, and found that such combinations performed better at segmenting roads than individual models. Wu et al. (2002) used Dempster-Shafer theory for sensor fusion to combine images and sounds


to interpret human behavior. This probabilistic approach was appropriate for such a non-deterministic application. However, probabilistic approaches may reduce the accuracy of deterministic applications if accurate statistical models of the process are not created. Runkler et al. (1998) considered the Kalman filter to be a way of linearizing non-linear models; they instead developed fuzzy models of the signals and projected the signals onto these models to fuse the data. Their method of fusion helped reduce the noise in their experimental data.

The Kalman filter has been the subject of extensive research in navigation. It is a set of mathematical equations that provides an efficient computational means to estimate the state of a process. It can give an accurate estimate of the present, past and future states of the system, and it is useful even when a precise model is not available. The Kalman filter is often used in vehicle navigation for processing sensor signals: information from different sensors is processed using the Kalman filter to compute reliable position information for the vehicle. In much of this research, the sensor measurements can be fairly noisy, so the choice of fusion method can also help in reducing the noise while fusing the information from the sensors. Methods such as neural networks, behavior-based methods, fuzzy logic and voting are not widely used primarily for noise reduction. Kalman filtering, in contrast, is a widely used method both for eliminating noisy measurements from sensor data and for sensor fusion. The Kalman filter can be considered a subset of the statistical methods because of its use of statistical models for noise. Paul and Wan (2005) used two Kalman filters for accurate state estimation and terrain mapping while navigating a vehicle through unknown environments. The state estimation process fused the information from three onboard sensors to estimate the vehicle location, and simulated results showed the feasibility of the method.
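To make the idea concrete, the sketch below shows a minimal scalar Kalman filter fusing two noisy measurements of the same quantity, for example a lateral path error reported by vision and by ladar. It only illustrates the predict/update equations and is not the filter formulated later in this dissertation; the noise variances and measurement values are assumed for the example.

#include <cstdio>

// Minimal scalar Kalman filter: x is the state estimate (e.g., lateral path
// error in meters), p its variance, q the process noise variance and r the
// measurement noise variance of the sensor being fused.
struct ScalarKalman {
    double x = 0.0;
    double p = 1.0;
    double q = 0.01;

    void predict() { p += q; }                 // constant-state process model
    void update(double z, double r) {          // fuse one measurement z
        const double k = p / (p + r);          // Kalman gain
        x += k * (z - x);
        p *= (1.0 - k);
    }
};

int main() {
    ScalarKalman kf;
    kf.predict();
    kf.update(0.12, 0.04);   // vision: 0.12 m error, higher variance (assumed)
    kf.update(0.05, 0.01);   // ladar: 0.05 m error, lower variance (assumed)
    std::printf("fused lateral error estimate: %.3f m\n", kf.x);
    return 0;
}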


Han et al. (2002) used a Kalman filter to filter DGPS (Differential Global Positioning System) data to improve positioning accuracy for parallel tracking applications. The Kalman filter smoothed the data and reduced the cross-tracking error. Based on the good results obtained in previous research for fusion and noise reduction, a Kalman filter was selected as the method to perform the fusion and to filter the noise in the sensor measurements.

The use of a Kalman filter with fixed parameters has drawbacks. Divergence of the estimates is a problem sometimes encountered with a Kalman filter, wherein the filter continually tries to fit a wrong process. Divergence may be attributed to system modeling errors, incorrect noise variances, ignored bias and computational round-off errors (Fitzgerald, 1971). Another problem in using a simple Kalman filter is that the reliability of the information from the sensors may depend on the type of path. Therefore, if a white noise model is assumed for the process and measurements, the reliability assigned to a sensor in the Kalman filter has to be constantly updated. Abdelnour et al. (1993) used fuzzy logic for detecting and correcting the divergence. Sasladek and Wang (1999) used fuzzy logic with an extended Kalman filter to tackle the problem of divergence for an autonomous ground vehicle and presented good simulation results; the extended Kalman filter reduced the position and velocity errors when the filter diverged, and the use of fuzzy logic also allowed a lower-order state model to be used.

Pure Pursuit Steering Control

Pure pursuit is a method for steering vehicles (land, sea or air). In this method, the vehicle is deemed to follow or pursue a point in front of the vehicle and is therefore steered such that the vehicle's nose points to the point being pursued. Pure pursuit is similar to human driving, as humans tend to drive towards some imaginary point in front of the vehicle at each instant, i.e., pursuing a point. Therefore it is very intuitive to implement in an autonomous vehicle.


In this method, a look-ahead point is chosen and an arc is constructed joining the present vehicle location and the look-ahead point; the vehicle is then steered to follow the arc. The method has been in use for several years in different guidance systems. Some of the early work using pure pursuit for autonomous vehicle steering was in the Carnegie Mellon Terragator and NavLab projects (Coulter, 1992). Pure pursuit guidance has been used for unmanned aerial vehicles (UAVs) by Enomoto et al. (2007); in their research a UAV chased an aircraft using pure pursuit.

Figure 2-8. UAV using pure pursuit (Enomoto et al., 2007)

Petrinec et al. (2003) used pure pursuit based path following for autonomous vehicles in a multi-vehicle simulator that simulated a factory shop floor. They point out that the look-ahead distance is crucial for proper operation of the vehicle when using pure pursuit. Shorter look-ahead distances may cause oscillations, whereas a longer look-ahead distance, though smoother, may take the vehicle a longer time to get back onto the path. As they had to deal with varying look-ahead distances, they approximated large look-ahead distances using piecewise linear smaller look-ahead distances, which reduced the lag due to the larger look-ahead distance.
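For reference, the geometric core of the method fits in a few lines. The sketch below is a generic illustration, not the pure pursuit controller implemented later for the eGator: it computes the curvature of the arc joining the vehicle to a look-ahead point expressed in the vehicle frame, and the wheelbase and goal coordinates in the usage comment are assumed values.

#include <cmath>

// Pure pursuit: given the look-ahead point in the vehicle frame (x lateral,
// y forward, meters), return the curvature of the arc joining the vehicle to
// that point.  For a bicycle-model vehicle the steering angle then follows as
// delta = atan(wheelbase * curvature).
double purePursuitCurvature(double xLateral, double yForward) {
    const double lookAheadSq = xLateral * xLateral + yForward * yForward;
    if (lookAheadSq < 1e-6) return 0.0;        // goal point is on the vehicle
    return 2.0 * xLateral / lookAheadSq;       // gamma = 2 * x / L^2
}

// Usage (assumed values): a goal point 0.4 m to the right and 3.0 m ahead,
// wheelbase 2.0 m:
//   double curvature = purePursuitCurvature(0.4, 3.0);
//   double steer     = std::atan(2.0 * curvature);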


Headland Turning Applications Headland in citrus groves is the space available at the end of each tree row. This space is generally used for turning the vehicles navigating the rows. Headlands are characterized by a ground space of at least 4.25m after the last tree in each row. The ground is usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of the grove or there could be more open ground or some combination of both. An autonomous vehicle operating in citrus groves needs to navigate the headlands to turn and navigate other rows. Miller et al. (2004) have reported that that the space required to turn at the headland and the time spent in turning, significantly affect the field efficiency. They also report that damage of crops due to improper headland turn affects productivity. Kise et al. (2002) tested headland turn using spline function path planning method on a tractor using GPS based guidance system. The method guided the vehicle along the desired path with error less than 20cm lateral deviation. Hague et al. used vision based guidance in combination with dead reckoning to guide a vehicle through crop rows. The guidance system detected end of crop rows using absence of crop row information from vision and distance information from dead reckoning. The vehicle was able to execute a u-turn at the headland. Hansen et al. (2005) analyzed headland turns for a combine in a field. They analyzed loop turn or uturn in particular. The turns were executed for entering successive crop crows or for entering some row after skipping a few rows. They also developed a model to represent the travel behavior and verified with experimental headland turns. Using this, the turns were optimized for speed and time spent in turning and also the number of rows to be skipped for optimal turning. Noguchi et al. (2001) implemented switch back turning maneuver for a tractor to turn at the headlands of crops. The turning maneuver was implemented using DGPS guidance and spline function for turning. Precise turns were observed by simulation and experimentation. Oksanen et al. (2004) studied the behavior of tractortrailer systems in 49


headlands of crops. They tried to solve the systems turning using optimal control and tried both u turns and switch back turns. The behaviors were analyzed by simulation alone and improvements were suggested using numerical methods for improved solutions. Figure 2-9. Switch back turning implemented using spline functions (Noguchi et al., 2001) Path Planning Path planning aids the mobile robot in traveling from the beginning to the final task. During navigation, conditions might be encountered that require the shortest distance to travel, a safe path, etc. Path planning makes it possible to take care of all these options. Motion trajectories for robots are usually obtained by teaching from a human operator or by path planning on a computer. Geographic Information Systems (GIS) is being widely used for mapping (Noguchi et al., 2002) and path planning. Use of teaching-playback robots is thus restricted to controlled environments, like mass-production factories. Traditional path planners normally use heuristics or a performance index for path specification. A Digital image of the route map could be used for path planning (Speigle et al., 1995). The path is segmented prior to the actual navigation and marked for landmarks. Then an optimal path is calculated. Fu et al. 50


(2004) have developed a path planning algorithm based on Dijkstras algorithm for searching shortest path. The algorithm searches a restricted area and makes use of the spatial feature of the road network. Current Commercial Systems There are several commercial autonomous vehicles available for field applications. Most of these systems use GPS positioning to aid in straight driving. Large farms employ these for large scale spraying and harvesting in crops. These vehicles provide a visual feedback of the vehicles deviation from the GPS guided path. The farmer can get feedback about his driving directions. This method has increased the efficiency of operation in large farms. Most of these systems are limited to straight path guidance in open fields. They are also not suitable for citrus groves since the receiver in a grove does not properly receive satellites signals, which can be blocked by the tree canopy. John Deere, Moline, IL and iRobot Corp., Boston, MA are developing semi-autonomous vehicles for military use, using a John Deere Gator vehicle. John Deere is also developing an autonomous tractor without a cab and steering wheel. They could save a lot of weight by eliminating these two elements. Omnitech Robotics, Englewood, CO, has developed several tele-operated off-road vehicles, including a tractor. The present commercial systems are opening a new path for advanced autonomous vehicles. The farmers employing these systems at present are more likely to be receptive to a fully autonomous system. A lot of research is required in developing a vehicle capable of autonomous behavior in orchards and citrus groves. Present commercial systems often employ GPS as the lone sensor. The use of additional sensors could make these vehicles more autonomous and applicable in several areas ranging from crops to orchards. 51


CHAPTER 3
EXPERIMENTAL METHODS

This chapter describes the hardware architecture and the theoretical development of the various components of the autonomous vehicle guidance system for navigating citrus groves.

Sensors, Control and Hydraulic Architecture of the Tractor Guidance System

Hardware Architecture

The major components of the autonomous navigation system are the sensors for monitoring the environment of the autonomous vehicle, a computer for collecting information from the sensors and providing the autonomous operation, and actuators for actuating the steering. From the literature, it is observed that machine vision has been very effective in vehicle navigation, and with the availability of fast computers, vision is a viable and promising option. For this research, a vision system is used to segment the path in the citrus grove alleyway and guide the vehicle. Laser radar is a highly accurate sensor for measuring the distance to objects; such accuracy cannot always be expected with vision. A laser radar was available for this project from previous research and has been a good distance-measuring sensor in the literature, so it is also used as one of the guidance sensors. A common PC is used for processing the information from the sensors and for implementing the high-level guidance algorithms. A microcontroller is used for performing low-level control and high-speed operations which may not be possible using the PC. For controlling the steering of the vehicle, a servo valve was used to control the hydraulic steering. For good control of the steering, feedback of the steering angle is essential. A rotary encoder was available in the laboratory at the start of the project and was used as the steering angle feedback sensor.


The architecture of the vehicle consists of the vehicle itself and the sensors and actuators interfaced with the PC and the microcontroller. The ladar and the vision sensors are interfaced with the PC, whereas the hydraulic valve and the encoder are interfaced with the microcontroller. The PC processes the vision and ladar information and then sends the required vehicle lateral displacement information to the low-level controller. The low-level controller receives the desired displacement information from the PC and, using the control algorithm, computes and sends the necessary signal to the hydraulic valve. It also receives steering angle information from the encoder for the control algorithm.

Figure 3-1. Architecture of the tractor guidance system

Vehicle

For navigating citrus groves, the autonomous vehicle was expected to be operated in harsh outdoor environments most of the time, so the platform should have good off-road qualities.


Uneven paths are encountered in a citrus grove, so strong driving motors are also necessary. The capacity of the vehicle power supply should provide for long hours of operation. The vehicle platform is expected to meet these requirements. Adding the new technology to vehicles currently being produced is expected to be less expensive than the cost incurred to design a totally new vehicle. Moreover, a farmer is more likely to adopt an addition to an existing technology rather than a totally new design. Present day tractors already possess the off-road capabilities, so retrofitting an existing tractor seems to be a viable option. The development of the autonomous vehicle guidance system was started in 2003. At that time, a John Deere 6410 tractor was readily available in the field automation laboratory of the Department of Agricultural and Biological Engineering. It was decided that the tractor already possess the off-road ability required for the autonomous vehicle. Significant cost and time could be saved by starting the development on the tractor, rather than custom designing a new vehicle. A tractor has many advantages in that it is a commercially manufactured, reliable, multi-purpose product technically supported. All the mechanical and hydraulic systems conform to recognized standards. The autonomous guidance system can be retrofitted and the human control interfaces could be modified to include actuators. Therefore, development of the guidance system was started on converting the tractor to operate autonomously. Figure 3-2. John Deere 6410 tractor used for autonomous vehicle guidance development 54


Autonomous Steering

The tractor uses a hydrostatic steering mechanism, illustrated in Figure 3-3.

Figure 3-3. Hydrostatic steering mechanism of the tractor

In this steering mechanism, the human operator turns the steering wheel, which is mechanically coupled to the steering valve. The steering valve can be thought of as a direction control valve. When the steering wheel is turned, the steering valve changes from no flow to flow in one direction or the other. The hydraulic fluid flows through the steering valve and moves the piston in the steering cylinder, which can be thought of as a double-acting cylinder. Depending on the flow of fluid controlled by the steering valve, the piston in the steering cylinder is moved to the left or to the right. The wheels of the tractor are mechanically coupled to the shaft of the steering cylinder on either side, so this motion of the steering cylinder turns the tractor's steered wheels to the right or to the left. For the tractor to operate autonomously, it should be capable of steering by receiving electrical signals from a computer. At the same time, it is important to retain the ability to switch to manual control whenever necessary. Since most of the hydraulic circuits in the tractor are proprietary, the major concern was that the existing system should not be tampered with, so that manual capability is not lost. Therefore a steering mechanism was developed to make the steering automatic. The hydraulic circuit for this conversion is shown


below. In Figure 3-4, the hydrostatic steering mechanism of the tractor is shown on the left. For autonomous steering capability, an electro-hydraulic servo valve was connected in parallel to the existing circuit, as illustrated in the right half of Figure 3-4. The specifications of the servo valve are given in the next section. The servo valve is similar to the steering valve described above; the difference is that the servo valve takes analog voltage signals to control the flow of fluid, instead of the mechanical turning of the steering wheel by the driver. An electrical switch has been connected to the servo valve. The switch, when turned on, makes the servo valve operational. Once the servo valve is turned on, it is ready to accept electrical signals from the computer to control the flow of fluid. The direction of fluid flow to the steering changes the steering to the left or right, as described for the hydrostatic steering. Since the valves are closed-center, when one valve is in operation, flow of fluid through the other valve is blocked. Further, the existing steering circuit includes check valves which prevent reverse flow of fluid through the valve when it is not operational. It should be noted that making the servo valve operational by turning on the switch transfers complete control to autonomous steering; only when the switch is turned off is manual steering made operational again. Therefore the human driver can choose manual or autonomous steering as required. The supply and return fluid lines for the servo valve are tapped from the front-end loader supply system (Figure 3-5). Figure 3-6 shows the retrofitted circuit with the servo valve. As observed, the output from the servo valve is directed to the steering cylinder using a T-joint, along with the existing steering lines.


Figure 3-4. Electro-hydraulic circuit for autonomous steering

Figure 3-5. Rear end of the tractor showing the supply to the servo valve


Figure 3-6. View under the cab showing the servo valve circuit

Servo Valve

The servo valve described in the previous section is an electrically controlled, load-sensing servo valve from Sauer-Danfoss. It is a direction control servo valve with important features such as low weight, built-in pressure relief, an 11 V to 32 V supply voltage range, and a 3-position closed-center configuration. The valve is also manually operable using a lever key.


Table 3-1. Specifications of the electro-hydraulic valve
Supply voltage range (5% ripple): 11 V to 32 V
Current consumption at rated voltage: 0.57 A at 12 V supply
Signal voltage, neutral: 0.5 x supply voltage
Signal voltage, A port <-> B port: 0.25 to 0.75 x supply voltage
Power consumption: 7 W
Hysteresis at rated voltage for one cycle: 4%
Reaction time from neutral to maximum spool travel at constant voltage: 0.2 s
Oil viscosity range: 12-75 mm2/s
Maximum startup viscosity: 2500 mm2/s
Oil temperature range: 30-60 oC


Figure 3-7. Servo valve showing the main parts (Source: Sauer Danfoss PVG 32 valve documentation)


Guidance Sensor: Camera A video camera was chosen as one of the primary guidance sensors. The camera would enable the guidance system to look at the environment in front of the vehicle and segment the path from the background. From this information, the vehicle could navigate the desired path. The video camera selected for this application was the Sony FCB-EX780S block camera that is commonly used in commercial camcorders. Its analog video output is standard NTSC with both composite and s-video signal output, and is classified as high resolution with 470 TV lines. The camera has a vast number of features that can be adjusted, including a 25x optical zoom, focus, white balance, iris, shutter speed, etc. All of the functions can be controlled via RS-232 serial communications, and many functions can be used in manual or fully automatic mode. The camera was purchased without a commercial casing and so the camera was enclosed in an aluminum custom built casing with mounts for mounting on the vehicle. Figure 3-8. Video Camera, one of the primary path finding sensors 61


Figure 3-9. Camera mounted on top of the tractor cab

Figure 3-9 shows the camera mounted on top of the tractor cab. Mounting the camera at a high position gives the vision system a good view of the path to segment. The camera is mounted at an angle of 30 degrees to the horizontal. The field of view of the camera with this mounting configuration ranged from a distance of 5 m on the ground in front of the vehicle to the horizon of the path. The camera always initialized to its automatic settings on startup, which included automatic shutter speed, automatic aperture control and 1x zoom.

Frame Grabber Board

The frame grabber is a PCI version of the FlashBus MV Pro, which was added to the computer managing the overall guidance system. The frame grabber converts the analog NTSC video signal received from the camera described above to a digital 640 x 480 RGB bitmap image. The frame grabber is equipped with a video scaling processor that allows for PCI direct memory access (DMA), which allows the digital image to be placed directly into system memory without the need for any CPU cycles. The frame grabber also has several features that


include RS-232 serial communications, 12 V power, a digital-to-analog converter, and programmable look-up tables for manipulation of the RGB image.

Figure 3-10. Frame grabber board

A custom cable with the appropriate connector was attached to the video camera to interface it with the frame grabber. Software was written to communicate with the frame grabber and obtain images, which were then processed to determine the path for vehicle navigation.

Guidance Sensor: Laser Radar

The laser radar (ladar) system used in this study consisted of a Sick LMS-200 ladar sensor (Sick AG, Waldkirch, Germany). The LMS-200 is a 180-degree one-dimensional sweeping laser which can measure at 1.0/0.5/0.25 degree increments with a 10 mm range resolution and a maximum range of up to 80 m. The LMS-200 can be operated at 38.4 kBaud using the RS-232 protocol, giving a sweep refresh rate of approximately 7 Hz, or at 500 kBaud using the RS-422 protocol, giving a sweep refresh rate of approximately 35 Hz. For operation at 500 kBaud, a special high-speed serial communication card was used. The ladar is powered by a 24 V external power supply.


Figure 3-11. Laser radar and its mount. A) Laser radar. B) Laser mounted on top of the tractor.

The laser radar is mounted on top of the tractor cab just below the camera. Like the camera, it is positioned at 30 degrees to the horizontal. This position allows the laser radar to scan the area in front of the tractor for obstacles. The laser radar was used as the second guidance sensor to determine the path for navigation.

Encoder

For the purpose of feeding back the steered wheel angle to the control system, a rotary encoder was used. A Stegmann HD20 Heavy Duty encoder was selected.

Figure 3-12. Stegmann HD20 encoder

The major features are:
1024 pulses/revolution
8-24 V input/output


0.23 kg weight, 2-inch diameter
Maximum speed 3000 rpm at high load
Output: two square waves in quadrature

Figure 3-13. Output from the encoder shows two waves in quadrature

As shown in Figure 3-13, the direction of rotation can be determined by measuring the phase shift between the two square waves, and the number of pulses counted since the last reading gives the amount of rotation.

Figure 3-14. Encoder mounted on the tractor axle
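A minimal sketch of this quadrature decoding logic is given below. It illustrates the principle only and is not the routine that ran on the microcontroller used in this work; the table assumes channel A leads channel B for one direction of rotation, which may be reversed by the actual wiring.

// Quadrature decoding: combine the previous and current states of the two
// encoder channels (A, B) into a 4-bit index and look up the step direction.
// +1 and -1 are one count in each direction; 0 is no change or an invalid
// (skipped) transition.
class QuadratureDecoder {
public:
    long update(bool a, bool b) {              // call on every sample of A and B
        static const int kStep[16] = { 0, +1, -1,  0,
                                      -1,  0,  0, +1,
                                      +1,  0,  0, -1,
                                       0, -1, +1,  0 };
        const int current = (a ? 2 : 0) | (b ? 1 : 0);
        count_ += kStep[(previous_ << 2) | current];
        previous_ = current;
        return count_;
    }
    // With 1024 pulses per revolution and 4 counts per pulse in full
    // quadrature, one shaft revolution corresponds to 4096 counts.
    double revolutions() const { return static_cast<double>(count_) / 4096.0; }
private:
    int previous_ = 0;
    long count_ = 0;
};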


The encoder was mounted on the shaft of the left wheel using a linkage assembly as shown in the Figure 3-14. When the steering was performed, the encoder assembly rotates and the signal from the encoder can be counted for measuring the steered angle. The encoder along with the linkage was calibrated. The amount of rotation of the encoder shaft is different from the amount of rotation of the wheel because of the linkage assembly connecting the encoder and the wheel. Computer An industrial PC was used for performing high level algorithm processing and managing the overall guidance system. The PC used in this research was the Advantech PCA-6106P3-C, Pentium 4, 2.4GHz processor containing two RS232 serial ports. It was mounted in the cabin next to the drivers seat along with the keyboard and monitor. The operating system used was Windows 2000 Professional. The software used for implementing the algorithms for vision and laser radar is Microsoft Visual C++.Net. Use of this software allows for the creation of GUI for analyzing the algorithm behavior in real time. One of the drawbacks of using windows is that it requires a time of about 15msec between successive runs of the algorithms, for housekeeping. This speed for processing is sufficient for the vision and ladar algorithms. But this is too slow to process the information from higher speed sensors such as rotary encoders. A B Figure 3-15. Instruments mounted in the tractor cabin. A) Computer. B) Computer, monitor and keyboard mounted in the cabin. 66


Microcontroller

Because of the rate of steering turn of the vehicle, a high sampling rate of the encoder was anticipated. Initially, the encoder was interfaced with the computer using a National Instruments data acquisition board. When initial laboratory tests were conducted to measure the output from the encoder, the encoder shaft was manually rotated and the pulses were read. It was found that high sampling rates could not be achieved using this interface setup. It was also anticipated that the operating speed of the Windows operating system might not be fast enough for high-speed control. Therefore, a TERN microcontroller was used for the low-level, high-speed control operations. The microcontroller used in this research was the 586-Engine controller board with a P50 expansion board from TERN Inc. It is a C++ programmable controller board based on a 32-bit 100/133 MHz AMD Elan SC520 microprocessor. Its important features are as follows: 32 PIOs, 7 timers, 19-channel 12-bit ADC, 8-channel 12-bit DAC, 2 quadrature decoders, 8 opto-couplers, 14 high-voltage I/O, serial port and Ethernet. The microcontroller was used for processing encoder data and for controlling the servo valve for steering by sending analog voltages. The microcontroller was mounted in the tractor cabin below the PC in a closed box.

Figure 3-16. Microcontroller


Amplifier Circuit

The servo valve requires a control voltage in the range of Vs/2 - 3 V to Vs/2 + 3 V, where Vs is the supply voltage; if the supply voltage is 12 V, the control voltage is therefore expected in the range of 3 V to 9 V. The off, or closed, position of the valve is at 6 V, which is the middle value. Voltages from 6 V to 9 V correspond to flow of fluid in one direction, and 6 V to 3 V to flow in the opposite direction. The microcontroller analog output has a range of 0-2.5 V, so an amplifier circuit was necessary to amplify and shift the voltage from 0-2.5 V to 3-9 V. The circuit used an LM324 op-amp as the amplifier. The 12 V supply for the circuit is taken from the tractor's battery. The input to the circuit is the 0-2.5 V analog output from the microcontroller, and this 0-2.5 V range is scaled to a 3 to 9 V analog voltage at the output of the circuit; this is the control voltage for the servo valve. It is observed that the tractor battery voltage varies from 12 V to 13 V after starting the tractor or when the throttle is increased. The amplifier circuit takes the same supply power as the hydraulic valve; therefore, whatever the supply voltage from the tractor battery is, the circuit scales the output accordingly.

Figure 3-17. Amplifier circuit
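The scaling amounts to a simple linear mapping. The helper below is a sketch of that arithmetic only; in the actual system the op-amp circuit performs it in hardware, the function name is hypothetical, and the ratiometric form simply restates the voltage ranges given above.

// Map the microcontroller DAC output (0-2.5 V) to the servo valve control
// voltage.  Relative to the valve supply Vs the stated ranges imply
//     Vout = 0.25 * Vs + (Vin / 2.5) * 0.5 * Vs
// which for Vs = 12 V spans 3-9 V (gain 2.4, offset 3 V).
double dacToValveVoltage(double vin, double supply = 12.0) {
    if (vin < 0.0) vin = 0.0;                  // clamp to the DAC output range
    if (vin > 2.5) vin = 2.5;
    return 0.25 * supply + (vin / 2.5) * 0.5 * supply;
}
// Example: dacToValveVoltage(1.25) returns 6.0 V, the valve's neutral (closed) position.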


DGPS Receiver A DGPS receiver was used to measure the vehicle displacement while conducting tests to determine the dynamics of the vehicle. A John Deere Starfire Differential GPS was used. Figure 3-18. DGPS receiver mounted on top of the tractor cab RS-232 serial communication was used to communicate with the DGPS receiver and position information was obtained at a rate of 5Hz. According to the manufacturers specifications, it has real time accuracy better than 1ft. A 24V external battery was used to power the DGPS receiver. Power Supply The PC, the monitor and the laser radar require a supply voltage of 120 V AC and the hydraulic valve and the microcontroller require a 12 V DC. Power from the tractor battery is tapped and given as input to an inverter. The inverter supplies the PC, the monitor and the laser radar. The supply for the microcontroller and the hydraulic valve is taken from the cigarette lighter power source in the tractor cabin. 69


Figure 3-19. Inverter mounted in the tractor cabin

Inertial Measurement Unit

In this research, an Inertial Measurement Unit (IMU) was used to measure the direction of travel of the vehicle. The IMU used was the 3DM tri-axial IMU from MicroStrain, USA.

Figure 3-20. Inertial Measurement Unit used in this research

This IMU is a 3-axis sensor measuring 360 degrees of yaw, 360 degrees of pitch and 140 degrees of roll. The measurement is performed using arrays of magnetometers and accelerometers. The RS-232 serial communication standard was used to obtain the measurement values from the sensor. The IMU requires a supply voltage of 5.3 V to 12 V. Calibration of the IMU was performed before using it in the vehicle; calibration should be repeated when mounting the IMU directly over metal. The IMU was mounted on the computer inside the cabin of the tractor. The IMU was mounted such


that the yaw measurement is zero when the rear of the vehicle points towards geographic north. The sampling rate obtained from the sensor using the guidance program was about 40 Hz.

Speed Sensor

An ultrasonic speed sensor was used to measure the forward speed of the tractor. The sensor used was the Trak-Star from Micro-Trak Systems, Inc., USA. The sensor was mounted under the tractor body such that the ultrasonic waves emitted from the sensor hit the ground. The output from the sensor was a digital voltage calibrated against speed by the manufacturer. The sensor was interfaced using a PIC microcontroller to convert the digital signal and provide an output using the RS-232 serial communication standard. The speed information was obtained over serial communication by the guidance program.

Experimental Approaches and Theoretical Development

Vehicle Dynamics, Calibration and Steering Control of Tractor

Valve calibration

The PVG 32 proportional valve used for steering control is a high-performance servo valve. The amount of opening of the valve, and thereby the wheel turn, depends on the control voltage applied and the time for which it is applied. The effect of phenomena such as hysteresis is compensated by the built-in servo control in the valve, which positions the spool exactly at the commanded position. To understand how the wheels turned with voltage and time, calibration tests were conducted. These calibrations are important for the control system implementation. Tests were conducted to make a look-up table of voltage, time and the corresponding wheel angle. In these tests, the tractor engine was started and the vehicle was kept stationary. A program was written for the microcontroller to send a constant analog voltage to the servo valve. The wheels were allowed to turn 100 degrees and the valve was then turned off, or closed. The time required to turn through this angle was measured. Tests were conducted at various constant


voltages. The voltage versus time plot is shown in Figure 3-21. From the plot, it can be observed that the true closed position of the valve is at 6.8 V rather than exactly at 6 V. This could be due to the variation of the supply voltage from the tractor between 12 V and 13 V, or due to the dynamics of the valve itself. It can also be observed from the plot that for voltages further from the center value, the time required to turn is shorter, as expected. The shortest time is 0.5 s, at 8.5 V and 5.5 V. Voltages beyond these values were not applied, as such high turning speeds are not expected of the tractor during autonomous guidance. For voltages between 6.5 V and 7.2 V, there was no significant turn of the tractor wheels. It is also observed that the right turn has a larger range of voltages compared to the left turn. This could be due to the dynamics of the valve and of the various components of the hydraulic circuit.

Figure 3-21. Voltage and time required to turn 100 degrees

It is to be noted that the flow of hydraulic oil in the lines is dependent on the viscosity of the oil, which is in turn dependent on its temperature. The viscosity of the oil varies from 12 mm2/s to 75 mm2/s as the oil heats up. During the course of the experiments, it was observed that an initial period of about 15 min after starting is required for the oil to reach


a constant viscosity. For experiments conducted in the field, the tractor was driven out to the field, and this time was sufficient to heat the oil; as a result, no additional warm-up time was required before each experiment. Further, the valve dynamics frequency is much higher than the vehicle dynamics frequency, so the effect of the valve dynamics on the vehicle control can be ignored without severely degrading the performance.

Encoder calibration

The next step was the calibration of the encoder and the linkage assembly. As discussed earlier, the encoder outputs 1024 pulses per revolution. The encoder position had to be calibrated against the wheel angle, and this calibration is necessary for the steering control.

Figure 3-22. Encoder calibration

The tractor was positioned in the laboratory and angular positions were marked on the ground as shown in Figure 3-22. From the center position, the steering wheel was rotated to position the wheels at different angles. The number of pulses to reach each angle was recorded using a program written for the microcontroller to decode the pulses from the encoder. The angle range for this calibration was from 0 to 45 degrees. The wheel center was treated as zero


and turns were made to a maximum of 45 degrees on either side of the center. It should be noted that the wheel center was calibrated by trial and error. The maximum angle was 45 degrees, as this was the mechanical turning limit of the wheels. From this calibration, it was determined that a rotation of 1 degree caused the encoder to output 7 pulses.

Vehicle dynamics

The vehicle dynamics has to be determined for proper design of the control system. The vehicle dynamics relates the lateral motion of the vehicle to the steering inputs. Open loop frequency response tests were conducted to explore the vehicle dynamics; the open loop frequency response test is well suited for identifying transfer functions from physical data.

Open loop frequency response. The frequency response is a representation of the system's response to sinusoidal inputs of varying frequency. For the open loop frequency response, an algorithm was implemented in the microcontroller to move the wheel angle as a time-based sinusoid. The computed voltage for the required frequency of turn was sent from the microcontroller to the valve, which turned the wheels at the various instants of time, and the encoder fed back the wheel angle information. This closed loop steering control caused the vehicle to execute a sinusoidal motion. For the tests, the engine was set at a constant speed of 2200 rpm. Tests were conducted in an open field at three different vehicle forward speeds of 4 mph, 7 mph and 10 mph. These speeds are representative of the operating speeds expected while navigating citrus groves. The frequency was varied from 0.07 Hz to 0.3 Hz. The lowest frequency was chosen as 0.07 Hz because at frequencies below this, at the maximum speed, the vehicle veered out of the test area. Above 0.3 Hz, the period of the wheel turn was so short that a very high turning velocity would have been required, which is not suitable for the tractor. These constraints limited the bandwidth over which tests could be conducted. Time, wheel angle and vehicle position were recorded, with the DGPS receiver used to record the vehicle position. There were no tall trees or


buildings near the vehicle in the test area to affect the GPS signals to the receiver. Using the position information collected with the DGPS receiver, the position of the vehicle was plotted in ArcView, a software package commonly used for analyzing GPS data.

Table 3-2. Open loop frequency response tests at various speeds and frequencies
Speed (mph)    Frequency (Hz): 0.07   0.08   0.09   0.1   0.2   0.3
4                                X      X      X     X     X     X
7                                X      X      X     X     X     X
10                               X      X      X     X     X     X

Figure 3-23. Vehicle position plotted in ArcView (0.2 Hz, 10 mph)

The vehicle lateral deviation plot in Figure 3-23 shows the sinusoidal movement of the vehicle. It was observed that the vehicle's lateral response contains a sinusoidal component at the frequency of the steering excitation. ArcView has a tool to measure distances on its plots. Using this tool, the gain, or the change in the position of the vehicle from a positive peak to the successive negative peak, was measured as shown in Figure 3-24.


Figure 3-24. Measuring gain from the position plot

This gain corresponds to the particular frequency at which the tractor was steered. This calculation was performed for all the frequencies and speeds used in the test, and the gains were plotted against the frequencies (Figure 3-25). Figure 3-25 shows the open loop frequency response for 4 mph, 7 mph and 10 mph. As observed, the frequency response was relatively linear. The gain relates the lateral deviation d(s) of the vehicle to the steering angle change a(s).

Figure 3-25. Open loop frequency response for 4 mph, 7 mph and 10 mph


This is represented by the transfer function

G(s) = d(s)/a(s) (Eq. 3-1)

Determining the theoretical model. From the open loop frequency response, it was observed that the vehicle lateral response has the same frequency as the steering excitation. It is also known that when a constant input is given, that is, if the steering angle is kept constant, the lateral response keeps growing rather than settling to a steady value. Such a response to a constant input is characteristic of a double integrator. A double integrator also introduces a 180-degree phase shift in the response; therefore, if the input is a sine function, the output is the sine shifted by 180 degrees, in other words a negative sine function. From the bicycle model discussed in the literature review, it is known that the vehicle dynamics model has two integrator poles. This information also corroborated that the vehicle model could be represented by a double integrator. Hence, the transfer function is determined to be of the form

G(s) = K/s^2 (Eq. 3-2)

The value of the gain K depends on the forward speed of the vehicle. The frequency response of the transfer function G(s) was plotted using different values of K to fit the actual response (Figure 3-26). K was found to be 0.02, 0.07 and 0.13 for 4 mph, 7 mph and 10 mph respectively. It can be concluded that, under identical conditions, an increase in velocity can be represented by an increase in the gain K. This model is similar to the bicycle model, except that the bicycle model has two additional pole-zero pairs.
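As a quick check of what this model implies, the short C++ sketch below integrates y'' = K a(t) for a sinusoidal steering input using Euler steps. It is only an illustrative simulation of Eq. 3-2 with the gain identified at 10 mph; the steering amplitude and step size are assumed values, and it is not the Simulink model described next.

#include <cmath>
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    const double K  = 0.13;                    // gain identified at 10 mph
    const double f  = 0.2;                     // steering excitation frequency (Hz)
    const double A  = 10.0 * kPi / 180.0;      // 10-degree steering amplitude (assumed)
    const double dt = 0.01;                    // integration step (s)

    double y = 0.0, yDot = 0.0;                // lateral response and its rate
    for (double t = 0.0; t <= 20.0; t += dt) {
        const double a     = A * std::sin(2.0 * kPi * f * t);   // steering input
        const double yDdot = K * a;                             // Eq. 3-2 in the time domain
        yDot += yDdot * dt;
        y    += yDot * dt;
    }
    std::printf("lateral response after 20 s: %.3f\n", y);
    return 0;
}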


Figure 3-26. Theoretical model for the three speeds

To test the validity of the proposed theoretical model, the model was created in Simulink and simulated. The model is shown in Figure 3-27 and consists of two integrator blocks connected to a gain; the input is a signal generator and the output is a scope. Sinusoidal wheel angle inputs were given at various frequencies. The experimental results from the open loop frequency response tests were found to agree with the simulated results, as shown in Figure 3-28.

Figure 3-27. Simulink model for the vehicle dynamics


Figure 3-28. Sinusoidal response of the theoretical model and the vehicle (10 mph, 0.2 Hz)

Control system

Based on the open loop frequency response tests described previously, a control system was designed to control the steering angle. A PID control system has been found to work well for vehicle guidance in the literature, and it is also a simple and reliable method. Hence a PID control system was chosen for implementation (Figure 3-29).

Figure 3-29. Guidance control block diagram


The steering control system is developed to interface between the PC and the servo valve and would be implemented in the microcontroller. The input to the PID controller would be the lateral displacement error of the vehicle from the center of the path to be navigated. The output of the PID controller would be the desired steering angle to reduce this error. This desired angle is input as analog voltage to the servo valve. The actual steering angle is fed back to the PID controller using the rotary encoder. This forms a closed loop control for steering. The lateral error of the vehicle from the path center is fed back using the primary guidance sensors namely vision and ladar. This feedback of the error forms the overall closed loop of the guidance system. The control system was first designed and simulated in Simulink. Simulations of the control system can give good estimates of the controller gains, which can then be further tuned in the actual vehicle. The Simulink model is shown in the Figure 3-30. A rate limiter was added to simulate the fact that the wheel turning is not instantaneous. A saturation block was added to simulate the maximum voltage that can be given as input to the valve. Figure 3-30. Simulink model of the vehicle control system 80


The Ziegler-Nichols method of tuning for closed loop systems was adopted for the initial tuning of the PID controller. In this tuning process, the system was given a small proportional gain, with the integral and derivative gains removed. The proportional gain was then increased until sustained oscillations were obtained, as shown in Figure 3-31.

Figure 3-31. Simulated sustained oscillation with a gain of 1.8 for the 10 mph model

The gain Gu and the period of oscillation Tu were recorded. The required proportional gain Kp and derivative gain Kd were calculated using the Ziegler-Nichols formulas:

Kp = 0.6 Gu (Eq. 3-3)

Kd = Tu/8 (Eq. 3-4)
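These two relations are simple enough to evaluate directly. The sketch below applies them to the simulated 10 mph case; Gu = 1.8 comes from Figure 3-31, while Tu = 10 s is inferred from the Kd = 1.25 reported in Figure 3-32 and should be treated as an assumption.

    # Sketch: initial PD gains from the sustained-oscillation test (Eq. 3-3 and Eq. 3-4).
    def ziegler_nichols_pd(Gu, Tu):
        """Gu: gain producing sustained oscillation; Tu: oscillation period in seconds."""
        return 0.6 * Gu, Tu / 8.0

    Kp, Kd = ziegler_nichols_pd(1.8, 10.0)   # 10 mph model; Tu assumed from Kd = 1.25
    print(Kp, Kd)                            # 1.08, 1.25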


Figure 3-32. Simulated step response with the initial tuning parameters (10 mph, Kp = 1.08, Kd = 1.25)

The parameters were further tuned and an integral gain was added to remove the steady state error. The controller was tuned for zero steady state error and low overshoot. A large settling time, of the order of 20 sec, was acceptable for the vehicle. This is because, if the controller were tuned for a small settling time, the steering became very sensitive to very small errors and caused the wheels to chatter. The optimum PID gains were found to be:

Proportional gain, Kp = 1
Derivative gain, Kd = 4
Integral gain, Ki = 0.09

When the control system was implemented in the tractor, the same parameters determined in the simulation were used to control the vehicle and no significant further tuning was required.
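The closed loop of Figure 3-30 can also be approximated with a short discrete-time simulation, as sketched below. The PID gains and the 10 mph plant gain come from the text; the saturation and rate limits, integration step, and the 1 m step command are illustrative assumptions, not measured values.

    # Sketch: PID controller driving the K/s^2 vehicle model, with saturation and a
    # rate limiter on the commanded steering angle (assumed limits).
    K, dt = 0.13, 0.01                    # 10 mph plant gain; 10 ms integration step
    Kp, Ki, Kd = 1.0, 0.09, 4.0           # final tuned gains from the text
    cmd_max, rate_max = 0.5, 0.02         # assumed saturation and per-step rate limits

    target = 1.0                          # 1 m lateral step command (illustrative)
    disp = vel = integ = angle = 0.0
    prev_err = target
    for _ in range(int(60.0 / dt)):       # 60 s of simulated travel
        err = target - disp
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        cmd = Kp * err + Ki * integ + Kd * deriv
        cmd = max(-cmd_max, min(cmd_max, cmd))               # saturation block
        angle += max(-rate_max, min(rate_max, cmd - angle))  # rate limiter
        vel += K * angle * dt             # plant: lateral acceleration = K * angle
        disp += vel * dt
    print(round(disp, 3))                 # settles close to 1.0, with a long settling time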


Figure 3-33. Simulated step response with the final tuned parameters (10 mph)

Table 3-3. Performance criteria of the controller (simulated results)
Model    Rise time (sec)   Settling time (sec)   Delay time (sec)   Overshoot (cm)   Steady state error
4 mph         15                  10                    8                 1                 0
7 mph         12                  15                    6                 1                 0
10 mph        12                  20                    5                 2                 0

Vision and Ladar Independent Steering

The primary purpose of this project was to guide the vehicle through the alleyways of a citrus grove. However, the performance of the vehicle guidance needed to be tested well before testing in the grove. In addition, the groves are not located in proximity to the laboratory, which makes towing the tractor a long-distance exercise; if a modification to the equipment or algorithm were required, a lot of time would be wasted in towing the tractor back and forth. As a result, there was a need for the test area to be in the vicinity of the lab.


It was decided that the tests could be carried out in an open area on the university campus. Hay bales were used to form an artificial boundary of a path and tests were conducted there. Hay bales were selected because it was expected that the tractor might not navigate very well during the initial stages and might run over any artificial boundary. The hay bales provided a compliant, yet three-dimensional boundary that could be sensed by the vision and ladar sensors. Computer vision algorithms had to be developed to segment the path from the surroundings. Two algorithms had to be developed, one for the citrus grove environment and one for testing with hay bales. As discussed earlier, the camera was mounted on top of the tractor at an angle of 45 degrees with the horizontal. The camera was pointed just in front of the tractor hood. This aided in looking right in front of the tractor. The height of the camera gave a good field of view for the vision algorithms. The mount of the camera and ladar is shown in Figure 3-34.

Figure 3-34. Camera and ladar mounted on top of the tractor cabin


Vision for path with hay bales

Typical images as seen with the video camera mounted on the tractor are shown below. The images show hay bales placed in parallel rows separated by a distance of about 3 m. These two hay bale rows are representative of tree rows in the grove alleyway. The path is the region between the hay bale rows.

Figure 3-35. Typical images (A, B, C) seen while navigating through hay bales

The images were taken with the tractor standing in the middle of the path. The vision algorithm should be capable of segmenting hay bales as boundaries and the region between the hay bale rows as the path. It is observed that the ground could have grass in some places, but not in all regions of the path. Also, the hay bales cast a shadow, as does the tractor cabin. These shadows pose a challenge to developing the vision algorithms.


The image size obtained from the video camera is 640 pixels by 480 pixels. The image was not resized. It is an RGB image with three values for each pixel, corresponding to red, green and blue. The image is mapped with the bottom left corner as the reference; this reference is based on the Windows bitmap standard adopted in the software. Figure 3-36 shows the image coordinates used in the algorithm.

Figure 3-36. Image coordinates (corners at (0,0), (640,0), (0,480) and (640,480))

The vision algorithm is required to work effectively in real time. The required operating speed of the algorithm depends on the speed of the vehicle. For this project, only one PC with a 2.4 GHz processor was used to process all the high level algorithms. It was also anticipated that ladar based segmentation would be required to run in parallel with vision. In the future, additional algorithms for sensor fusion, headland turning, path planning and obstacle detection were also anticipated. Therefore, to run all these algorithms in parallel on a single processor and to have a real time vision algorithm, color based segmentation was selected as the segmentation method, as it has been found to work well in other studies for navigating tractors through crop rows. Color based segmentation is also relatively fast compared to more complex methods such as texture based classification.

Thresholding. Thresholding was the first process in the vision algorithm. No preprocessing such as filtering was applied to the base image. This was because preprocessing increased the processing time without offering any significant improvement in segmentation.


Several images of the path in the hay bale track were captured. Then the red (R), green (G), and blue (B) intensity plots of different regions in the image were plotted. The plots are shown in Figure 3-37 and Figure 3-38, corresponding to hay bales and the path in between.

Figure 3-37. R, G, B color plots of hay bales (X axis: intensity, range 0-255; Y axis: number of pixels)

Figure 3-38. R, G, B color plots of grass path (X axis: intensity, range 0-255; Y axis: number of pixels)

From the intensity plots, it is observed that segmentation based on color is feasible, as the hay bales and path are clearly separable in the three color channels. These plots also give an estimate of the threshold value to be chosen. From the plots, it is observed that the path has color intensities below the threshold whereas the hay bales have color intensities above the threshold value. Color plots of several images were observed and the threshold values for red, green and blue were chosen by comparing pixel values of hay bales and ground. Once the threshold values were chosen, all the pixels in an image were classified. The classification is performed such that a pixel with R, G, B intensities corresponding to hay bale, that is, intensities higher than the threshold, is changed to white, whereas a pixel with path color intensities, that is, intensities below the threshold, is changed to black. The result of thresholding is a binary image. The result of thresholding a sample image and converting it to binary is shown below.


Figure 3-39. Thresholded image

Cleaning or filtering the image. As observed, the thresholded image contains considerable noise: many isolated pixels on the path are classified as hay bale because of the higher intensity of some regions without grass. This isolated pixel noise needs to be cleaned. For this, a 20 pixel by 20 pixel window is created and moved through the image. In each window, the number of white pixels is counted. If more than 75% of the pixels falling inside the window are classified as white, then the window is determined to fall on a hay bale and the remaining black pixels in that window are changed to white. If less than 75% of the pixels falling in the window are white, then the pixels in the window are determined to be path and all the white pixels are changed to black. This process is performed for the entire image. A cleaned image is shown below. The cleaning process has removed the noise and formed clearly defined hay bale and path regions.

Figure 3-40. Cleaned image
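The thresholding and window-based cleaning steps translate almost directly into array operations. The sketch below assumes a NumPy RGB image; the threshold triple is a placeholder, since the text selects the actual values from the color plots.

    # Sketch: per-channel RGB thresholding to a binary mask, then the 20x20-window
    # majority cleanup with the 75% rule described above.
    import numpy as np

    def threshold_rgb(img, thr=(150, 150, 120)):
        """img: HxWx3 uint8 array. Returns a binary mask (True = hay bale). Thresholds are placeholders."""
        return (img[..., 0] > thr[0]) & (img[..., 1] > thr[1]) & (img[..., 2] > thr[2])

    def clean_mask(mask, win=20, frac=0.75):
        """Majority-vote each win x win block: mostly white -> all white, else all black."""
        out = mask.copy()
        h, w = mask.shape
        for r in range(0, h, win):
            for c in range(0, w, win):
                block = mask[r:r + win, c:c + win]
                out[r:r + win, c:c + win] = block.mean() >= frac
        return out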


Isolating the left and right boundaries. As observed in the previous step, a part of the path region is still classified as hay bales because of its color. To remove these incorrectly classified regions, each row in the image is scanned from left to right, as well as right to left. The width of the hay bale (white) region in each row is calculated for both the left and right boundaries. The mean width of a hay bale is chosen a priori. Based on this information, any hay bale region that is much wider than the mean width is shrunk to the mean width, as shown in Figure 3-41.

Figure 3-41. Boundary isolated image

Line fitting. As observed in the boundary isolated image in Figure 3-41, the boundaries are jagged. To clearly define the path, lines are fit to indicate the path boundaries. For this, each row of the image is scanned and the interface between the path and hay bales is identified for each row. This is determined by a transition from white to black or vice versa. The transition point of each row is stored in an array. Then a line is fit using the least squares method. The equation of a straight line is given by

y = ax + b (Eq. 3-5)


a = (n Σxy - Σx Σy) / (n Σx^2 - (Σx)^2) (Eq. 3-6)

b = (Σy Σx^2 - Σx Σxy) / (n Σx^2 - (Σx)^2) (Eq. 3-7)

where (x, y) are the pixel coordinates of the transition points, a and b are the line constants, n is the number of points for which the line is fit, and Σ denotes summation over those points.

Figure 3-42. Lines fit to the boundaries

The original image after processing and line fitting is shown above. Note that the line fit for the left boundary has a shift to the left of the true boundary due to the presence of shadows on the hay bale. The flowchart of the developed vision algorithm is shown below.

Path Center. Once the lines are fit for both the left and right boundaries, the next step is to determine the path center. The region between the lines is determined as the path from the previous steps. The entire image is scanned row by row. At a row n, the positions of the left and right boundaries are known from the line fitting as Xleft[n] and Xright[n]. The center point of the path in that row, Xcenter[n], is determined as the mid point between the line locations in that row as


Xcenter[n] = (Xright[n] + Xleft[n]) / 2 (Eq. 3-8)

This process is repeated for each row and all the center points are stored in an array. The average of the center points is chosen as the current center of the path in the given field of view as

Xcurrentcenter = (Σ(i=1 to n) Xcenter[i]) / n (Eq. 3-9)

The current position of the vehicle center is chosen as the mid point of the image at column 240, i.e. half of the image width (480). The error of the vehicle from the path center is determined as

Error = Xcurrentcenter - 240 (Eq. 3-10)

This error in pixels is calibrated against real distances using a prior calibration of the field of view. This transformation is explained in a subsequent section. Once the error is determined, it is passed to the control system in the microcontroller using RS-232 serial communication. The flowchart of the entire vision algorithm is shown below.

Algorithm flowchart


Figure 3-43. Machine vision algorithm flowchart (input image; threshold; clean the image by statistically cleaning regions; isolate the left boundary if it exists; isolate the right boundary if it exists; fit lines to the left and right boundaries; calculate path center, present position and error; send error to the microcontroller via serial communication)

Vision for citrus grove

Once the vision algorithm was developed for navigating the test track of hay bales, the next step was to adapt the vision algorithm for navigating the citrus grove alleyway. The vision algorithm for citrus grove navigation follows the same procedure as for the hay bale path. However, a special procedure has to be followed for thresholding effectively. Typical images of the alleyway seen by the camera in the grove are shown below:


Figure 3-44. Typical images for navigation in citrus grove

It was expected that during the course of the day, as the sun's position changes, the amount of shadow on the trees and the path varies. The color saturation of the trees and the path also varies to an extent. It was felt that an analysis of this variation was necessary to ensure proper design of the algorithm. In the months of July and October of 2004, trips were made to the grove for three consecutive days. Two cameras were mounted on a tripod on the hood and the top of a truck to simulate the camera being mounted on the tractor. Two cameras were used to study the effect of the height of the camera on the image. Pictures were taken from 9 am to 6 pm at half-hour intervals. These pictures helped analyze the intensity variation in the images over an entire day. The effects of clouds were also included. These images helped rule out concerns about varying light conditions.


Figure 3-45. Picture collection in the grove

As observed with the sample images shown earlier, in some cases the shadow tends to cover a large portion of the path. This type of shadowing occurred mainly in the images taken in the early morning until about 10 am and in the evening after 4 pm. During the middle of the day, the brightness in the image was evenly spread without much shadow. This variation can be attributed to the motion of the sun through the day at various angles to the horizon. Another variation of the brightness or shadow occurs between summer and winter because of the change in the sun's declination to the earth. A further cause of variation is the direction in which the rows are planted, i.e. north-south, east-west or any other direction. Among all these variations, the common factor is that the brightness can change during the day and the shadows may or may not be present.

Segmentation. The field of view of the camera in the grove contains the grove alleyway. This scene consists of trees and path or ground. This can be seen in the images shown before. As with


the hay bale track, the R, G, and B intensities of trees and ground can be plotted, as shown in Figure 3-46. This plot was created using images taken from the database corresponding to both trees and ground. It is observed that for a threshold value of 100, trees and ground can be segmented well based on color.

Figure 3-46. R, G, B intensity plot for ground and trees in grove alleyway

However, to accommodate the changing shadow or brightness scenario, the thresholding algorithm has to be modified. This method uses the prior information that an image with a lot of shadows tends to have dark trees on one side and bright trees on the other. In Figure 3-47, the two extreme images are shown, where one image corresponds to uniform brightness and the other corresponds to an image with shadows. In the images with shadows, the trees on one side are darker than the trees on the other. That is, they have very low intensities on one side of the trees.


Figure 3-47. Images and their green intensity profile across the center of the image. A) Non-shadow image. B) Shadow image. C) Profile of A. D) Profile of B.

In Figure 3-47, the plots show the green channel intensity profiles of the non-shadow and shadow images, respectively. In non-shadow images, there is low intensity on the left side corresponding to the trees on the left, high intensities in the middle corresponding to the path, and again low intensities on the right corresponding to the trees. For the shadow image, there is high intensity on the left corresponding to one side of the trees and half the path, and low intensities on the right corresponding to half the path and the right-side trees. It should also be noted that in the shadow images the low intensities are mostly below 50, and in non-shadow images the low intensities are mostly below 100. In this method, it is initially assumed that there are many shadows in the image. A threshold value corresponding to this assumption, i.e. of the order of 50, is used and the image is thresholded.


Figure 3-48. Thresholding. A) Shadow image. B) Non-shadow image. C) Thresholded shadow image. D) Thresholded non-shadow image.

Figure 3-48 shows how a shadow image and a non-shadow image are segmented by using a low threshold. After thresholding, the number of pixels classified and the locations of the classified pixels in the image are calculated. If the assumption that it is a shadow image was true, a lot of pixels are classified as trees in one half of the image, corresponding to the shadow image profile. In this way, trees on one side have been found. Then the algorithm proceeds as with the hay bales, but finds trees only on one side. The vision algorithm then tries to keep the vehicle at a fixed distance of half the aisle width from the side that has been identified as trees. If the assumption was not true, then not many pixels are classified as trees on one side. It is then concluded that there is not much shadow in the image and a new threshold value, higher than the previous value, is used. Now the trees on the two sides are found, leaving out the path (Figure 3-49). The algorithm follows the rest of the procedure as with the hay bales. The algorithm finds the path center between the trees on either side.
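A compact way to express this two-stage logic is sketched below. The 50 and 100 threshold values come from the intensity profiles discussed above; the fill fraction used to accept the shadow hypothesis is an assumed placeholder.

    # Sketch: two-stage thresholding. Assume shadows first (low threshold); if one half
    # of the image fills with "tree" pixels the shadow hypothesis holds, otherwise
    # re-threshold with the higher non-shadow value.
    import numpy as np

    def segment_trees(green, low=50, high=100, fill_frac=0.3):
        """green: HxW green-channel array. Returns (tree mask, shadow_case flag)."""
        dark = green < low                       # candidate shadowed trees
        h, w = dark.shape
        left_fill = dark[:, : w // 2].mean()
        right_fill = dark[:, w // 2 :].mean()
        if max(left_fill, right_fill) > fill_frac:
            return dark, True                    # shadow image: trees found on one side only
        return green < high, False               # uniform lighting: trees on both sides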


Figure 3-49. Classified images without and with shadows

In Figure 3-49, the left image is a segmented non-shadow image and the image on the right is a segmented shadow image. Based on the positioning of the camera, it is observed that the lower half of the image contains information corresponding to a distance of 10 ft to 20 ft in front of the vehicle. It was decided that the useful information for keeping the vehicle in the center of the path is in this lower half of the image. Hence processing is restricted to the lower half of the image. This reduces the processing time considerably.

Error calculation. The camera is mounted on the center of the vehicle's chassis. The current lateral position of the vehicle corresponds to the image center. That is, the image coordinate of the current position is image_width/2 = 320.


Desired position = (Right side tree boundary + Left side tree boundary) / 2, for non-shadow images (Eq. 3-11)

or

Desired position = Right side tree boundary - Fixed distance, for images with shadow (Eq. 3-12)

Error = Desired position - Current position (measured in pixels) (Eq. 3-13)

Pixel distance to world distance conversion. It is necessary to express the path errors in real world distances by converting pixel distances to real world distances. The camera is positioned at a fixed angle, and so the look-ahead distance is fixed. A meter ruler was placed in front of the tractor in the camera's field of view. From the pixel length of the ruler seen in the image, the pixel to world conversion was determined as

100 cm = 177 pixels

However, an object in the lower half of the image looks bigger than if it were in the upper half of the image. A correction factor of up to 10 cm is necessary depending on where the distance is measured in the image. This variation was taken into account in the algorithm.

Error in position = Error in pixels x 100/177 (cm) (Eq. 3-14)

The real distance error information is then passed to the control system implemented in the microcontroller. The control system uses this error to steer the vehicle to the center of the path. The flowchart of the overall vision algorithm is given in Figure 3-50. The operating speed of the vision algorithm, that is, the process of getting an image and processing it to obtain the path navigation error, is about 15 Hz to 20 Hz.
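The error calculation and the pixel-to-world conversion (Eq. 3-11 to Eq. 3-14) reduce to a few lines, sketched below. The 320-pixel image centre and the 100 cm = 177 pixels calibration come from the text; the fixed offset used for shadow images is a placeholder standing in for half the aisle width, and the 10 cm correction factor is omitted for brevity.

    # Sketch: lateral error in cm from the detected tree boundaries (pixel columns).
    def lateral_error_cm(left_px, right_px, shadow=False, fixed_offset_px=265,
                         centre_px=320, px_per_m=177.0):
        if shadow:                                   # only the right boundary is trusted
            desired = right_px - fixed_offset_px     # Eq. 3-12 (offset is a placeholder)
        else:
            desired = (right_px + left_px) / 2.0     # Eq. 3-11, midpoint of the boundaries
        error_px = desired - centre_px               # Eq. 3-13
        return error_px * 100.0 / px_per_m           # Eq. 3-14, pixels -> cm

    print(lateral_error_cm(150, 510))                # example boundaries at columns 150 and 510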


Figure 3-50. Machine vision algorithm flowchart

Required heading of the vehicle using vision. The vision algorithm uses color based segmentation of trees and ground followed by morphological operations to clean the segmented


image. The path center is determined as the center between the tree boundaries. The error is calculated as the difference between the center of the image and the path center identified in the image. The error is converted to real world distance using the prior pixel to distance calibration. To calculate the required heading of the vehicle, a line is fit to the path center in the image representing the entire alleyway. The angle between this line and the image center line is determined as the required heading for the vehicle to navigate the alleyway with low lateral error.

Figure 3-51. Calculating the required heading of the vehicle

Figure 3-51 shows the result of vision based segmentation. The solid line (blue) shows the true center of the path and the dashed line (red) shows the current center of the vehicle. The angle between the true path center line and the vehicle center line is the required heading of the vehicle in the path for long term accurate navigation.

Laser radar (Ladar) for path detection

The ladar is a centimeter level accurate planar distance measuring sensor. It is well suited for navigation problems in which the vehicle must navigate through narrow passages. The path


width in an orange grove is not big enough for the tractor to avoid an obstacle by going around it. However, the ladar can be used for navigating through the pathway.

Data obtained from ladar. The ladar used in this study was a SICK LMS 200, which scans horizontally through an angle of 180 degrees or 100 degrees, depending on the mode selected, giving a distance value corresponding to each angle. The ladar was operated at 38,400 baud using RS-232 serial communication for guiding the tractor. The ladar can be operated in several modes, e.g. 180 degrees at 1 degree or 0.5 degree increments, and 100 degrees at 1 degree or 0.5 degree increments. For navigating the test path, a scan of 100 degrees at 0.5 degree increments was used. This was because the path and the boundaries could be detected within a 100 degree range and 0.5 degree increments give good accuracy. The ladar transmitted data at a rate of 7 Hz using RS-232.

Figure 3-52. Radial distance plotted against angle (180 degrees at 0.5 degree increments), showing the window and the hay bales on either side


For the explanation of the ladar based path finding, consider the ladar mounted on the tractor and scanning the hay bale track in front of the tractor from 0 to 180 degrees at 0.5 degree increments. Figure 3-52 shows the radial distance as a function of the angle obtained from the ladar when scanning the test path with hay bale boundaries. The two major discontinuities or depressions in the plot correspond to the radial positions of the hay bales with respect to the ladar.

Error calculation. To determine the exact location of the hay bales, a data window of length 4 distance measurements is moved across the distance data from 0 to 180 degrees. When all four positions in the window encounter a drop below a threshold value, it is decided that there is a hay bale corresponding to the left boundary. After finding one hay bale, the window is moved until it finds another hay bale, which corresponds to the right side boundary. The sliding window also eliminates any point noise. Since the ladar is placed at the center of the tractor, the distance of the tractor center from the hay bales can be calculated as

Distance = Radial distance at the hay bale x cosine(angle at that point) (Eq. 3-15)

Using the above formula, the error of the tractor position in the path is calculated as

Error = (Distance from right boundary - Distance from left boundary) / 2, in cm (Eq. 3-16)

As with the vision algorithm, the error information determined above is sent to the control system implemented in the microcontroller using RS-232. Using this error, the steering control system tries to get the tractor back to the center of the path. The flowchart of the ladar based guidance algorithm is shown in Figure 3-53. The operating speed of the process of collecting the ladar data and processing it to find the path navigation error is about 7 Hz. This speed is limited by the RS-232 serial communication standard.
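The sliding-window boundary search and the error computation of Eq. 3-15 and Eq. 3-16 can be sketched as follows; the drop threshold and the amount by which the search skips past a detected bale are assumed values.

    # Sketch: find the two hay-bale boundaries in one ladar scan and return the lateral error.
    import math

    def ladar_error_cm(radial_cm, start_deg=0.0, step_deg=0.5, drop_thresh=250.0):
        hits = []
        i = 0
        while i <= len(radial_cm) - 4 and len(hits) < 2:
            window = radial_cm[i:i + 4]
            if all(r < drop_thresh for r in window):       # all four readings on a hay bale
                angle = math.radians(start_deg + i * step_deg)
                hits.append(abs(radial_cm[i] * math.cos(angle)))   # Eq. 3-15
                i += 40                                    # assumed skip (~20 deg) past this bale
            else:
                i += 1
        if len(hits) < 2:
            return None                                    # one of the boundaries was not found
        left, right = hits[0], hits[1]
        return (right - left) / 2.0                        # Eq. 3-16, lateral error in cm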


Figure 3-53. Ladar algorithm flowchart (get scan data; move window and look for discontinuity; find hay bale on one side; continue moving window and look for discontinuity; find hay bale on other side; find path center as mid point between hay bales; path error = path center - current position)

Serial communication between PC and microcontroller

RS-232 serial communication was used to send error information from the PC to the microcontroller based on the calculations from the vision and ladar algorithms. Communication was


established at a 115,200 baud rate. A protocol was required for communication. The microcontroller had the capability to read one byte at a time. A simple protocol of sending a predefined character followed by 4 bytes of information was designed.

Figure 3-54. Serial protocol (character, sign, highest byte, middle byte, lowest byte)

The character byte indicated the start of new data. The sign byte indicated the sign of the error, i.e. a negative sign indicated a left turn and a positive sign indicated a right turn. Using a resolution of 1 cm, three bytes were sufficient to send error information up to 1000 cm or 10 m, making the range -10 m to +10 m. A larger path error is not expected to occur in the known environments.

Operating speed of the guidance system

The operation of the different sensors and of the vision based and ladar based guidance systems is discussed above. It can be observed that all of these operate at different speeds, which raises the question of the operating speed of the overall guidance system. The software for the guidance system was developed such that each sensor operates independently of the others. The vision process updates the path error at a rate of about 15 Hz. The steering voltage to the servo valve is updated at a rate of about 50 Hz from the microcontroller. However, the steering update is filtered and smoothed to prevent chattering of the hydraulic valve. The slowest sensor process is vision for vision based guidance and ladar for ladar based guidance. These two primary sensors and their accompanying algorithms limit the path update speed of the overall guidance system. Therefore, it is not a single speed that governs the entire guidance system.
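For illustration, the five-field packet of Figure 3-54 could be assembled as in the sketch below. The start character and the ASCII-digit encoding of the three magnitude bytes are assumptions; the text only fixes the field order, the 1 cm resolution and the +/-10 m range.

    # Sketch: pack a path error (cm) into the character/sign/3-byte format of Figure 3-54.
    def build_error_packet(error_cm, start=b"E"):          # start character is an assumption
        sign = b"-" if error_cm < 0 else b"+"
        mag = min(abs(int(error_cm)), 999)                 # 1 cm resolution, up to +/-10 m
        digits = f"{mag:03d}"                              # highest, middle, lowest byte
        return start + sign + digits.encode("ascii")

    print(build_error_packet(-57))                         # b'E-057'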


Sensor Fusion Based Autonomous Steering

Performance of independent sensor based autonomous steering

Machine vision based autonomous steering. The development of an autonomous vehicle guidance system for citrus grove navigation using machine vision based guidance was discussed earlier. In custom designed test tracks, machine vision guided the tractor through the path with a maximum error of 6.1 cm from the path center when traveling at a speed of 4.4 m/s. From the tests conducted in a test path, it was found that the vision based guidance was able to keep the vehicle in the middle of the path for several types of paths, such as straight and curved paths. The machine vision based guidance system also guided the tractor through an alleyway of a citrus grove, keeping the vehicle, visually, in the center of the path with no collision with the trees. In the grove, the uneven tree canopy in citrus grove alleyways poses problems for guidance when using vision alone. With the vision based guidance configuration used in this research, for example with a forward facing camera, branches closer to the vehicle than the camera's field of view are not accounted for in the vision based guidance. Also, navigating locations near the end of the row is not feasible with the vision system configuration, as the last trees go out of the field of view of the camera.

Ladar based autonomous steering. In custom designed test tracks, ladar also guided the tractor through the path with a maximum error of 6.1 cm from the path center when traveling at a speed of 4.4 m/s. From the tests conducted in a test path, it was found that ladar based guidance was more accurate than vision based guidance in well defined paths at speeds up to 3.1 m/s. In citrus grove conditions, the ladar mount configuration used in this research, with a wider field of view and close range, is useful for guiding the vehicle with higher accuracy than vision. However, in a given alleyway it is not uncommon to find a few trees absent. In such cases, ladar based guidance gives erroneous paths. Also, the short range accurate navigation does not account


for longer range changes in the path and sometimes causes jerky motion of the steering. The ladar based steering algorithm developed in the test track is not directly adaptable to the grove environment. While the path boundaries in the test track, formed by hay bales, were solid objects, the tree canopy in the grove is not. This generates enough noise in the ladar data to render the algorithm ineffective. Therefore, filtering of this noise is required when operating in groves.

Need for sensor fusion based autonomous steering

The machine vision based autonomous steering method is easily adaptable from a custom designed test track to the citrus grove environment. Its long range aids in faster decision making before a hard turn is to be performed or for long distance detection, and it is quite robust to minor variations in the path boundaries. However, it lacks the centimeter level accuracy of laser radar in detecting the distance to close range path boundaries or obstacles. The ladar's accuracy is useful in keeping the vehicle accurately positioned in the path. However, this accuracy makes the information noisy if there are minor variations in the path boundaries. Fusing the complementary information from these sensors, namely the long range information from vision and the short range accuracy of the ladar, would be beneficial. Several methods for fusion have been reported in the literature. In our research, the tree canopy causes the sensor measurements to be fairly noisy. So the choice of method was narrowed down to one that could help in reducing the noise and also aid in fusing the information from the sensors. Popular methods such as neural networks, behavior based methods, fuzzy logic and voting are not primarily used for reducing noise. Kalman filtering is a widely used method for eliminating noisy measurements from sensor data and also for sensor fusion. Based on the good results obtained in previous research for fusion and noise reduction, a Kalman filter was selected as the method to perform the fusion and filter the noise in the sensor measurements. During the development of the ladar based guidance system, the speed of


the ladar data update was about 7 Hz and was limited by RS-232 serial communication. At the time of development of this sensor fusion based approach, a high speed serial communication card was available and so the operating speed was about 30 Hz.

Kalman filter for autonomous guidance

Kalman filtering is a recursive solution to the data filtering problem. In other words, it is a set of equations that provides a recursive method to estimate the state of a process such that the mean squared error is minimized. There are several forms of the Kalman filter implementation algorithms. In this research, one simple discrete method of implementation was used. In this form, the Kalman filter estimates a state at a given time and then obtains measurements as feedback to improve future estimations. This could also be called a prediction followed by correction type of method. The equations of the filter performing the prediction are called the time update equations and the equations of the filter performing the correction are called the measurement update equations (Figure 3-55).

Figure 3-55. Illustration of Kalman filter operation (initialize, then alternate between the time update (predict) and the measurement update)

The time update equations are

x(k+1) = A x(k) + w(k) (Eq. 3-17)

P(k+1) = A P(k) A^T + Q(k) (Eq. 3-18)

where x is the state vector, k is the sample number, A is the state transition matrix relating the state at sample k and k+1, w is the noise vector of the process, Q is the covariance matrix of


the process noise and P is the state estimate error covariance matrix. P gives a measure of the estimated accuracy of the state estimate x(k+1). The first equation of the time update estimates the state one sample ahead and the second equation projects the error covariance one sample ahead. Once the time update equations are executed and the states have been predicted for the next sample k+1 using the above equations at sample k, the system performs its operations and reaches the next sample k+1. At sample k+1, measurements are obtained. The measurement can be related to the state using the equation

z(k) = H x(k) + u(k) (Eq. 3-19)

where z is the vector of measurements, H is the observation matrix and u is the vector of measurement noise. This equation is not directly used in implementing the filter, but helps in understanding the operation of the filter. The measurement update equations used in the filter are

G(k) = P(k+1) H^T (H P(k) H^T + R)^-1 (Eq. 3-20)

x(k+1) = x(k) + G(k) (z(k) - H x(k)) (Eq. 3-21)

P(k+1) = (I - G(k) H) P(k+1) (Eq. 3-22)

where G is the filter gain and R is the covariance matrix of the measurement noise. The first step in the measurement update is to compute the filter gain. The filter gain is used to make an a posteriori estimate of the state x at sample k+1, using the measurement z, as given in equation 3-21. The filter gain G is the factor used to minimize the a posteriori error covariance P. Finally, an a posteriori estimate of the state estimate error covariance P is calculated as given in equation 3-22. These new estimates for x and P are used to predict the state at sample k+2 and so on. Further details of the Kalman filter are given in Zarchan (2005).
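Equations 3-17 through 3-22 map almost one-to-one onto a few lines of linear algebra. The sketch below is a generic single predict-correct step written in NumPy, using the standard form in which the predicted covariance appears throughout the gain computation; it is only a transcription of the equations with the 4-state, 5-measurement shapes used here, not the project's implementation.

    # Sketch: one Kalman filter time update followed by one measurement update.
    import numpy as np

    def kalman_step(x, P, z, A, H, Q, R):
        # Time update (prediction), Eq. 3-17 and 3-18
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # Measurement update (correction), Eq. 3-20 to 3-22
        G = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # filter gain
        x_new = x_pred + G @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - G @ H) @ P_pred
        return x_new, P_new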


As the first step in using the filter, a model of the real world must be created using a set of equations in state space form. For this, let us first define some variables for this research.

Figure 3-56. Vehicle in the path with the state vector variables

As mentioned before, the aim is to filter the data from both vision and ladar to remove noise and to use the complementary information obtained from them for better navigation. Therefore, the error of the vehicle from the path center is the primary variable being considered for fusion and filtering. In Figure 3-56, the vehicle is shown in the path at an arbitrary position to the left of the path center at time step t. From concepts in physics, it is known that, given the position of a vehicle, predicting its position after some time requires knowing the velocity of the vehicle and the direction of travel. In this project, the velocity of the vehicle can be obtained using an ultrasonic speed sensor on the tractor and the direction of the vehicle can be obtained using the IMU mounted in the tractor.

Time update model. From Figure 3-56, let the path error be represented by d ∈ R, the forward velocity of the vehicle by v ∈ R and the change in heading by Δθ ∈ R. The path error at the next time step t+1 can be written as

d(t+1) = d(t) + v sin(Δθ) Δt (Eq. 3-23)


Depending on the direction of the vehicle, the path error could either increase or decrease. The operating speed of the vision algorithm is about 15 Hz and that of the ladar algorithm is about 30 Hz. The sensor fusion based guidance update would be limited by the operating speed of the vision algorithm, which is 15 Hz. The common operating speed of a tractor in the grove is about 3.1 m/s. Comparing these values, for one sample update the vehicle would travel a distance of 3.1 m / 15 = 0.2 m or 20 cm. For a forward travel of 20 cm, the heading of the vehicle is not expected to change significantly. The required heading of the vehicle, θR ∈ R, is also not expected to change significantly. The tractor does not possess speed control, and so the tractor was expected to be operated at a constant speed during the experiments. Therefore, the heading (θ) and the speed (v) cannot be expected to change significantly between successive time steps. Therefore it would be reasonable to state that

θ(k+1) = θ(k) (Eq. 3-24)

v(k+1) = v(k) (Eq. 3-25)

θR(k+1) = θR(k) (Eq. 3-26)

Since the heading and the velocity directly affect the path error d, it is reasonable to filter out noise from these measurements as well. The required heading of the vehicle can be used to limit the overall direction of travel and so it is an important variable. Therefore, all of these variables were included in the state vector x ∈ R4 of the Kalman filter:

x(k) = [d(k) θ(k) θR(k) v(k)]^T

Based on the above reasoning, the state transition matrix A(k) ∈ R4x4 is defined as

A = [ 1  0  0  sin(Δθ) Δt ]
    [ 0  1  0  0          ]
    [ 0  0  1  0          ]
    [ 0  0  0  1          ]

where Δθ is the change in heading measured by the IMU and Δt is the time between samples.


The change in heading is calculated using the measurements from the IMU at each time step. As observed, this is a measurement value obtained from a sensor. The use of a variable sensor input such as Δθ in the state transition matrix is consistent with the method described in Kobayashi et al. (1998). It is to be noted that only the change in heading is used for estimating the position error, which affects the overall guidance. The absolute heading is included in the filter to remove noise from the IMU measurements. The process noise w is assumed to be white noise with a mean value of zero; hence w = 0 is used in the prediction. The estimation process is difficult to observe directly, hence the process noise covariance is uncertain. Therefore the initial variance values, which are the diagonal elements in the matrix, were assumed as follows: variance for process position noise 2 cm, variance for process current heading noise 0.01 degree, variance for required heading noise 0.01 degree and variance for process velocity noise 10^-4 m/s. These values were intuitively assumed to be the amount of noise variance that the process could add. As discussed further in this dissertation, the process noise covariance matrix Q is updated at each step of the operation of the Kalman filter. Therefore, the initial assumptions of the matrix values do not significantly affect the overall guidance of the vehicle. The Kalman filter can work even when a precise model of the system is not available. The process noise terms were assumed to be independent of each other. This is done for simplicity, and it is later observed from simulations and experiments that the guidance of the vehicle, with this assumption, is accurate enough to keep the vehicle close to the alleyway center. Hence the covariance values, which are the off-diagonal elements, are zero:

Q = diag(qX, qθC, qθR, qV) = [ 2  0     0     0     ]
                             [ 0  0.01  0     0     ]
                             [ 0  0     0.01  0     ]
                             [ 0  0     0     10^-4 ]


Using this model of the process and the values for the parameters, the time update equations of the Kalman filter can be implemented. The Kalman filter can be considered a subset of the statistical methods because of the use of statistical models for noise.

Measurement update model. To implement the measurement update equations, the measurement model is required. The measurements obtained from the various sensors which affect the variables in the state x are the lateral position error of the vehicle in the path from the vision algorithm, denoted by xC(k) ∈ R; the lateral position error of the vehicle in the path from the ladar algorithm, denoted by xL(k) ∈ R; the required heading of the vehicle determined from the vision algorithm, denoted by θC(k) ∈ R; the current heading of the vehicle measured using the IMU, denoted by θIMU(k) ∈ R; and the vehicle linear velocity measured using the speed sensor, denoted by v(k) ∈ R. Therefore, the measurement vector z ∈ R5 can be given by

z(k) = [xC(k) xL(k) θIMU(k) θC(k) v(k)]^T

Comparing the variables in x and z, it is observed that the path error d is affected by the measurements xC and xL, the current heading of the vehicle is affected by the measurement θIMU, and the required heading θR is affected by the measurement θC. Therefore the observation matrix H can be defined as

H = [ 1  0  0  0 ]
    [ 1  0  0  0 ]
    [ 0  1  0  0 ]
    [ 0  0  1  0 ]
    [ 0  0  0  1 ]

The measurement noise, denoted by u(k), is assumed to be Gaussian with zero mean, with standard deviations estimated experimentally for the different sensors, i.e. u = 0. To determine the measurement noise covariance matrix R, the vehicle was set stationary in the middle of the path, each sensor (except the speed sensor) mounted on the vehicle was turned on independently


and the information from the sensors was collected for 30 seconds. For the camera, the vision algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error and the heading angle determined were recorded. For the ladar, the ladar path segmenting algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error measurements were recorded. For the IMU, the sensor was run for 30 seconds and the heading angle measured by the sensor was recorded. For the speed sensor, the vehicle was run at 3 different constant speeds and the speed sensor measurement was collected for 30 seconds. The measurement noise covariance matrix, denoted by R, was determined from these recorded measurements as the diagonal matrix

R = diag(σXc, σXl, σθIMU, σθC, σV)

where σXc, the variance of the error in the lateral position of the vehicle, was determined from the vision algorithm measurements; σXl, the variance of the error in the lateral position of the vehicle, was determined from the ladar algorithm measurements; σθC, the variance of the error in heading, was determined from the vision algorithm; σθIMU, the variance of the error in heading, was determined from the IMU measurements; and σV, the variance of the error in velocity, was determined from the speed sensor measurements. xC and θC are measurements from the same camera; however, the vision algorithm uses different regions of the image to estimate them. xC, xL, θIMU and v are measurements from different sensors operating independently of each other. Therefore, the different sensor measurements were assumed to be statistically independent of each other. Hence the covariance values, the off-diagonal elements, are zero. The


variance of the velocity measurements was zero. This can be attributed to the low resolution (1 mph) of the speed sensor. Using the above parameters, it is possible to implement the measurement update equations.

Reliability factor of primary guidance sensors in the Kalman filter and fuzzy logic sensor supervisor

Reliability of the sensors. A problem in using a simple Kalman filter in our research is that the reliability of the information from the sensors depends on the type of path in the grove. As discussed previously, ladar is accurate at short range for the given mounting arrangement and machine vision is good at providing the overall heading of the vehicle. From experimentation in a citrus grove, the following observations were made: tree foliage is highly variable, trees can be absent on either or both sides of the path, and some trees can be small enough for the ladar not to recognize them as trees. Therefore, if a white noise model is assumed for the measurements, the reliability factor of a sensor in the Kalman filter has to be constantly updated. In such variable path conditions, it is not feasible to have constant noise values for the ladar and vision, i.e. a constant measurement noise covariance matrix R. The covariance matrix R tells the Kalman filter how believable or true the sensor measurements are. If the covariance value of a measurement is low, the filter believes the measurements to be accurate or reliable; if high, the filter believes the measurements to be erroneous. This type of update can be used to inform the filter of the nature of the sensor at different locations in the grove. For example, when the ladar does not recognize trees on either side, guidance is more reliable if the measurement noise for the ladar is increased to a high value so that vision takes over as the reliable guidance sensor. Similarly, when the vehicle gets to the end of a tree row, vision is not useful anymore, as the citrus trees are absent from the field of view of the camera, and so ladar can be made the sole guidance sensor to cross the last tree by increasing the measurement noise


corresponding to the vision measurements. A method was required to perform this update to R based on the presence of trees in the field of view of the sensor. A fuzzy logic system was used to decide the reliability of the sensors. Fuzzy logic was chosen as the method because it is intuitive to implement and the knowledge to implement fuzzy logic was available from previous research. The reliability is changed in the Kalman filter by changing the measurement noise covariance matrix R. This is discussed in the following section.

Fuzzy logic inputs. A fuzzy logic based supervisor was used to decide which sensor is more reliable at different locations in the grove alleyway. The input to the fuzzy logic supervisor is the horizontal distance of the vehicle center line from the trees on either side of the vehicle. This distance is in a direction perpendicular to the direction of motion of the vehicle. Both the vision and ladar algorithms input their corresponding distance values to the fuzzy logic supervisor. Altogether, there are four input values: vision left tree distance, vision right tree distance, ladar left tree distance and ladar right tree distance. These inputs are sent to the fuzzifier. The structure of the overall fuzzy logic supervisor is shown in Figure 3-57.

Figure 3-57. Fuzzy logic implementation (inputs: vision left, vision right, ladar left, ladar right; fuzzifier, rule base, inference engine, defuzzifier; outputs: vision, ladar or both)

Fuzzy quantification of knowledge. These inputs are then divided into three linguistic variables: reasonable, unreasonable and zero. The input membership functions are shown below. The horizontal axis is the distance value and the vertical axis is the corresponding normalized value on a scale of 0 to 1. The linguistic variables were chosen based on the input distance values.


Figure 3-58. Membership functions for the linguistic variables of x (zero, reasonable and unreasonable over the range 0 to 3 m)

The input membership function relates the input x, which is the distance value, to the linguistic variables reasonable, unreasonable and zero. The meaning of the linguistic variables is literally whether the distance from the tree row is reasonable or not. A zero membership for the ladar indicates the presence of an obstacle in the path and, for vision, the end of the row. x is classified as zero from 0 to 1 m, as reasonable from 1 m to 3 m, and as unreasonable for any value greater than 2 m. As mentioned earlier, the vision and ladar sensors are mounted along the center line of the vehicle. If the vehicle were properly navigating the alleyway, the distance of the row trees from the sensors should be a minimum of 1 m, to account for the vehicle's width, and the trees should be at a distance of at most 3 m to be present in the row in which the vehicle is navigating. Therefore, the reasonable range for a row tree would be 1 m to 3 m.

1 m < x < 3 m : Reasonable distance of tree (Eq. 3-27)

If a tree is detected beyond a distance of 2 m, the tree is unlikely to correspond to a row tree in the current row and is possibly a tree in an adjacent row. Therefore, this is an unreasonable distance for a row tree.

x > 2 m : Unreasonable distance to tree (Eq. 3-28)

If, in the vision algorithm, no tree is detected in the field of view or a tree is detected in the middle of the path, then this is likely to be a case of the vehicle coming to the end of the row or near the headland. If a tree is detected in the middle of the path, it is likely to be a headland


tree not part of the row. Detecting a tree in the middle corresponds to a horizontal distance of less than 1 m from the vehicle center line. This corresponds to the linguistic variable zero. Not detecting any tree is also placed in the linguistic variable zero for convenience. If, in the ladar algorithm, a tree is detected at a distance of less than 1 m, it is likely to be due to the presence of an obstacle in the path. This is also assigned the linguistic variable zero.

x < 1 m : Zero (Eq. 3-29)

Suppose, for example, that the values of x at an instant are

x(left vision) = 1.5 m and x(right vision) = 2.5 m
x(left ladar) = 1.5 m and x(right ladar) = 1.5 m

Normalization of the linguistic variables was performed on a scale from 0 to 1. Therefore, the normalized values are

x(left vision, normal) = [1, reasonable]
x(right vision, normal) = [0.5, reasonable] and [0.5, unreasonable]
x(left ladar, normal) = [1, reasonable]
x(right ladar, normal) = [1, reasonable]

This gives the normalized possible values for each variable after fuzzification. Here we have 5 possible values. In general, there could be a minimum of 4 possible values or a maximum of 8 possible values. It is complex to specify rules for all these possibilities, and it is also not necessary to analyze them all separately. If a sensor's performance is degraded, then it should degrade both the left and right side measurements. In the above example, x(right vision, normal) has a 50% chance of being reasonable or unreasonable and x(left vision, normal) is 100% reasonable. In such a case, we make a premise to group the possibilities for both the reasonable and unreasonable cases and


reduce the linguistic variables for each sensor as a whole. That is, combining the left and right variables for vision, we have two possibilities for reasonable (1 and 0.5) and one possibility for unreasonable (0.5). If we combine the possibilities for reasonable and choose the minimum value for reasonable, we get

x(vision, normal) = [0.5, unreasonable] and [0.5, reasonable]

If two unreasonable values were present, the higher of the two would be chosen, as it is better to consider the system to be erroneous rather than accurate. Similarly, for ladar we have

x(ladar, normal) = [1, reasonable]

We are now left with 3 possibilities. By performing this operation of grouping the possibilities for each sensor, we are always guaranteed to obtain at most 4 linguistic variables for the four distances. These values are used with the rule set.

Table 3-4. Fuzzy logic rule set for sensor supervisor
Vision Left/Right          Ladar Left/Right           Decision
Reasonable                 Reasonable                 Both
Reasonable                 Reasonable/Zero            Ladar higher
Reasonable                 Zero                       Stop
Reasonable                 Unreasonable               Vision
Reasonable                 Unreasonable/Zero          Ladar
Reasonable                 Reasonable/Unreasonable    Vision higher
Unreasonable               Reasonable                 Ladar
Reasonable/Zero            Reasonable                 Ladar higher
Zero                       Reasonable                 Ladar
Unreasonable/Zero          Reasonable                 Ladar
Reasonable/Unreasonable    Reasonable                 Both
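Treating Table 3-4, as reconstructed above, as a lookup keyed on the combined left/right linguistic value of each sensor gives a sketch like the one below. The default returned for unlisted combinations is an assumption, and turning a decision into an actual covariance value is left to the defuzzification step described next.

    # Sketch: supervisor rule lookup over Table 3-4 (combined left/right values per sensor).
    RULES = {
        ("reasonable", "reasonable"): "both",
        ("reasonable", "reasonable/zero"): "ladar higher",
        ("reasonable", "zero"): "stop",
        ("reasonable", "unreasonable"): "vision",
        ("reasonable", "unreasonable/zero"): "ladar",
        ("reasonable", "reasonable/unreasonable"): "vision higher",
        ("unreasonable", "reasonable"): "ladar",
        ("reasonable/zero", "reasonable"): "ladar higher",
        ("zero", "reasonable"): "ladar",
        ("unreasonable/zero", "reasonable"): "ladar",
        ("reasonable/unreasonable", "reasonable"): "both",
    }

    def supervisor_decision(vision_lr, ladar_lr):
        return RULES.get((vision_lr, ladar_lr), "both")     # assumed default for unlisted cases

    print(supervisor_decision("unreasonable", "reasonable"))   # -> 'ladar', as in the example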


Rule base. The rule set for the supervisor is shown in Table 3-4. Based on the linguistic variables chosen from vision and ladar, the corresponding rules are fired. For our example, the rule corresponding to vision-unreasonable and ladar-reasonable is first fired and the decision is to use ladar. The second rule, of vision-reasonable and ladar-reasonable, is also fired and the decision is to use both. From the rules fired, the output membership function is chosen. The membership value is used to find the crisp value for the output. The output membership function, decision, relates the rules fired to the linguistic variables vision, ladar and both. The output membership function is shown in Figure 3-59. These variables represent the decision made as to whether vision or ladar is given higher priority or whether both should be given equal priority. As an exception, during the presence of an obstacle, the ladar takes precedence until the obstacle is avoided. The linguistic answers to the problem are not useful to implement on a computer, so a numerical answer is required. For that, the crisp value of the decision is taken as the measurement noise covariance value for a sensor.

Figure 3-59. Crisp output membership function (linguistic variables vision, ladar and both)

Premise quantification. For our example, for one rule we obtained a decision of ladar with the values 0.5 and 1, and for another rule a decision of both sensors with values 0.5 and 1. That is, we have two possibilities for each decision. We now have to determine which conclusion should be reached when the rules that are on are applied to the final decision making. To reduce the possibilities, we use a premise of minimum to infer the decision. That is,


Min(0.5, 1) of ladar = 0.5, for using ladar
Min(0.5, 1) of both = 0.5, for using both

This leaves just one possibility for each decision. The next step is to obtain the numerical value.

Inference. Defuzzification was done using the center of gravity method. The center of gravity method is used to obtain a crisp value of the output using the output membership function. The center of gravity method is given by

Ucrisp = (Σi bi mi) / (Σi mi) (Eq. 3-30)

where bi is the center of the output membership function where the decision reaches its peak value and mi is the area spanned in the decision function. For our example, one decision of using ladar has been obtained. Therefore, we have to raise the measurement noise for vision and so we use the output membership function corresponding to vision. For the decision of using both, we use the output membership function corresponding to both.

Figure 3-60. Illustration of inference for the example

Figure 3-60 shows how the crisp value is obtained. For the first decision, of using ladar with the value of 0.5, the noise covariance value of vision in R has to be increased. The dashed line illustrates the value of 0.5 in the vision membership. Similarly, the dotted line indicates the value of 0.5 in the both membership. bi for vision is 1 (or -1) and for both is 0. The positive


or negative values only indicate whether the decision is vision or ladar; only the absolute values are used to update R. mi is the area under the dashed line and the dotted line for each. Therefore,

Ucrisp = ((1)(1.375) + (0)(0.45)) / (1.375 + 0.45) = 0.75

This crisp value is the new value of the covariance for vision in determining the path error. Although the crisp value is positive, since the decision was between vision and both, the crisp value is used to update vision only. The range of crisp values for the decision was chosen by performing simulations and determining what values provide good results. Using the fuzzy logic supervisor described above, the suitable sensor at each location is chosen and the updates are made in R.

Divergence detection and fuzzy logic correction

The use of a Kalman filter with fixed parameters has a drawback. Divergence of the estimates is a problem sometimes encountered with a Kalman filter, wherein the filter continually tries to fit a wrong process. Divergence occurs due to inaccuracies in the covariance values specified in the model. Divergence may be attributed to system modeling errors, noise variances, ignored bias and computational round off errors (Fitzgerald, 1971). Often, a white noise model is assumed for the process while implementing the Kalman filter. White noise models are not true models of the process for autonomous guidance applications; they are often used to simplify the implementation. If a white noise model is assumed for the process, divergence, if encountered, is difficult to correct. Mathematical methods such as altering the rank of the covariance matrix can be used to reduce divergence. Abdelnour et al. (1993) used fuzzy logic in detecting and correcting the divergence. Sasiadek and Wang (1999) used fuzzy logic with an extended Kalman filter to tackle the problem of divergence for an autonomous ground vehicle. They presented simulation results of their approach. The extended Kalman filter


reduced the position and velocity error when the filter diverged. The use of fuzzy logic also allowed a lower order state model to be used. For this research, a fuzzy logic based method is used with Kalman filtering to overcome divergence, as it is a proven method for divergence correction in the previous research mentioned above. It was chosen over mathematical methods of correction, as it was felt that rigorous mathematical analysis of the filter and model was beyond the scope of this dissertation. A common method to detect divergence uses the innovation vector. The innovation vector z̃(k) (Klein, 1999) is defined as the difference between the measurement vector and the estimate of the measurement vector, as given by the equation

z̃(k) = z(k) - H x(k) (Eq. 3-31)

The innovation vector is also called the residual. For an optimum Kalman filter without divergence, the innovation vector is zero mean Gaussian white noise (Abdelnour et al., 1993). Therefore, the performance of the Kalman filter can be monitored using the value of the innovation vector. A sample mean of 10 successive values of z̃ was used for detecting divergence. This sample corresponds to a time interval of approximately 300 ms. At the vehicle operating speed of 3.1 m/s, this sample sequence was expected to be sufficient. However, initially, while starting the vehicle in autonomous mode, the required samples may not be available. Simulation results for the chosen sample size were found to be satisfactory. An analysis of the variation in performance with different sample sizes was not performed. Deviation of z̃ from zero by more than 5% was taken as the indication of a reduction in the performance of the filter leading towards divergence. Divergence can be corrected by updating the process noise covariance matrix Q based on the innovation vector. Fuzzy logic (Passino and Yurkovich, 1998; Subramanian et al., 2005) was


used to perform this update for two parameters of Q, namely qx and qθ. The fuzzy logic system is shown in the Figure 3-61.

Figure 3-61. Fuzzy logic implementation to correct divergence

The xc, xL and θc components of the innovation vector z̄ are the inputs to the fuzzy logic system. Among the various parameters used in the filter, these are the three parameters whose process noise is less precisely known, as they depend entirely on the algorithms developed. The other two parameters, the heading from the IMU and the velocity, are obtained directly from the sensors, and so their process noise is not expected to change significantly during operation. The outputs of the fuzzy logic system are the updates to Q, namely qx and qθ, which correspond to the path error and the required heading.

Figure 3-62. Input membership function

The Figure 3-62 shows the input membership function. The range of the inputs is from -15% to 15% of the change of the innovation vector values from zero. These maximum and minimum values were chosen by trial and error by performing simulations. The inputs are converted to the linguistic variables negative and positive as shown. These linguistic variables indicate the nature of the input, which is either negative or positive. Normalization of the


linguistic variables was performed on a scale from 0 to 1. A set of if-then rules was developed by experimentation. Each rule received a corresponding normalized membership value from the input membership function and thereby the linguistic variable. Based on the linguistic variables chosen from each z̄, the corresponding rules are fired. The rule set is shown in the Table 3-5.

Table 3-5. Fuzzy logic rule set for divergence correction

  z̄x         z̄θ         qx         qθ
  Negative    Negative    Zero       Zero
  Negative    Positive    Zero       Positive
  Positive    Negative    Positive   Zero
  Positive    Positive    Positive   Positive

For example, consider the second row in the Table 3-5, where z̄x is negative and z̄θ is positive. This indicates that the error in position has reduced compared to the previous time step but the vehicle is heading in the wrong direction; therefore, the process noise for position, qx, is zero and the process noise for heading, qθ, is positive. In the rule set, a single measurement value for position, z̄x, is given instead of the two different values z̄xc and z̄xL. This is because the sensor supervisor described earlier picks one of the sensors for use in the filter when the other is unreasonable. In the cases where one sensor is given higher priority, the position measurement of that sensor alone is used in the rule set, and in the cases where both sensors are selected by the supervisor, the position measurement of the ladar alone is used, as it has lower measurement noise. This reduction was performed to reduce the complexity of the rule set.
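To make the divergence handling concrete, the following is a minimal C++ sketch, not the dissertation's code, of the two pieces described here: a sample-mean check on the innovation (10 samples, 5% deviation) and the Table 3-5 rule set deciding which process noise terms to raise. The 5% tolerance is applied to an assumed measurement range, and the crisp output magnitude comes from the defuzzification step described next; all names are illustrative.

#include <cmath>
#include <cstddef>
#include <deque>
#include <numeric>

struct QUpdate { double qX; double qTheta; };   // additions to the process noise for position and heading

// Mean of the last 'window' innovation samples; divergence is suspected when the mean
// deviates from zero by more than a tolerance (here 5% of an assumed measurement range).
bool divergenceSuspected(std::deque<double>& history, double newInnovation,
                         double measurementRange, std::size_t window = 10)
{
    history.push_back(newInnovation);
    if (history.size() > window) history.pop_front();
    double mean = std::accumulate(history.begin(), history.end(), 0.0)
                  / static_cast<double>(history.size());
    return std::fabs(mean) > 0.05 * measurementRange;
}

// Rule set of Table 3-5: the signs of the position and heading innovations decide which
// process noise terms are increased; 'positiveCrisp' is the defuzzified Positive output.
QUpdate fuzzyQUpdate(double zPosition, double zHeading, double positiveCrisp)
{
    QUpdate u{0.0, 0.0};
    if (zPosition >= 0.0) u.qX = positiveCrisp;       // position innovation growing
    if (zHeading  >= 0.0) u.qTheta = positiveCrisp;   // heading innovation growing
    return u;                                         // both negative -> both updates stay zero
}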


Figure 3-63. Output membership function

From the rules fired for a given set of data, the output membership function is chosen. The membership function is shown in the Figure 3-63. The membership value was used to find the crisp value for the output. Defuzzification was performed using the center of gravity method (Passino and Yurkovich, 1998). The crisp values of the process noise covariance are then updated in the process noise covariance matrix Q. The range of crisp values was chosen by trial and error by performing simulations and determining what values provide good results. For the remaining two parameters of Q, corresponding to the IMU heading and the velocity, a linear relationship was followed such that these covariances were increased proportional to z̄z̄' of the heading and velocity innovations respectively. This was performed independent of the fuzzy logic correction.

Sample simulation result of the sensor fusion method

The developed fuzzy logic enhanced Kalman filter algorithm was simulated in MATLAB to check its functionality and accuracy. The simulations also aided in tuning the parameters of the filter to improve the performance. The data required for simulation was obtained by manually driving the vehicle through a citrus grove alleyway and collecting all the sensor information in a text file. The data was then imported into MATLAB to run the simulation, and a program was written to simulate the filter. For the simulation, the data from the sensors collected at each instant was input to the program one set at a time to mimic the time-varying nature of the data. The output of the simulation was the path error of the vehicle obtained from the fusion method by simulation.


This path error was compared with the path error calculated by the vision and ladar algorithms, and all three data sets were plotted. The Figure 3-64 shows one case of the simulated result of fusing the error obtained from the vision and ladar algorithms. The error is the distance between the center of the vehicle in the path and the path center.

Figure 3-64. Simulation result of fusion of error obtained from vision and ladar algorithms

The Figure 3-64 consists of three different plots superimposed for comparison. The plots are of the vision based path error, the ladar based path error and the simulated fusion based path error. For clarity in comparing the plots, only a section of the path from a travel distance of 16.2m to 16.8m is shown. It is observed that from 16.2m to 16.35m, only vision provides a reasonable error and ladar does not. This may be attributed to the fact that the ladar sweep at that location might have been in a gap between two consecutive trees. Hence, for the first 15cm of this section, the fused data seems to have relied mainly on vision. For the 16.35m to 16.55m section, the errors from both the vision and ladar algorithms seem reasonable; hence, fusion takes both data into account. However, the fused data relies more on the ladar information, as the process noise is less for ladar. For the last 25cm of the section, from 16.55m to 16.8m, both vision and ladar determine the same amount of error. Therefore, the simulation result presented here shows the improvement in path tracking using fusion based guidance as compared with vision or ladar based guidance.


Headland Turning and Navigation The development of autonomous vehicle guidance systems using machine vision and laser radar for navigating citrus grove alleyways using independent sensors and using fusion of the sensors was discussed in previous sections. This section discusses the methods developed for navigating the headlands of citrus groves. Headland in citrus groves is the space available at the end of each tree row. This space is generally used for turning the vehicles after navigating the rows or alleyways. Headlands are characterized by a ground space of at least 4.25m after the last tree in each row. The ground is usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of the grove or there could be more open ground or some combination of both. Complete navigation of a citrus grove requires the vehicle to turn at headlands in addition to navigating the alleyways. The vehicle may not perform any task on the citrus trees but the time spent in navigating the headland is important for the overall operation. Miller et al. (2004) have reported that the space required to turn at the headland and the time spent in turning, significantly affect the field efficiency. If a lot of space is used by the vehicle at the headland for turning, the useful tree space is reduced. This reduces the overall yield of the grove. If a lot of time is spent on turning the headland, the efficiency of operation of the autonomous vehicle is reduced. Therefore time and space are important factors for executing headland turning maneuvers. The vehicle used for developing the headland turning algorithms was the John Deere eGator. The sensors used for developing the headland navigation algorithms are the camera, the ladar, the IMU and the wheel encoder. The details of these sensors and their mounting location were described earlier. The camera is mounted looking down at an angle of 10 degrees with the horizontal and the ladar is mounted looking down at an angle of 15 degrees with the horizontal. The camera, ladar and the IMU are interfaced with the 2.4GHz CPU running Windows operating 128


system. This is the same computer that was used on the tractor. All high level algorithms for processing guidance sensor information and guidance algorithms are implemented in this shoebox computer. The speed control and the steering control are implemented in the PC104 computer described earlier. The steering control system used in the autonomous steering of the tractor was the PID control, in which lateral position error of the vehicle was given as input to the controller and steering was suitably controlled to minimize the error. In alleyway navigation, the specification of the lateral position of the vehicle in the path can be described as error of the vehicle from the path center. For headland navigation, there is no specified path that the vehicle must take. Therefore it is not practical to specify a path error in all conditions. Also, the development of classical controllers such as PID, a mathematical model of the system is useful for tuning the controller parameters. For the tractor, this model was experimentally determined. Development of a similar model for the eGator is time consuming. Therefore, a new type of steering controller was explored which does not require error specification or mathematical model, which was the Pure Pursuit steering. Pure pursuit steering Pure Pursuit steering mechanism has been used for many years in the Carnegie Mellon NavLab autonomous vehicle projects and has been reported to be robust. The method is simple, intuitive, easy to implement and robust. It is analogous to human driving in that humans look a certain distance ahead of the vehicle and steer such that the vehicle would reach a point at the look-ahead distance. The Pure Pursuit method is similar to the human way of steering. Pure Pursuit is a method for geometrically calculating the arc necessary for getting a vehicle onto a path (Coulter, 1992). Consider the illustration of the vehicle with the X-Y coordinate system shown in the Figure 3-65. 129


Figure 3-65. Pure Pursuit steering method

Let the origin of the coordinate system be the center of the vehicle. Let (p, q) be a point some distance ahead of the vehicle. Let an arc join the origin and the point (p, q). Let L be the length of the chord of the arc connecting the origin to the point (p, q) and let R be the radius of curvature of the arc. Let R be expressed as the sum of the distance p and some distance D:

R = D + p  =>  D = R − p (Eq. 3-32)

L, p and q form a right triangle. Using the Pythagorean theorem for a right triangle,

p² + q² = L² (Eq. 3-33)

q, R and D also form a right triangle. Using the Pythagorean theorem,

q² + D² = R² (Eq. 3-34)

=> q² + (R − p)² = R²

Simplifying this equation,

R = L²/(2p) = p/2 + q²/(2p) (Eq. 3-35)

As this method is analogous to human driving, we can choose a predetermined look-ahead distance p. For this project, the look-ahead distance was chosen to be 2.89m from the front of


the vehicle. For alleyway navigation, the error of the vehicle from the path center is q at the look-ahead distance p For non-alleyway navigation, q can be the required lateral movement of the vehicle at the look-ahead distance. This value of q is obtained from the machine vision and ladar algorithms for path segmentation. Using these values of p and q, R can be calculated from the above equation. R is the required radius of curvature of the vehicle to get the vehicle to the required point. In the CAN based steering system of the e-Gator, steering has to be specified as steering angles and not as radius. Therefore, R has to be converted to steering angle. A calibration experiment was conducted to find a relationship between the radius of curvature and the steering angle. In the experiment, the eGator was taken to an open area. Steering angle was kept constant by sending CAN commands to the steering controller. The vehicle was moved forward by manually stepping on the accelerator. For a constant steering angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner wheel radius at different locations. The diameter of the circle was manually measured using a tape measure. The steering angle was varied to obtain different radius of curvature. The illustration of the experiment is shown in the Figure 3-66 and the experimental results are shown in the Table 3-6. Vehicle R Figure 3-66. Illustration of experiment for steering angle calibration with radius of curvature 131


Table 3-6. Calibration of radius of curvature of vehicle with steering angle

  Steering Angle (Degrees)    10      15      20      25      30
  Radius (m), Left            10.6    9.9     7.46    5.8     4.4
  Radius (m), Right           7.9     6.4     5.33    4.6     3.8

Experiments were conducted for steering angles from 10 degrees to 30 degrees at 5 degree intervals. The lowest angle chosen was 10 degrees because smaller angles had a much larger radius that caused the vehicle to travel out of the experimental area. The maximum angle of 30 degrees is the physical limit of the vehicle's steering system. Experiments were conducted for both left and right turns. The experimental values were plotted in the chart shown below, with fitted lines y = -0.33x + 14.232 (left) and y = -0.2x + 9.606 (right).

Figure 3-67. Line fitting for calibrating radius of curvature of vehicle with steering angle
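The dissertation does not show how the fitted lines of Figure 3-67 were computed; the short, hypothetical C++ sketch below fits the left-turn row of Table 3-6 by ordinary least squares and reproduces the slope and intercept of Eq. 3-36 (about -0.33 and 14.23). Function and variable names are illustrative.

#include <cstddef>
#include <cstdio>
#include <vector>

struct Line { double slope; double intercept; };

// Ordinary least-squares fit of y against x.
Line fitLine(const std::vector<double>& x, const std::vector<double>& y)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    const double n = static_cast<double>(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        sx += x[i]; sy += y[i]; sxx += x[i] * x[i]; sxy += x[i] * y[i];
    }
    double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    return { slope, (sy - slope * sx) / n };
}

int main()
{
    std::vector<double> angleDeg = {10, 15, 20, 25, 30};          // steering angle
    std::vector<double> radiusL  = {10.6, 9.9, 7.46, 5.8, 4.4};   // left-turn radius (m), Table 3-6
    Line left = fitLine(angleDeg, radiusL);
    std::printf("Radius = %.2f * angle + %.2f\n", left.slope, left.intercept);
    return 0;
}

Running the same fit on the right-turn row of Table 3-6 gives the coefficients of Eq. 3-37.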


From the chart, linear equations were fit to calibrate the steering angle with the radius of curvature. The equations are:

Left turn: Radius = -0.33 × Steering angle + 14.232 (Eq. 3-36)
Right turn: Radius = -0.2 × Steering angle + 9.606 (Eq. 3-37)

Rearranging the terms,

Left turn: Steering angle = -3.03 × Radius + 43.12 (Eq. 3-38)
Right turn: Steering angle = -5 × Radius + 48.03 (Eq. 3-39)

Using these calibration equations, the steering angle required for various radii of curvature was determined. The steering angles were sent to the steering controller. It is observed that the minimum radius of curvature of the vehicle was 4m. This minimum value limits the smallest turn that the vehicle can execute in the headlands. The flowchart of the Pure Pursuit steering algorithm is shown in the Figure 3-68; a sketch of these steps is given below.

Figure 3-68. Flowchart of Pure Pursuit steering algorithm: obtain the required vehicle offset from vision or ladar; use the look-ahead distance and offset to calculate the radius of curvature; use the calibration equations to determine the steering angle from the radius of curvature; send steering angle commands to the steering controller.
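The following C++ sketch strings together the steps of Figure 3-68, using the goal-point geometry of Eq. 3-35 and the calibration lines of Eqs. 3-38 and 3-39. The clamping to the 0 to 30 degree steering range is an assumption based on the physical limit reported above, and this is only an illustration, not the project's code.

#include <algorithm>

// Eq. 3-35: R = L^2/(2p) = p/2 + q^2/(2p) for a goal point (p, q), with L^2 = p^2 + q^2.
double radiusOfCurvature(double p, double q)
{
    return p / 2.0 + (q * q) / (2.0 * p);
}

// Eqs. 3-38 and 3-39, clamped to the vehicle's physical steering range (assumed 0..30 deg).
double steeringAngleDeg(double radiusM, bool leftTurn)
{
    double angle = leftTurn ? (-3.03 * radiusM + 43.12)
                            : (-5.0  * radiusM + 48.03);
    return std::min(30.0, std::max(0.0, angle));
}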


Headland turning maneuvers

As mentioned before, the amount of time spent and the area required by the vehicle to turn at the headlands are important. A headland turning maneuver can be performed as a series of the following steps: 1) determining the approach of the end of the current row while navigating the alleyway, 2) precisely establishing the end of the row, 3) turning into the headland, 4) turning into the next row, and 5) getting to a position to navigate the next row.

Determining the approach of the headland. The first step, determining the approach of the headland while navigating the alleyway, is performed using machine vision. Consider the Figure 3-69, which shows the image obtained from the camera when the vehicle was in the middle of the alleyway (A) and the images obtained from the camera when the vehicle was close to the end of the alleyway (B and C).

Figure 3-69. Image obtained from the camera when the vehicle is navigating the alleyway


The segmentation algorithm used originally for path determination is reused for headland navigation; it segments the entire image into trees and pathway. When the segmentation algorithm is run on the above images, the resulting images are as shown below.

Figure 3-70. Segmented images

It is observed from the segmented images that when the vehicle is close to the end of the path, the upper half of the segmented image is predominantly either white or black, depending on whether trees are absent or present in the headland. For the case when the vehicle is in the middle of the alleyway, the upper half of the image is both black and white without either color being predominant. This difference in the upper half of the image between close-to-headland cases and middle-of-alleyway cases is used in the vision algorithm. The headland determination vision algorithm uses two fixed boxes, large/green and small/red, as shown in the Figure 3-71.


A B C Figure 3-71. Vision algorithm example for headland detection The two angular lines starting from the lower corners of the image and extending to the corners of the lower box indicate the path boundaries as obtained from path segmentation algorithm. For the case in the middle of the alleyway, the smaller box falls on the pathway whereas the bigger box encompasses random objects including the headland (A of Figure 3-71). When the vehicle is close to the headland, the two boxes fall on headland objects (B and C of Figure 3-71). As observed in the images, the headland often consists of either trees or open space. The use of the boxes is shown in the segmented images below. A B 136


C Figure 3-72. Use of boxes in the segmented images Close to headland, the pixels in the two boxes are both classified as either trees or pathway whereas in the middle of the alleyway case, the two boxes are classified differently. Based on this difference in the boxes, the algorithm determines the approach of the headland. By trial and error, it was found that using this vision based method of detecting the approach to the headland, detection of the headland is always determined when the vehicle is at a distance of approximately 29m to 31m from the headland. As observed this distance is not accurate and the variation is from row to row. This is however a good estimate of the distance to the headland. Once the approach of the headland has been detected by the vision algorithm, this distance is used to dead reckon the vehicle through rest of the alleyway till the headland has been reached. A flag is sent to the PC104 computer indicating that the headland has been detected. In the PC104 computer, dead reckoning is performed using the information from the encoder. The equation for the dead reckoning process is given as Dc = Dc + de (Eq. 3-40) where Dc is the current distance and de is the change in the distance measured by the encoder. Also, when the headland is detected using the machine vision algorithm, the priority of ladar in the sensor fusion process is increased and the priority for vision is reduced to zero. This 137


is because, close to the headland, the image from the camera primarily consists of headland objects. This is also due to the long field of view of the camera. The vision algorithm development was based on segmenting trees and path. Close to headlands, this segmentation is not feasible as the alleyway trees are out of the cameras field of view. On the other hand, ladar with the short field of view is able to detect the last few trees in the alleyway and perform the navigation. Therefore, navigation is switched entirely to ladar. Precisely establishing the end of the row. As mentioned earlier, the vision based headland detection gives an estimate of where the headland is located and the dead reckoning allows the vehicle to get close to the headland. A midsize citrus tree with some leafy canopy has a width of a minimum of 2m. The estimated distance using vision followed by dead reckoning allows the vehicle to get to a position such that the last tree is to the side of the vehicle. Once the dead reckoning is completed and the vehicle has reached the estimated distance near the last tree, the vehicle is moved further forward while the ladar data is monitored. This progress is continued until the ladar algorithm does not detect any trees on either side. When this happens, the vehicle is assumed to have crossed the last tree. This ladar monitoring ensures that the vehicle crosses the last tree of the alleyway. Drawback of the vision based headland detection. In the vision based headland approach detection algorithm described earlier, the assumption was that the headland consists either of uniform headland trees or open space. The caveat with the vision based headland detection is that the headland is not always composed entirely of trees or open space. A sample image is shown in the Figure 3-73. 138


Figure 3-73. Image where the vision based headland detection can fail In the above image, a single tree is present in the headland along with an artificial object. In such a case, the vision based algorithm fails to precisely detect the headland. Presence of random objects makes the vision algorithm less accurate at determining the distance to the headland. In the previously described method, the vision based headland detection with dead reckoning helped the vehicle reach the last tree of the alleyway. With the failure modes of the vision based detection, the estimate of the headland may not be enough to reach the last tree or may take the vehicle beyond the last tree. To overcome this problem, a ladar based algorithm is described as follows. Precisely establishing end of the row using sweeping ladar. The currently used mount angle of the ladar is such that it is able to detect only the closest tree to the vehicle. It was proposed that if the ladar is rotated through a range of angles, many more trees could be detected in front of the vehicle. The precise distance measuring capability of the ladar would help in establishing the end of the row. In this method, a digital servo motor (HiTec HSR-5995TG) was custom attached with the ladar. A custom designed power supply regulator was built to power the motor. A PIC microcontroller based serial communication interface was also custom built to interface with the motor. The motor-ladar mount is shown in the Figure 3-74. 139


Figure 3-74. Ladar and motor assembly for sweeping. A) view 1. B) view 2.

The motor was attached to the ladar body such that when the motor shaft rotated, the ladar was oscillated up and down about the horizontal plane. The ladar was swept from an angle of 15 deg. above the horizontal to 30 deg. below the horizontal at a speed of approx. 15 deg/s (Figure 3-75). The range of angles was chosen such that the ladar sweep covered only the tree locations. Angles beyond 15 degrees above the horizontal resulted in the ladar sweeping over the trees, and angles beyond 30 degrees below the horizontal resulted in the ladar sweeping the ground. The angles were also limited by the range of the mechanism linking the motor to the ladar. The speed of rotation was a limit of the servo motor.

Figure 3-75. Illustration of the ladar sweep


As mentioned earlier, the ladar refresh rate is approximately 38Hz. From the motor sweep speed and the ladar refresh rate, the data available per degree of sweep is approximately

38/15 = 2.53 ≈ 2 scans per degree

Therefore, a ladar scan is obtained every half a degree at this sweep rate of the motor. In this sweeping ladar algorithm, the motor is allowed to execute one sweep of the ladar from up to down. The entire data obtained from the ladar during this period is stored in an array. Using the calculation mentioned above, the total number of scans of the ladar is then matched with 0.5 degree positions of the sweep angle. One sweep of the ladar by the motor at 15deg/s takes approximately 5s. Synchronization of the ladar and sweep angle consumes a longer time. That is, if the motor was stepped to move in 0.5 degree increments and then allowed to acquire a full ladar scan at each angle, the time consumed to execute one full sweep would be much higher. The overall allocation of scans to sweep angles eliminates the need to synchronize the sweep angle positions with the ladar scan and thereby saves time. Also, it was experimentally observed that the error introduced due to the 0.5 degree allocation of scans is not significant enough to warrant synchronization.

As mentioned earlier, the data obtained from the ladar is in polar coordinates with the ladar as the origin. For this algorithm, it is intuitive to use data in Cartesian coordinates. Consider the coordinate system shown in the Figure 3-76. Let P be a data point at a radius R from the origin, subtending angles m and n with the z and y axes respectively. To convert this to the Cartesian coordinate system, the following equations were used.

Px = -R Cos(n/2) Cos(m) (Eq. 3-41)


Py = R Sin(n/2) Cos(m) (Eq. 3-42)

Pz = R Sin(n/2) Sin(m) (Eq. 3-43)

Figure 3-76. Coordinate systems

(Px, Py, Pz) is the data point in the Cartesian coordinate system. The data obtained by the scan is then filtered to keep the relevant data. It is known that the lateral distance of trees in the row from the ladar can range from 0.5m to a maximum of 5m. The distance from the ladar to the vehicle's edge is 0.5m, so a tree limb is not expected to be closer than 0.5m during normal operation. Distances longer than 5m correspond to trees located in adjacent rows. Therefore, the filter was implemented such that any data beyond 5m and any data less than 0.5m were removed (a sketch of this conversion and filtering step is given below). After filtering, a sample plot of the data in the Cartesian coordinates is shown in the Figure 3-77.
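The following is a minimal C++ sketch of the conversion and filter, assuming the angle conventions of Figure 3-76 and assuming that the n/2 in Eqs. 3-41 to 3-43 reflects the ladar's half-degree scan steps, so the functions below take the scan and sweep angles directly in radians. The 0.5 m and 5 m limits are the values quoted in the text; the data layout and names are illustrative.

#include <cmath>
#include <vector>

struct Point3 { double x, y, z; };

// Convert one swept-ladar return (range, sweep tilt m, scan angle n) to Cartesian coordinates.
Point3 toCartesian(double rangeM, double sweepAngleRad, double scanAngleRad)
{
    Point3 p;
    p.x = -rangeM * std::cos(scanAngleRad) * std::cos(sweepAngleRad);
    p.y =  rangeM * std::sin(scanAngleRad) * std::cos(sweepAngleRad);
    p.z =  rangeM * std::sin(scanAngleRad) * std::sin(sweepAngleRad);
    return p;
}

// Keep only returns whose lateral distance lies between 0.5 m and 5 m.
std::vector<Point3> filterByLateralRange(const std::vector<Point3>& pts)
{
    std::vector<Point3> kept;
    for (const Point3& p : pts) {
        double lateral = std::fabs(p.x);
        if (lateral >= 0.5 && lateral <= 5.0) kept.push_back(p);
    }
    return kept;
}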


Figure 3-77. Swept ladar data in Cartesian coordinates It is observed from the figure that the data corresponding to different trees are clustered together. It is also observed that only four consecutive trees in front of the vehicle can be detected with confidence. Although the range of the ladar is 80m, the trees further away contribute very few data points to be clearly detected. An algorithm was developed to determine the tree locations from the data. The Figure 3-77 shows the data in 3-dimensional space. The search space for the algorithm, when analyzing data in 3-dimensions is large. This results in the consumption of a large amount of computer processing time. For real-time precise headland detection, this is significant. The aim is to locate the trees with respect to the vehicle in the x-y plane. The location of the data in the vertical plane is not important. Without loss of useful information, the 3-dimensional data is projected on to the horizontal plane. The projected data is shown in the Figure 3-78. 143


Figure 3-78. 3-dimensional ladar data projected on the horizontal plane

It should be noted that multiple points at a location with different z values are overlapped in the Figure 3-78. The data space consists of 1cm x 1cm bins. To locate the trees, a clustering algorithm is used. The clustering algorithm is as follows:

1. Scan the data space using a 50cm x 50cm window in 50cm increments.
2. If the window at a location contains more than 5 data points, note the location of the window. The location indicates the presence of a tree or a part of a tree.
3. If two consecutive tree-containing locations differ by more than 250cm, the difference is due to the presence of the headland. It is then determined that the headland starts just after the last tree location prior to this gap.

Based on the location of the trees in the alleyway, the exact location of the headland is determined. The algorithm and clustering ensure that the headland is found precisely and that the determination is accurate to less than 50cm beyond the last tree in the alleyway. A sketch of this clustering step is given below.
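This C++ sketch illustrates the windowed clustering just described, assuming the projected points are available in centimeters and that tree and headland decisions are made along the forward (y) direction. The 50 cm window, the 5-point threshold and the 250 cm gap are the values quoted in the text, while the data layout and function names are illustrative.

#include <algorithm>
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

struct Point2 { double x_cm, y_cm; };

// Returns the forward position (cm) just past the last tree before the first large gap,
// or a negative value when no headland gap is found in the data.
double findHeadlandY(const std::vector<Point2>& pts)
{
    const double WIN = 50.0, GAP = 250.0;
    std::map<std::pair<int, int>, int> counts;          // (x bin, y bin) -> point count
    for (const Point2& p : pts)
        ++counts[{static_cast<int>(p.x_cm / WIN), static_cast<int>(p.y_cm / WIN)}];

    std::vector<double> treeY;                          // forward positions of occupied windows
    for (const auto& kv : counts)
        if (kv.second > 5) treeY.push_back(kv.first.second * WIN);

    if (treeY.empty()) return -1.0;
    std::sort(treeY.begin(), treeY.end());
    for (std::size_t i = 1; i < treeY.size(); ++i)
        if (treeY[i] - treeY[i - 1] > GAP)
            return treeY[i - 1] + WIN;                  // headland starts just after the last tree
    return -1.0;
}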


The result of the clustering and headland determination is shown in the Figure 3-79.

Figure 3-79. Tree clustering and headland determination

Vision and ladar based headland determination. The vision based headland detection and the sweeping ladar based establishment of the headland were described above. As mentioned, the sweeping ladar process takes approximately 5s to execute a sweep. It is not practical to execute the sweep while navigating the entire row. Therefore, the sweep is started only when the vision based algorithm detects the headland. Then the sweeping ladar algorithm takes over and dead reckoning is no longer required. In further tests conducted in the grove using the above algorithms, it was also found that a stationary ladar mounted at an angle of 10 degrees to the horizontal is able to detect four trees, similar to the sweeping ladar. This mounting angle increases the operating range of the ladar. This angle, however, may not be suitable for close range obstacle detection. The stationary mount of the ladar also obtains less data compared to the sweeping ladar. This, however, does not prevent the precise determination of the headland, which is as accurate as with the sweeping ladar. The sweeping ladar is still a useful development for future operations in this project. Once the headland location is established, the vehicle speed is reduced to the headland navigation speed.
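The headland approach can be summarized as a small state machine; the C++ sketch below only paraphrases the sequence described above (vision estimate, dead reckoning, ladar confirmation, slow-down for the turn) and is an illustration of the control flow, not the implemented software. State names and the boolean inputs are illustrative.

enum class RowEndState { Alleyway, DeadReckonToRowEnd, ConfirmRowEndWithLadar, HeadlandTurn };

RowEndState nextState(RowEndState s, bool visionSeesHeadland,
                      bool deadReckonDistanceReached, bool ladarSeesNoTrees)
{
    switch (s) {
    case RowEndState::Alleyway:
        // The vision estimate of the headland also shifts fusion priority to the ladar.
        return visionSeesHeadland ? RowEndState::DeadReckonToRowEnd : s;
    case RowEndState::DeadReckonToRowEnd:
        return deadReckonDistanceReached ? RowEndState::ConfirmRowEndWithLadar : s;
    case RowEndState::ConfirmRowEndWithLadar:
        // Keep creeping forward until the ladar no longer reports trees on either side.
        return ladarSeesNoTrees ? RowEndState::HeadlandTurn : s;
    case RowEndState::HeadlandTurn:
        return s;   // turning maneuvers are described in the next subsection
    }
    return s;
}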


Navigating the headland and entering next row. From the location of the headland determined by the vision algorithm and the exact location determined by the sweeping ladar, the turning maneuvers are started when the last tree in the alleyway has been passed. Two turning maneuvers namely the u-turn and the switch back turn were developed. The determination of which maneuver is to be executed is made using the sweeping ladar algorithm. If an open space of more than 3m is available in the headland, the vehicle proceeds to do a u-turn and switch back turn otherwise. For the turning maneuvers, algorithms for steering similar to human operations were developed. For the development of the turning algorithms, the distance between the left and right tree stems in a row is assumed to be approximately 6m. The U-turn maneuver is implemented as follows: 1. When the vehicle has crossed the last tree, note the heading of the vehicle using the yaw angle from the IMU. 2. Turn the steering completely to one end such that the vehicle moves opposite to the direction of the required turn. This step is to make a wide u-turn. 3. Move forward for approximately 0.5m. 4. Turn the steering completely to the other end such that the vehicle turns in the required direction. Allow the vehicle to move forward. 5. Monitor the heading angle. Monitor for obstacles in front of the vehicle using the ladar. If an obstacle is detected, stop the vehicle. 6. When the vehicle has turned by more than 150 degrees, switch to alleyway navigation mode using vision and ladar. 7. The vehicle has now just entered the new row. Any error in entering the new row is corrected using the alleyway navigation algorithm. The switch-back turning maneuver is implemented as follows: 1. When the vehicle has crossed the last alleyway tree, note the heading of the vehicle using the yaw angle from the IMU. 2. Move forward a distance of 0.5m. This is done to avoid colliding with any long branches of the last alleyway tree. 146


3. Turn the steering wheel to one end such that vehicle turns in the required direction. 4. Monitor the vehicles heading and monitor for obstacles in front of the vehicle using the ladar. 5. When the vehicle has turned by an angle of 90 degrees with respect to the initial heading angle before turning, stop the forward motion of the vehicle. 6. Turn the steering to center position and reverse the vehicle a distance of 0.5m. 7. Turn the steering to the end such that the vehicle turns in the required direction. 8. When the vehicle has turned by more than 150 degrees with respect to the initial heading angle before turning, switch the algorithm to alleyway navigation mode using vision and ladar. 9. Any lateral error in entering the new row is corrected using the alleyway navigation algorithm. It is to be noted that the turning maneuver algorithms are tuned for a grove with certain dimensions of alleyway width. It is operable in groves with dimensions that are different by approximately 1m. But the algorithm is not suitable for groves that differ more. It is difficult to navigate headlands using vision and ladar as headlands can consist of random objects and the vision and ladar could not be modified for all possibilities. During tests in a grove in Immokalee, Florida, it was found that the grove includes a large drainage pit on the ground at the end of each row. In such a case, the turning maneuvers failed as the vehicle ends in the pit. Therefore, more complex ladar based algorithms may be necessary to adapt the vehicle to such possibilities. Open Field Navigation and Obstacle Detection In the previous sections, methods were described that allowed the autonomous vehicle to accurately navigate the alleyways of a citrus grove, detect headlands and turn at the headlands using u-turn and switch-back turn maneuvers and navigate subsequent alleyways. With these capabilities, the autonomous vehicle possesses the ability to navigate an entire grove autonomously. In large citrus producing farms, the groves can be arranged in the form of blocks separated by open fields. With the previously described capabilities, after the vehicle completes 147


navigating a grove block, the vehicle has to be manually driven to another grove block. The vehicle would then autonomously navigate the new grove block. If the vehicle could autonomously navigate the area between grove blocks, the entire farm could be covered without human intervention. When the vehicle navigates the citrus grove alleyways or open fields, obstacles such as animals, humans or broken branches can be expected. When obstacles are present, the vehicle must be capable of detecting them and stopping the vehicle. If possible, the vehicle should also navigate around the obstacle. Within the grove alleyway, there is not much space to navigate around obstacles, whereas it might be possible in the open fields. The development of obstacle avoidance capability is beyond the scope of this project. This chapter discusses the development of open field navigation capability to navigate the open field between grove blocks and the development of obstacle detection capability. Open field navigation Open field refers to the open area available between grove blocks. The open field is expected to contain uneven terrain, random placement of trees and fences, and presence of objects such as cows. The vision based navigation and ladar based navigation methods described earlier were developed such that they could segment trees from the ground and navigate between trees. In open fields, there is no fixed placement of objects. The random nature of objects in open fields or lack of objects makes it difficult to develop vision or ladar based algorithms for navigating open fields and reach a different grove block. A global solution is required to solve this problem. GPS is an excellent global method for navigating open fields. Therefore, GPS based navigation for traversing open fields is proposed. A differential GPS (DGPS) receiver was available in the laboratory from a previous research on determining the tractor dynamics and was described earlier. GPS based navigation 148


has been widely used by several researches for guiding autonomous vehicles through crop rows. The DGPS receiver available has a manufacturer specified accuracy of approximately 12inches. The receiver is also capable of operation at 5Hz with a DGPS fix i.e. enough satellite signals and the local station correction signal is available for high accuracy operation. For navigation based on GPS, the path that the vehicle must take should be specified as way points apriori. Using these points, the vehicle, with the aid of the DGPS receiver, can follow the points to reach the next grove block. The way points for this purpose would be the latitude and longitude coordinates of way points obtained using the DGPS receiver. When operating autonomously, the guidance system must be capable of using the latitude and longitude information to steer the vehicle to the required way point. The important component of the algorithm consists of determining the steering angle of the vehicle to get the vehicle from the current location to a required location. The capability to steer to consecutive points can be implemented in the software. Consider the vehicle located at a point A with latitude latA and longitude lonA. Similarly, consider a way point B with coordinates latB and lonB. Suppose the vehicle is required to travel to B from A, it is necessary to determine the distance between the locations and the direction of location B with respect to location A. The direction can be specified with the geographic north-south direction as reference. To determine these, a navigation formula called Haversine formula, was used. The Haversine formula is an equation to determine great-circle distance between two points on a sphere. It is an adaptation of the more general spherical trigonometric formulas. To use the Haversine formula, considering the earth as a sphere, assumptions are important to obtain results with good accuracy. For example, when the points are situated at a large distance compared to the radius of curvature of the earth, the formula considering the earth as a sphere gives good 149


accuracy; when the distance between the points is significantly small compared to the radius of the earth, the assumption of a flat earth can give accurate results and reduce computation; and when the points are located on either side of hills or mountains, modifications have to be made to obtain accurate results. For this project, it is assumed that the way points are located at a much smaller distance compared to the radius of the earth and that no significant elevation difference is present between the points. Therefore, the Haversine formula, modified for short distances, was used. The Figure 3-80 shows the vehicle at A and a way point B that the vehicle must reach.

Figure 3-80. Illustration of vehicle navigating way points

The vehicle is heading in a direction towards a random point C with a heading angle m with respect to the geographic north. The straight line distance required to reach B is d, and a turn through an angle theta + m is required to reach B. The Haversine formula gives d and theta as

d = R × p (Eq. 3-44)

where R = radius of curvature of the earth = 6378140m


p = 2 Tan⁻¹(√a / √(1 − a)) (Eq. 3-45)

a = Sin²(dlat/2) + Cos(latA) Cos(latB) Sin²(dlon/2) (Eq. 3-46)

dlat = latB − latA (Eq. 3-47)

dlon = lonB − lonA (Eq. 3-48)

theta = mod[atan2{Sin(dlon) Cos(latB), Cos(latA) Sin(latB) − Sin(latA) Cos(latB) Cos(dlon)}, 2π] (Eq. 3-49)

theta is positive for directions east of the geographic north and increases anti-clockwise from 0 to 2π in radians. The direction theta is the heading from location A to location B. To steer the vehicle, it is also necessary to determine the current heading of the vehicle with respect to the geographic north. This information was obtained from the IMU already mounted on the vehicle for the other algorithms. The IMU gives the heading of the vehicle with respect to the geographic north as m. The vehicle should be steered such that the angle m becomes the same as the angle theta. It is to be noted that the DGPS receiver operates at a rate of 5Hz and the IMU operates at a rate of approximately 40Hz. An assumption in developing the algorithm is that the vehicle is operating at a speed of less than 4m/s. This speed limitation is based on the 5Hz refresh rate of the DGPS receiver and was determined experimentally by trial and error. As the vehicle is steered in the proper direction, the distance between A and B reduces until the vehicle reaches the location B.
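The following C++ sketch implements Eqs. 3-44 to 3-49 for the distance and heading to a way point, assuming the latitudes and longitudes are already in radians. The direction convention of the returned bearing should be matched to the convention stated in the text; names are illustrative.

#include <cmath>

const double EARTH_RADIUS_M = 6378140.0;
const double TWO_PI = 2.0 * 3.14159265358979323846;

// Eqs. 3-44 to 3-48: great-circle distance from (latA, lonA) to (latB, lonB), radians in.
double haversineDistanceM(double latA, double lonA, double latB, double lonB)
{
    double dlat = latB - latA, dlon = lonB - lonA;
    double a = std::sin(dlat / 2) * std::sin(dlat / 2)
             + std::cos(latA) * std::cos(latB) * std::sin(dlon / 2) * std::sin(dlon / 2);
    double p = 2.0 * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));   // angular distance (Eq. 3-45)
    return EARTH_RADIUS_M * p;                                       // Eq. 3-44
}

// Eq. 3-49: heading from A to B, wrapped to the range 0 to 2*pi.
double bearingRad(double latA, double lonA, double latB, double lonB)
{
    double dlon = lonB - lonA;
    double y = std::sin(dlon) * std::cos(latB);
    double x = std::cos(latA) * std::sin(latB) - std::sin(latA) * std::cos(latB) * std::cos(dlon);
    return std::fmod(std::atan2(y, x) + TWO_PI, TWO_PI);
}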


The steering angle obtained by the above formulas is in radians. In the development of the pure pursuit steering described earlier, a protocol was used such that the PC sent the error of the vehicle in pixel coordinates to the PC104 computer through serial communication, and the error was converted to a steering angle by the pure pursuit algorithm in the PC104 computer. Therefore, to use the same protocol, the steering angle obtained above has to be converted to pixel coordinates before being sent to the Pure Pursuit steering controller. The physical range of the steering angles of the vehicle is approximately -30 degrees to +30 degrees from left to right, with 0 degrees as the center. From the above equations, the steering angle theta + m has a range of 0 to π. It should be noted that the mathematical calculations can provide a range of 0 to 2π. Considering that a turn of 2π is the same as 0 and, for example, a turn of 1.5π is the same as a turn of 0.5π in the opposite direction, the effective range can be reduced to 0 to π for steering. In degrees, this is the same as 0 to 180 degrees, which can also be expressed as -90 degrees to +90 degrees. Comparing the vehicle's physical steering angle range and the required steering angle range, the conversion is from 0 to 180 degrees with 90 as the center to 0 to 60 degrees with 30 as the center (Figure 3-81). A linear relationship is formed between the ranges such that -30 to +30 degrees of the required vehicle turn is mapped to -30 degrees to +30 degrees of the physical steering angle of the vehicle. If this one to one relationship is made, the range from -31 degrees to -90 degrees of the vehicle's turn and +31 degrees to +90 degrees of the vehicle's turn should also be mapped. The range of -90 degrees to -31 degrees of the vehicle turn is mapped to a single value of -30


degrees of steering angle, and the range of +31 degrees to +90 degrees of the vehicle turn is mapped to a single value of +30 degrees of steering angle (Figure 3-81).

Figure 3-81. Angle ranges of steering and vehicle turn

From the previous chapters, it is known that the image width used for vision is 400 pixels. To maintain the communication protocol with the PC104 computer algorithm, the range of -30 degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates:

Pixel value = 400 × (steering angle / 60) (Eq. 3-50)

This pixel value of the required steering is sent to the Pure Pursuit steering control; a sketch of the clamping and mapping is given below. As described in the algorithm above, when suitable steering is executed, the distance between the vehicle and the way point reduces and the vehicle reaches the way point. It is known that the accuracy of the DGPS receiver is 12 inches. During initial development of the algorithm, the vehicle was determined to have reached the way point when the distance between the vehicle and the way point was zero. It was observed that the uncertainties in the accuracy of the receiver, the terrain variation, and small inaccuracies in steering often result in the vehicle reaching a point very close to the way point, on the order of centimeters away from the way point.
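A minimal C++ sketch of the mapping described above: the required vehicle turn is clamped to -30 to +30 degrees (wider turns saturate) and then mapped linearly onto the 0 to 400 pixel scale. The explicit +30 offset is an assumption made so that a zero turn lands at the image center (pixel 200).

#include <algorithm>

int turnAngleToPixel(double requiredTurnDeg)
{
    double clamped = std::max(-30.0, std::min(30.0, requiredTurnDeg));   // saturate wide turns
    return static_cast<int>(400.0 * (clamped + 30.0) / 60.0);            // 0..400, center at 200
}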


In such cases, if the algorithm requires that the distance be zero, to translate the vehicle by a few centimeters to the way point, the vehicle has to execute circles or spirals to get to the way point (Figure 3-82). Path Vehicle Way Point Translate Figure 3-82. Path taken for translation of the vehicle to the way point An accuracy of exactly reaching the way point is not practical in the physical system. Therefore, by trial and error in tests, it was chosen that if the vehicle reaches a point within a distance of 50cm from the way point, the way point is determined to have been reached by the vehicle. Obstacle detection The obstacle detection capability should allow the vehicle to detect obstacles such as humans, animals and location variable objects such as vehicles, while navigating open fields. For the scope of this dissertation, a simple obstacle detection method using ladar was developed. It is assumed that the way points for GPS based navigation were chosen such that the vehicle would not be expected to get tree limbs or fixed objects such as thin poles in its path. For the development of the algorithm for detecting obstacles, the first step was to analyze the data from the ladar while navigating the open field. As mentioned earlier, the ladar scans a 154


plane from 0 degrees to 180 degrees and obtains the radial distance to objects in front of the ladar. The radial data is converted to Cartesian coordinates using the geometrical formulas

Frontal distance = (radial distance)(Sin(scan angle)) (Eq. 3-51)

Lateral distance = (radial distance)(Cos(scan angle)) (Eq. 3-52)

The same formulas were also used in the algorithm for ladar based grove alleyway navigation. The Figure 3-83 shows the plot of the frontal distance obtained from the ladar against the scan angle when there is no obstacle in front of the vehicle.

Figure 3-83. Ladar data when obstacle is not present in front of the vehicle

From the chart, it is seen that the closest distance in front of the vehicle, at the 90 degree angle, corresponds to the ground directly in front of the vehicle. The ground corresponds to a frontal distance of approximately 6m from the ladar. For scan angles away from the central angle of 90 degrees, the distance corresponds to ground locations away from the center of the vehicle. The Figure 3-84 shows the scan data from the ladar when an obstacle is present directly in front of the vehicle. From the data, it is observed that for most of the angles, the ladar data is identical and this corresponds to open field. However, at scan angles of 85 degrees to 95 degrees,


the frontal distance drops down to approximately 450cm from the ground data, which was approximately 6m in the previous scan shown above. This large change in the data corresponds to the presence of an obstacle in front of the vehicle. This can also be interpreted as the obstacle cutting the laser before it reaches the ground, thereby showing a smaller distance (Figure 3-85).

Figure 3-84. Ladar scan data with obstacle present in front of the vehicle

Figure 3-85. Illustration of the obstacle detection using ladar

The obstacle detection algorithm is based on the above analysis of the data. The algorithm is as follows:

1. Prior to operation in autonomous mode, when the vehicle is stationary, obtain the ladar scan radial distance corresponding to 90 degrees. Determine this distance as the ground distance from the ladar.


2. Operate the vehicle in autonomous open field navigation mode. Monitor the ladar data continuously and note sudden drops of more than 50cm from the ground distance.
3. If a drop is encountered at an angle, slide a data window of length 5 through the next 5 subsequent data points. If all of these data points are also shorter than the ground distance by more than 50cm, determine the presence of an obstacle.
4. If an obstacle is determined, stop the vehicle.
5. Continue monitoring the scan data for the presence of an obstacle. If no obstacle is found, continue navigating using GPS based navigation.

The stop signal is sent to the PC104 computer as a special stop command to reduce the speed to zero. No change is made to the steering control. The algorithm uses a data window of 5 data points to scan through the ladar data set to detect obstacles. This corresponds to an angle of 3 degrees. It was determined experimentally by trial and error that an angle of 3 degrees translates to sufficient width to determine the presence of an obstacle such as a human; wider obstacles such as cows are also expected to be detected. The threshold distance of 50cm was chosen so that small terrain elevation changes are not detected as obstacles. The angular width of 3 degrees may not be sufficient to detect objects such as poles; this, however, is useful in preventing the detection of tall grass as an obstacle. The flowchart of the algorithm is shown in the Figure 3-86.


Figure 3-86. Flowchart of the obstacle detection algorithm: start the vehicle in manual mode; obtain the data corresponding to 90 degrees as the ground distance; operate the vehicle in autonomous mode and monitor the ladar data; if any reading drops more than 50cm below the ground distance, check whether 5 consecutive readings do so; if they do, determine an obstacle and stop the vehicle.
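The following C++ sketch illustrates the check summarized in Figure 3-86, interpreting the condition as a reading at least 50 cm shorter than the stored ground distance for 5 consecutive scan angles. The thresholds come from the text; the data layout and names are assumptions.

#include <vector>

bool obstacleDetected(const std::vector<double>& frontalDistanceCm, double groundDistanceCm)
{
    const double DROP_CM = 50.0;   // minimum drop below the ground distance
    const int REQUIRED = 5;        // consecutive readings, about 3 degrees of scan
    int run = 0;
    for (double d : frontalDistanceCm) {
        if (groundDistanceCm - d > DROP_CM) {
            if (++run >= REQUIRED) return true;   // sustained drop -> obstacle, stop the vehicle
        } else {
            run = 0;                               // isolated drop, likely noise or tall grass
        }
    }
    return false;
}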


CHAPTER 4 SOFTWARE DEVELOPMENT Introduction This chapter discusses the software development part of the research. The architecture of the software and the interaction between the computers is discussed. The high level programs implementing the guidance algorithms were written in the Microsoft Visual Studio.Net 2003 environment using the Microsoft Foundation Class (MFC) for the Graphical User Interface (GUI). The environment of the low level software will be discussed in the respective sections. Sections of the code relevant to the discussion are presented in the appendices and are referenced in this chapter. Software Architecture for Tractor Guidance The hardware architecture and the implementation of the guidance system developed for the tractor was discussed in Chapter 3. The high level algorithms, primarily machine vision based path segmentation and ladar based path segmentation were implemented in C++ in the Windows based PC (shoebox). The low level control algorithm, that is, the PID based steering control was implemented in the microcontroller. The software architecture of the tractor based guidance system is shown in the Figure 4-1. In addition to the vision and ladar based algorithms in the shoebox computer, other software for serial communication, camera control capability and GUI were also implemented. The high level programs from the shoebox communicated with the low level control in the microcontroller using RS232 serial communication. The vehicles lateral error in the path was communicated to the low level controller for steering. The low level controller did not have an exclusive display monitor. Therefore the data was communicated to the shoebox using RS-232 for display in the monitor and for the operator to monitor the operation of the steering controller. 159


Figure 4-1. Software architecture of the tractor guidance system: vision or ladar based path segmentation, camera control and the display run on the shoebox computer, which exchanges the path error over RS-232 serial communication with the PID based steering control on the microcontroller.

The main application software was started up using dialog box based control in the shoebox computer. When the software is started, a dialog box window opens as shown in Figure 4-2.

Figure 4-2. Dialog box for high level controls

The dialog box consists of a large window for displaying the video seen using the camera. There is a menu bar on the top left of the window for accessing the various sensors. The menu for each sensor includes an additional level of menu for calling each function controlling the individual sensor. Using the dialog box, each required sensor is started up. The guidance


algorithm starts sending the path error information to the microcontroller as soon as either vision or ladar algorithm is started up in the dialog box menu. The low level control program in the microcontroller has to be started using the custom C++ compiler provided with the microcontroller. As soon as the low level controller is turned on, it starts to poll for error from the PC and send steer commands to the electro-hydraulic valve. The voltage line between the microcontroller and the electro-hydraulic valve also has a hardware switch. It should be noted that the tractor is switched to autonomous mode only when all the required algorithms have been started up. Stopping the programs is also performed using the dialog box. PID Based Steering Control The development of the PID based steering control system was discussed in Chapter 3. The steering control was implemented in the Tern Microcontroller. The software used was the custom C++ development environment provided with the microcontroller. The microcontroller was interfaced with the PC through RS-232 serial communication. The input received by the PID controller was the lateral position error of the vehicle in the path. The output of the controller is the voltage sent from the Analog to digital converter in the microcontroller. This voltage in turn is sent to the electro-hydraulic valve through the amplifier. The important sections of this code are presented in Appendix A. The flow of the program is as follows: 1. The microcontroller is first initialized. Then, the initialization is performed for the data, the decoder, the serial communication from the PC, the analog output from the Analog to digital card, and the PID control variables. The steering is also centered initially. 2. An infinite while loop is started. The exit from the while loop is based on receiving serial command from the PC. 3. The initial decoder value is obtained corresponding to the center of the steering. 4. Serial data is obtained from the high level vision or ladar algorithms running in the PC. The data is obtained in the form of four consecutive characters along with a start character. 161


5. Once the correct start character is obtained, the other 4 characters are analyzed. The first character provides information about the turn being left or right. 6. The three subsequent characters specify the error information. The three characters are the multipliers to the 100th place, the 10th place and the 1 place. If the characters obtained are a,b, and c, then the error is 100a+10b+c. Once this reconstruction is performed, the PID control is started. 7. In the PID control, the error obtained is used to calculate the derivative of the error and the integral of the error. 8. From these calculations, the gain is calculated as PID gain G = Kp(error) + Kd(change in error)+Ki(integral of the error) (Eq. 4-1) where Kp, Kd and Ki are the individual gains mentioned in Chapter 3. 9. This overall gain G is used to obtain the required steering angle change using an experimentally determined formula to convert the gain to required change in encoder pulses as Change in pulses = (0.5) Gain (7) (Eq. 4-2) Where, (7) corresponds to the encoder calibration that 7 pulses correspond to 1 degree turn of steering. 10. This required turn of the steering in the form of encoder pulses is used to turn the steering. For the steering, the required turn angle is converted to the required voltage to execute the turn at a reasonable speed. 11. The voltage value is sent to the analog to digital converter of the microcontroller. 12. Apart from these calculations, saturation limits are used to limit the voltages and steering to physical limitations of the hardware. 13. When the navigation is complete, the steering is centered and the execution is terminated. The manual control of steering is again available. Vision and Ladar Based Path Segmentation As mentioned earlier, the vision and ladar based path segmentation algorithms are implemented in the shoebox computer. The important code segments for these are presented in Appendix B and Appendix C. In the initial stages of the project, when the vision and ladar sensors were used independently for navigation, only one of these codes was used for navigation at a given time. This choice was made by the human operator as to which function was called. 162
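Tying steps 5 through 11 together, the following is an illustrative C++ sketch of the error reconstruction from the received digit characters, the PID update of Eq. 4-1, and the conversion to encoder pulses of Eq. 4-2. The gains and state handling are placeholders, not the values or code used on the tractor.

struct PidState { double kp, ki, kd, integral, prevError; };

// Rebuild the error from the three digit characters sent over the serial link.
int errorFromChars(char hundreds, char tens, char ones)
{
    return 100 * (hundreds - '0') + 10 * (tens - '0') + (ones - '0');
}

// Eq. 4-1: G = Kp*error + Kd*(change in error) + Ki*(integral of error).
double pidGain(PidState& s, double error)
{
    s.integral += error;
    double derivative = error - s.prevError;
    s.prevError = error;
    return s.kp * error + s.kd * derivative + s.ki * s.integral;
}

// Eq. 4-2: change in pulses = 0.5 * gain * 7, with 7 encoder pulses per degree of steering.
double pulsesFromGain(double gain)
{
    return 0.5 * gain * 7.0;
}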


The vision algorithm is implemented in the function Imagproc(). The function is called from the OnPaint function of Windows. The OnPaint is a screen refresh function. The vision algorithm is repeatedly called as often as Windows tries to refresh the screen. Therefore, the refresh rate of the image in the dialog box is dependent on the speed of the vision algorithm. The input to the vision algorithm is the pointer to the memory location where the image is stored. The output is the lateral error of the vehicle in the path. The vision program is implemented as follows: 1. The imagproc() function is called. The required variables are initialized. 2. Thresholding is performed for each pixel of the image using the specified threshold values. Binary image is formed. 3. Regions of the image are scanned using a data window and determination is made as to whether the image is of a case with lot of shadows or not. If it is a non-shadow image, thresholding is performed again using new threshold values. 4. Morphological operations are performed to segment the image further and remove salt and pepper noise left from thresholding. 5. The image is scanned row by row to determine the location of the boundary of path and non-path regions. The pixel locations are stored in an array. These locations correspond to the path boundaries. 6. This array is sent to the line fitting function GetBoundary(left) and GetBoundary(right) to fit lines for the path boundaries. 7. The lines are used to determine the center of the path GetBoundary(middle). 8. The required path to be taken is painted in the image. 9. Using the center of the path and using the center of the image as the vehicle center, the lateral error of the vehicle in the path is determined. This error is in pixel coordinates. 10. The error is sent to the PID based steering controller. The ladar based path segmentation is implemented using the function Ladar_Guidance(). The input to the function is the pointer to the memory location where the ladar data is stored. The output is the vehicle lateral error in pixel coordinates. The function is called from the function used to obtain the ladar data. Therefore, the Ladar_Guidance function is called as often as the 163

The ladar based path segmentation is implemented in the function Ladar_Guidance(). The input to the function is the pointer to the memory location where the ladar data is stored. The output is the vehicle lateral error in pixel coordinates. The function is called from the function used to obtain the ladar data. Therefore, the Ladar_Guidance() function is called as often as the ladar data is obtained, and the ladar algorithm is run in sequence. The implementation is as follows (a sketch follows the list):
1. The ladar data in polar coordinates is converted to Cartesian coordinates to determine the lateral and frontal distances to different objects. Only the frontal distance is used to determine the path. The data is an array of 360 values, covering 0 to 180 degrees at 0.5 degree increments.
2. The vehicle is initially stationary. The distance value of the 180th data point, at 90 degrees, is saved as the distance from the ladar to the ground.
3. The frontal distance data is scanned from 0 to 180 degrees. If a data point is at least 50 cm less than the ground distance, it is taken to be an object with some height (about 50 cm) above the ground. In the grove alleyway, an object of that height is assumed to be part of a tree.
4. Five successive points are then checked for an equal or greater height, to ensure that the originally detected point was not due to noise or stray grass.
5. The location of the closest tree point on the left and on the right side of the vehicle center is determined. The lateral distance corresponding to these locations is determined.
6. If the lateral distance is over 3 m, the point is taken to be a tree from an adjacent alleyway.
7. From the lateral distances of the tree points on the left and right sides of the vehicle, the required center point of the path is determined. The lateral distance to this center point is the error of the vehicle in the path.
8. The error is sent to the PID based steering controller. As the steering controller takes a pixel coordinate error as input, the error determined from the ladar is converted to pixel coordinates using the formula

    Lateral error in pixels = 200 - 0.74 * lateral_error   (Eq. 4-3)

This formula was determined experimentally, by trial and error, to correspond to an equivalent pixel error.
9. If trees are present only on one side of the vehicle, the path center is determined at a fixed distance from the tree on that side. If trees are not present on either side, zero error is reported until a tree is encountered.
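The following sketch condenses steps 1-3 and 6-8, assuming the scan is a vector of radial distances in cm at half-degree steps; it is not the Ladar_Guidance() code from Appendix C, and the five-point noise check (step 4) and the one-sided case (step 9) are omitted for brevity.

    #include <cmath>
    #include <vector>

    const double PI = 3.14159265358979;

    // groundDist is the 90 degree reading saved while the vehicle is stationary (step 2).
    int ladarErrorPixels(const std::vector<double>& radial, double groundDist)
    {
        const double step = 0.5 * PI / 180.0;
        double leftTree = -1.0, rightTree = -1.0;   // closest lateral tree distances (cm)

        for (std::size_t i = 0; i < radial.size(); ++i) {
            double angle   = i * step;                    // 0 rad at one end of the sweep
            double frontal = radial[i] * std::sin(angle); // distance ahead of the vehicle
            double lateral = radial[i] * std::cos(angle); // signed: one side +, the other -

            if (groundDist - frontal < 50.0) continue;    // step 3: not high enough to be a tree
            if (std::fabs(lateral) > 300.0) continue;     // step 6: tree in an adjacent row

            if (lateral >= 0.0 && (rightTree < 0 || lateral  < rightTree)) rightTree = lateral;
            if (lateral <  0.0 && (leftTree  < 0 || -lateral < leftTree))  leftTree  = -lateral;
        }
        if (leftTree < 0 || rightTree < 0) return 200;    // Eq. 4-3 with zero lateral error

        // Step 7: path center halfway between the nearest trees on each side.
        double error = (rightTree - leftTree) / 2.0;      // cm, sign gives the turn direction
        return static_cast<int>(200.0 - 0.74 * error);    // Eq. 4-3
    }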

Sensor Fusion Based Navigation
The development of the Kalman filter based sensor fusion method was described in Chapter 3. The code segment for the Kalman filter implementation is presented in Appendix D. The sensor fusion is implemented in the function Fusion(). The inputs are the measurement values from vision, ladar, the yaw from the IMU and the speed from the speed sensor. The output of the function is the fused error in pixel coordinates. The variables are intuitively named to be similar to the variable names used in the algorithm development. The function is called from the function OnPaint() when both vision and ladar, or at least one of them, are operational. The fuzzy logic supervisor and divergence detector are implemented as a set of if-then rules identical to the explanation in Chapter 3. The function is implemented as follows:
1. The variables are initialized. The measurement values for each time step are taken as input to the measurement vector. The matrices are implemented as two dimensional vectors and the vectors as arrays. If the function is called for the first time, the yaw angle is taken as the initial heading angle of the vehicle. The process and measurement noise covariance matrices are initialized.
2. The vision and ladar data are checked for range. If the range is unreasonable, the corresponding measurement noise covariance value is increased. This is performed using fuzzy logic if-then rules similar to the description in Chapter 3 (a sketch is given after this list).
3. The time update equations 3-17 and 3-18 are first executed to compute the state estimate and the state estimate error covariance matrix.
4. The measurement update equation 3-20 is executed to compute the gain. Equations 3-21 and 3-22 are then executed to compute the posterior estimate of the state and the posterior estimate of the state estimate error covariance matrix.
5. The fused error is obtained from the posterior estimate of the state. This error, in pixel coordinates, is sent to the low level controller for steering.
6. The innovation vector is computed using equation 3-31. This is used as the input for detecting divergence.
7. The process noise covariance matrix is updated if divergence is detected.
A large amount of matrix memory is used for this implementation. Therefore, the memory allocation is made before starting the fusion routine and freed before exiting the program.
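The fragment below illustrates step 2 in isolation: a simple if-then supervisor that inflates the measurement noise variance of a sensor whose reading falls outside a plausible range, so the Kalman filter weights it less. The ranges, variable names and scale factors are illustrative assumptions, not the values used in Appendix D.

    #include <cmath>

    // Inflate a sensor's measurement noise variance when its reading is implausible,
    // so the filter trusts that sensor less for this time step (step 2 above).
    struct MeasurementNoise {
        double visionVar;   // variance of the vision lateral error
        double ladarVar;    // variance of the ladar lateral error
    };

    void superviseNoise(double visionErrorCm, double ladarErrorCm, MeasurementNoise& r)
    {
        const double maxPlausibleCm = 250.0;   // assumed half alleyway width, illustrative
        const double inflate        = 100.0;   // assumed penalty factor, illustrative

        if (std::fabs(visionErrorCm) > maxPlausibleCm)  // vision reading out of range
            r.visionVar *= inflate;
        if (std::fabs(ladarErrorCm) > maxPlausibleCm)   // ladar reading out of range
            r.ladarVar *= inflate;
    }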

Software Architecture for eGator Guidance
The hardware architecture of the guidance system for the eGator was presented in Chapter 3. The high level algorithms are implemented in a PC running the Windows XP operating system. The low level algorithms are implemented in a PC104 based computer running the Linux (SUSE 10) operating system.
Figure 4-3. Software architecture of the eGator guidance system (high level, Windows: vision, ladar, Kalman filter sensor fusion, headland navigation; low level, Linux: dead reckoning, Pure Pursuit steering, speed control; error and speed commands pass over RS232, steering over CAN and speed over an analog output)
The software architecture of the guidance system used with the eGator is shown in Figure 4-3. The lateral error of the vehicle and the required speed of the vehicle are sent from the high level computer to the low level computer using RS232 serial communication. The low level controller outputs the steering angle as CAN commands and the required command for the digital to analog card to output the voltage to control speed. The low level controller also monitors a serial port for information from the vehicle's wheel encoder to perform dead reckoning and speed control. The high level computer and the software are the same as those used with the tractor. Therefore, the high level software is identical in operation, using dialog box based control. The low level software is implemented in C++ using the GNU C++ compiler. The low level software execution is started from the command prompt, and it is then turned off using commands from the high level controller. For monitoring the activities of the low level software, informational text strings are printed in the terminal window. A hardware switch is included between the low level computer and the drive motor controller. As soon as the low level software is started, it polls the serial port for error information from the high level computer. Therefore, the hardware switch should be turned to autonomous mode only when both the high level and low level software are operational.

Pure Pursuit Steering
The algorithm for the Pure Pursuit steering was presented in Chapter 3. The code segment for the software implementation is presented in Appendix E. The algorithm is operational as soon as the low level software is started. The input to the algorithm is the lateral error in pixel coordinates from the high level controller. The output of the algorithm is the required steering angle. The software implementation is as follows (a sketch follows the list):
1. The lateral error of the vehicle is obtained in pixel coordinates.
2. The error is determined to correspond to either a left turn or a right turn.
3. The error in pixel coordinates is converted to distance coordinates. The look-ahead distance is chosen as 6 m. At 6 m, a calibration was made that 180 pixels in the image correspond to 2.43 m of lateral distance. This calibration is used to convert the error in pixels to meters.
4. The error is used to compute the required radius of curvature of the vehicle path using equation 3-35.
5. The steering angle is computed from the radius of curvature using equations 3-38 and 3-39.
6. The steering angle is sent to the CAN controller using CAN commands.
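A compact sketch of steps 3-5 is given below. Equations 3-35, 3-38 and 3-39 are not reproduced in this chapter, so the sketch uses the textbook pure pursuit relations R = L^2 / (2x) and steering angle = atan(wheelbase / R); the wheelbase value and the function name are assumptions for illustration only.

    #include <cmath>

    // Convert a lateral error in pixels to a steering angle in degrees using
    // standard pure pursuit relations (stand-ins for equations 3-35, 3-38 and 3-39).
    double purePursuitSteeringDeg(double errorPixels)
    {
        const double lookAhead = 6.0;            // m, look-ahead distance (step 3)
        const double pixelsToM = 2.43 / 180.0;   // m per pixel at 6 m (step 3 calibration)
        const double wheelbase = 1.9;            // m, assumed eGator wheelbase

        double x = errorPixels * pixelsToM;      // lateral offset of the goal point (m)
        if (std::fabs(x) < 1e-6) return 0.0;     // already on the path: no steering change

        double radius   = (lookAhead * lookAhead) / (2.0 * x);   // pure pursuit curvature
        double steering = std::atan(wheelbase / radius);         // bicycle model steer angle
        return steering * 180.0 / 3.14159265358979;              // sign gives left/right
    }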

Headland Turning
The development of the headland turning algorithms is discussed in Chapter 3. The algorithms are implemented in the PC in the Visual C++ environment. The important code segments are presented in Appendix F. This software is implemented as functions called from the vision based alleyway navigation function, when the headland approach is determined using vision. When approaching the headland, the sweeping ladar algorithm is first called to determine the exact location of the headland. The sweeping ladar algorithm is implemented in the function treedladarprocess(). The steps implemented in the function are as follows:
1. The ladar data is converted from 3D polar coordinates to 3D rectangular coordinates using equations 3-41, 3-42 and 3-43.
2. The data is cropped to eliminate unreasonable distances in the x, y and z directions.
3. To reduce computation time, the data in the 3D Cartesian space is projected onto the 2D Cartesian plane.
4. The 2D data is treated similarly to an image to determine the locations of the trees.
5. From the locations of the trees, the headland location is precisely determined.
When the vehicle is at the end of the alleyway, the turning functions are called. These are implemented in two functions, HeadlandU() for the u-turn and Headland3pt() for the switch-back turn. The choice of whether a u-turn or a switch-back turn is performed is specified in the program by the user. The input to the two functions is the yaw angle from the IMU and the output is the steering command to the low level controller, specified as an error in pixel coordinates. The implementation of the turns is based on the algorithm development presented in Chapter 3. The u-turn is implemented as follows:
1. The machine vision algorithm is turned off during the headland turn. Initialization of variables is performed. The angle at which the vehicle exits the alleyway is saved. The turn is performed such that the vehicle turn is referenced to this saved angle.
2. Corrections are then calculated (a sketch is given after this list). This can be explained using an example. It is to be noted that the IMU outputs yaw angles in the range of 0 to 360 degrees in the clockwise direction. If the vehicle exits the alleyway at an angle of, say, 30 degrees and an anti-clockwise u-turn is to be performed, the final angle output by the IMU after the turn would be 210 degrees, whereas the mathematical calculation in software would be 30 - 180, or -150 degrees. To avoid this difference due to the cyclic nature of the data, the appropriate corrections are calculated.
3. If a left u-turn is required, as soon as the vehicle exits the alleyway, a right turn is made until the vehicle turns by a specified angle. Note that the vehicle also moves forward while turning. This turn in the opposite direction is made to allow a wide u-turn.
4. Then the left turn command is given. The yaw angle is continuously monitored. When the yaw angle is close to 180 degrees with respect to the original saved angle, control is switched to the vision and ladar based alleyway navigation functions. The alleyway navigation functions make the vehicle enter the alleyway and continue navigating the next alleyway.
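The wrap-around correction in step 2 can be written as a small helper that reports how far the vehicle has turned relative to the saved exit angle, regardless of where the 0/360 degree boundary falls. The function name is hypothetical.

    #include <cmath>

    // Angle turned since the alleyway exit, using IMU yaw in [0, 360) degrees.
    // Returns a value in (-180, 180]: positive for clockwise, negative for anti-clockwise.
    double turnedSinceExit(double exitYawDeg, double currentYawDeg)
    {
        double diff = std::fmod(currentYawDeg - exitYawDeg, 360.0);
        if (diff > 180.0)   diff -= 360.0;   // e.g. exit 30, current 300 -> -90 (anti-clockwise)
        if (diff <= -180.0) diff += 360.0;
        return diff;
    }

    // During a left (anti-clockwise) u-turn the vehicle keeps turning until the
    // magnitude of turnedSinceExit() approaches 180 degrees (step 4).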

The switch-back turn is implemented similarly to the u-turn function. The difference is that the initial turn in the opposite direction is not made, and after the vehicle has turned by 90 degrees with reference to the initial saved angle, the vehicle is reversed for a specified distance before the turn is continued. The Gator.Write command is used to send the data to the low level controller over serial communication.
Open Field Navigation and Obstacle Detection
The development of the open field navigation and obstacle detection algorithms was discussed in Chapter 3. The algorithms are implemented in the PC in the Visual C++ environment. The open field navigation uses a GPS based navigation algorithm for steering the vehicle, whereas the obstacle detection algorithm is based on the ladar and is used to stop the vehicle when an obstacle is present. The important sections of the code are presented in Appendix G.
Open Field Navigation
In the GPS based navigation, the algorithm determines how much the vehicle is to be steered to reach a way point. The Pure Pursuit steering discussed earlier uses error information in pixel coordinates to compute the steering angle of the vehicle. Therefore, the GPS based navigation algorithm converts the required steering to error information and passes it on to the Pure Pursuit steering algorithm in the PC104 computer. The software is implemented as follows (a sketch of the bearing and error calculation follows the list):
1. The way point file is opened and read to acquire the latitude and longitude coordinates of the way points. The values are stored in an array. The file is then closed. This is implemented in the function ReadGpsCoord().
2. To navigate the way points, the function GPSnavigate() is called. The function includes a loop which runs until all the way points have been reached.
3. If all the points have not been reached, the distance to the way point, GPSdistance, and the required heading of the vehicle, GPSbearing, are first calculated.
4. The IMU is mounted such that the device points towards geographic south. As mentioned earlier, the GPS based navigation algorithm provides the heading with reference to geographic north. Therefore, to use the same reference for both, the required heading calculated using the GPS coordinates is shifted by 180 degrees.
5. The heading required of the vehicle, calculated using the GPS coordinates, and the current heading of the vehicle are compared to determine the direction and amount of steering. This is implemented in a series of if-then conditions.
6. The calculated steering angle is then converted to an error in pixel coordinates. The data is filtered using a median filter to eliminate noise.
7. The pixel error is sent to the PC104 computer running the Pure Pursuit steering algorithm.
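As an illustration of steps 3-5, the sketch below computes the bearing from the current GPS fix to the next way point and turns the heading difference into a signed steering error. The flat-earth approximation and the function names are simplifying assumptions; the actual GPSnavigate() implementation in Appendix G may differ.

    #include <cmath>

    const double DEG = 3.14159265358979 / 180.0;

    // Bearing from the current position to the way point, in degrees clockwise from north.
    // A local flat-earth approximation is used, which is adequate over grove-scale distances.
    double bearingToWaypointDeg(double latDeg, double lonDeg, double wpLatDeg, double wpLonDeg)
    {
        double dNorth  = (wpLatDeg - latDeg);
        double dEast   = (wpLonDeg - lonDeg) * std::cos(latDeg * DEG);
        double bearing = std::atan2(dEast, dNorth) / DEG;     // -180..180
        return bearing < 0.0 ? bearing + 360.0 : bearing;     // 0..360
    }

    // Signed heading difference (steps 4 and 5): positive means steer one way, negative the other.
    // The required heading is shifted by 180 degrees to match the IMU reference (step 4).
    double headingErrorDeg(double bearingDeg, double imuYawDeg)
    {
        double required = std::fmod(bearingDeg + 180.0, 360.0);
        return std::fmod(required - imuYawDeg + 540.0, 360.0) - 180.0;   // -180..180 degrees
    }

    // Step 6 would scale this angle to a pixel error before sending it to the Pure
    // Pursuit controller; the scale factor is not documented in this chapter.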

Obstacle Detection
In the obstacle detection algorithm, whenever an obstacle is detected, a unique command is sent to the PC104 computer to stop the vehicle. The obstacle detection algorithm is implemented as follows (a sketch follows the list):
1. The obstacle detection is implemented in the function LadarObstacleDetection(). Initializations are first performed.
2. The ladar data is obtained in the form of scan angles and radial distances, SIC.MVA[i]. The radial distance is then converted to frontal distance.
3. When the vehicle is initially stationary, the frontal distance corresponding to the 90 degree scan angle is saved as the distance to the ground in front of the vehicle.
4. The data is scanned from 70 degrees to 110 degrees to look for data points that are at least 50 cm shorter than the ground distance. If such a data point is found, and if five consecutive points also satisfy this condition, the presence of an obstacle is determined. The range from 70 degrees to 110 degrees spans 40 degrees and covers the area directly in front of the vehicle.
5. If an obstacle is found, a stop command is sent to the PC104 computer to reduce the speed to zero.
6. The function is called repeatedly as long as the GPS based navigation is operational. Therefore, if the obstacle is removed, the stop signal is no longer sent and the vehicle continues to move forward.
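A minimal sketch of steps 3-5 follows, assuming the scan is stored as frontal distances in cm indexed by half-degree steps (so 70-110 degrees corresponds to indices 140-220); the container and names are illustrative, not the Appendix G code.

    #include <vector>

    // Returns true if an obstacle is detected in the 70-110 degree window (step 4).
    // frontal[i] is the frontal distance in cm at i * 0.5 degrees; groundDist is the
    // 90 degree reading saved while the vehicle was stationary (step 3).
    bool obstacleAhead(const std::vector<double>& frontal, double groundDist)
    {
        const int start = 140, end = 220;      // 70 to 110 degrees at 0.5 degree steps
        int consecutive = 0;

        for (int i = start; i <= end && i < static_cast<int>(frontal.size()); ++i) {
            if (groundDist - frontal[i] > 50.0)    // return is >50 cm shorter than the ground
                ++consecutive;
            else
                consecutive = 0;
            if (consecutive >= 5)                  // five consecutive points confirm an obstacle
                return true;                       // caller then sends the stop command (step 5)
        }
        return false;
    }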

CHAPTER 5
DEVELOPMENT OF MACHINE VISION AND LASER RADAR BASED AUTONOMOUS VEHICLE GUIDANCE SYSTEMS FOR CITRUS GROVE NAVIGATION
Introduction
Florida supplies over 80 percent of the United States' supply of citrus. Over the years, the labor supply for citrus harvesting has declined. Immigration restrictions have limited the movement of migrant workers, and the citrus industry is also facing increased competition from overseas markets. As a result, the need for automation is being felt in the citrus industry. An important part of the automation process is vehicle guidance. Sample autonomous vehicle applications include harvesting, disease or nutrient deficiency monitoring, mowing, spraying and other tasks.
There are numerous autonomous and tele-operated vehicle systems described in the literature. Tele-operation has been used for guiding an HST drive vehicle by Murakami et al. (2004). The major challenges encountered in tele-operation are the time delay in communication and the need for the full-time attention of a human operator. Yekutieli and Pegna (2002) used a sensing arm to sense plants in the path for guidance in a vineyard. However, using an arm would require citrus groves to be even, with a continuous canopy, and there are additional concerns about damaging tree branches. Ultrasonic sensors have been used for guidance in greenhouses, but they require that the target be perpendicular to the sensor for the ultrasonic waves to be reflected back properly (Burks et al., 2004). Dead reckoning is widely used in combination with other sensors for autonomous vehicles. Nagasaka et al. (2002) and Kodagoda et al. (2002) used rotary encoders for detecting vehicle position. However, wheel slip can cause significant error in distance measurements.

GPS, in combination with inertial navigation systems, is used in other areas of agricultural vehicle guidance and precision agriculture. Both Real Time Kinematic (RTK) GPS and Differential GPS (DGPS) can be satisfactory and allow accurate real time measurement. Nagasaka et al. (2002), Benson et al. (2001) and Noguchi et al. (2002) found that RTK GPS receivers give very accurate results. Ehsani et al. (2003) evaluated the dynamic accuracy of some low cost GPS receivers with the position information from RTK GPS as reference. They found that these receivers had an average absolute cross track error of around 1 m when traveling in a straight line. GPS cannot be used effectively for positioning in citrus applications, since the vehicle frequently moves under the tree canopy, which can block the satellite signals to the GPS receiver. Moreover, a system using GPS for guidance requires that a predetermined path be given for the vehicle to follow. Consequently, significant time has to be spent in mapping the path.
Laser radar (ladar) has been used for ranging and obstacle avoidance. Carmer and Peterson (1996) discussed the use of laser radar for various applications in robotics. Autonomous vehicle navigation was discussed as a promising application for ladar, due to its ability to measure position accurately. Gordon and Holmes (1998) developed a custom built ladar system to continuously monitor a moving vehicle's position; the testing was done indoors. Ahamed et al. (2004) used a ladar for developing a positioning method using reflectors for infield road navigation. They tested differently shaped reflectors to determine the accuracy in positioning. Ladar has been used for navigating a small vehicle through an orchard (Tsubota et al., 2004), and a guidance system using ladar was found to be more stable than one using GPS in a citrus orchard setting. In the research mentioned above, ladar was tested outdoors in the absence of high dust conditions. It is expected that the operation of ladar in dust, fog and similar conditions which block light would result in reduced performance.

Machine vision has been applied in mobile robots by numerous researchers. Its low cost and good performance have made it a good candidate for the main guidance sensor, and machine vision and ladar guidance system performance can be comparable (Subramanian et al., 2004). Misao (2001) used machine vision for an automatic steering system on a field sprayer. A video camera was used to acquire the image of the travel path with red targets. Image processing algorithms determined the distance from the current vehicle position to the target, and the actual position was then compared with the desired vehicle path and corrected by automatic steering control. Han et al. (2002) developed a row segmentation algorithm based on k-means clustering to segment crop rows. This information was then used to steer a tractor. The guided tractor was able to perform field cultivation in both straight and curved rows. Okamoto et al. (2002) developed an automatic guidance system for a weeding cultivator. A color CCD camera acquired the crop row images and, by processing the images in the computer, the offset between the machine and the target crop row was determined. The weeding machine was then steered through the crop row using an electro-hydraulic steering controller.
A good control system is necessary irrespective of the sensor used for guidance. Feedforward + PID control worked well for guiding a tractor through crop rows (Kodagoda et al., 2002). PID control has been used for guiding a grain harvesting tractor (Benson et al., 2003); in that research, PID was used to calculate the actuator command signal based on the heading offset, and the performance of the controller was comparable to that of manual steering. Cho and Ki (1999) used a fuzzy logic controller and machine vision for guiding an autonomous sprayer vehicle through orchards. The input information to the fuzzy logic controller was given by both machine vision and ultrasonic sensors.

To date, there has been minimal success in developing commercial autonomous navigation systems for citrus grove applications. However, other similar applications have provided valuable insights for this research. The vehicle used in this research was a John Deere 6410 tractor. Machine vision and laser radar were used as the primary guidance sensors. PID control was selected for controlling the steering of the autonomous vehicle, because it has performed very well in previous mobile robotics research. The overall objective is to design a guidance system which will make the tractor capable of navigating through an alleyway of a citrus grove. Before testing in a citrus grove, it was decided to test the vehicle in a portable path constructed of hay bales. The following specific objectives were selected for this research:
Develop an electro-hydraulic steering circuit for the vehicle.
Develop a mathematical model of the vehicle's dynamics and design a PID controller for steering control.
Develop two algorithms for path finding, one using machine vision and the other using laser radar.
Test the guidance system's performance in a test track and confirm guidance in a citrus grove.
Materials and Methods
A John Deere 6410 tractor was selected for this study. It is well suited for such environments and might commonly be used in spray applications. The width of the tractor was about 2.5 m. To accommodate electronic steering, the tractor was retrofitted with a Sauer Danfoss PVG 32 proportional servo valve. The valve was plumbed in parallel with the existing hydrostatic steering system, as shown in Figure 5-1. The valve required an analog voltage signal between 0 and 10 V as input.

Figure 5-1. Electro-hydraulic retrofit of the tractor for automatic guidance
A Stegmann incremental rotary encoder was mounted on the front steering cylinder for wheel angle feedback. A common 2.4 GHz P4 processor running the Windows 2000 operating system was used for processing the vision and ladar algorithms. A Tern 586 engine microcontroller was used for executing real-time control of the servo valve and encoder feedback loop. An amplifier circuit was used to scale the control voltage from the microcontroller to the requirements of the valve. The guidance system architecture is shown in Figure 5-2. The encoder was calibrated by turning the wheels to different angles and recording the number of pulses given by the encoder. The calibration was determined as 7 pulses per degree of turn. Power was taken from the tractor's 12 V DC battery and inverted to 120 V AC for running the PC and the ladar. The 12 V DC tractor supply was used to power the valve and the microcontroller.

Figure 5-2. Guidance system architecture of the vehicle (camera and ladar connected to the PC; microcontroller and amplifier driving the electro-hydraulic steering with encoder feedback)
The camera used for machine vision was a Sony FCB-EX780S block single CCD analog color video camera with a standard lens and a resolution of 640x480. A frame-grabber board was used to convert the analog signals to digital images. The ladar used was the SICK LMS 200, which has a maximum sweep angle of 180 degrees and a maximum range of 80 m. RS232 serial communication at 115200 baud was used to communicate with the ladar. A StarFire differential GPS (DGPS) receiver with 30 cm accuracy was used for determining the vehicle dynamics under open field conditions. The camera, ladar and DGPS receiver were mounted on top of the cab. The camera and the ladar were mounted at a 45 degree angle looking 1.5 m in front of the tractor, as shown in Figure 5-3.
Figure 5-3. Camera and ladar mounted on top of the tractor cab

For machine vision, color was used as the discriminator for segmenting the path. To account for varying weather conditions, images were collected over a period of 6 days in two months, from morning to evening at half hour intervals. Based on this database of images, a segmentation algorithm was developed, as shown in Figure 5-4. From the set of images, three types of conditions were observed that require the threshold value to be changed: cloudy days, when the trees are darker than the path; bright sunny days, when the trees are darker than the path but all pixel intensity values are elevated; and early morning and evening, when the sunlight causes the trees on one side of the row to be brighter than the path and the trees on the other side to be darker than the path. Two R, G, B threshold values were chosen from the image database corresponding to these conditions. To find these R, G, B threshold values, the set of images corresponding to the early morning and evening conditions is taken first, and the highest of the R, G, B pixel values in the trees on the side which is darker than the path is chosen as the threshold value. Next, the sunny day condition images are taken and the highest of the R, G, B pixel values in the trees on both sides of the path is taken as the threshold value. To accommodate these conditions automatically, an adaptive thresholding method was used, as shown in Figure 5-4.

Figure 5-4. Machine vision algorithm flowchart
Then, a series of morphological operations was performed to clearly segment the path, and a line was fit to each boundary using the least squares method. The error was calculated as the distance from the path center. Camera calibration was done to convert pixel distance to true distance; this was done by marking known distances on the ground in the camera's field of view and counting the number of pixels corresponding to these distances.
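The boundary line fit mentioned above is an ordinary least squares fit of column position against row index. A generic version is sketched below; it is not the boundary-fitting code from the appendices, and the parameterization (column as a function of row) is an assumption that suits near-vertical path boundaries.

    #include <cstddef>
    #include <vector>

    // Fit col = m * row + b to boundary points by ordinary least squares.
    // Returns false when the fit is degenerate (fewer than two distinct rows).
    bool fitBoundaryLine(const std::vector<int>& rows, const std::vector<int>& cols,
                         double& m, double& b)
    {
        const std::size_t n = rows.size();
        if (n < 2 || cols.size() != n) return false;

        double sr = 0, sc = 0, srr = 0, src = 0;
        for (std::size_t i = 0; i < n; ++i) {
            sr  += rows[i];
            sc  += cols[i];
            srr += static_cast<double>(rows[i]) * rows[i];
            src += static_cast<double>(rows[i]) * cols[i];
        }
        const double denom = n * srr - sr * sr;
        if (denom == 0.0) return false;      // all points on one row: fit undefined

        m = (n * src - sr * sc) / denom;     // slope of the boundary in the image
        b = (sc - m * sr) / n;               // intercept (column at row 0)
        return true;
    }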

To eliminate fluctuation in path tracking due to natural light and camera noise, a time based outlier elimination routine was added to the error calculation (a sketch of this check is given after Figure 5-5). This routine assumes that the error data in a short interval of time, of the order of 1 s, is normally distributed. When a new error value is found to lie outside two standard deviations of the data from the previous time interval, it is treated as noise and is not used to change the steering for the present time step. If, however, successive error values are in the same range, they are taken into account for correcting the steering. An example of grove path estimation using the machine vision algorithm is shown in Figure 5-5. Note the path boundary depicted by the superimposed straight lines along the tree boundary. For navigation experiments on the test path, the color threshold for the machine vision algorithm was adapted for detecting hay bales instead of trees.
Figure 5-5. Machine vision results for a citrus grove alleyway. A) Raw image. B) Tree canopy segmentation. C) Path boundary.
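A minimal version of the two-standard-deviation check is shown below; the window size and the fall-back behavior when the buffer is nearly empty are assumptions for illustration.

    #include <cmath>
    #include <deque>

    // Reject a new error sample if it lies more than two standard deviations away
    // from the samples collected over roughly the previous second.
    class OutlierFilter {
    public:
        // Returns true if the sample should be used for steering this time step.
        bool accept(double error)
        {
            bool ok = true;
            if (window_.size() >= 5) {                    // about 1 s of data at the vision rate
                double mean = 0.0, var = 0.0;
                for (double e : window_) mean += e;
                mean /= window_.size();
                for (double e : window_) var += (e - mean) * (e - mean);
                double sd = std::sqrt(var / window_.size());
                ok = std::fabs(error - mean) <= 2.0 * sd;
            }
            window_.push_back(error);                     // outliers still enter the window, so
            if (window_.size() > 20) window_.pop_front(); // repeated similar values are accepted later
            return ok;
        }
    private:
        std::deque<double> window_;
    };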

The laser radar navigation algorithm employed a threshold distance based detection of hay bales. The flowchart of the algorithm is shown in Figure 5-6. The plot of the radial distance measured by the ladar at different angles, when driving through the test path, is shown in Figure 5-7. The discontinuities in the plot indicate the locations of the hay bales. While scanning the field of view using the laser radar, if such discontinuities are found in the data, they are marked as hay bales. The path center was determined as the center of the path between the hay bales on either side.
Figure 5-6. Ladar algorithm flowchart (get scan data; move a window over the scan and look for a discontinuity; find the hay bale on one side; continue moving the window and look for a discontinuity; find the hay bale on the other side; find the path center as the midpoint between the hay bales; path error = path center - current position)

Figure 5-7. Radial distance (cm) measured by the laser radar versus scan angle (degrees) in the hay bale path
For the determination of the vehicle's dynamics, open loop frequency response tests were conducted in an open field scenario. The tractor engine was set at 2200 rpm and three different speeds of 1.8 m/s (slow), 3.1 m/s (medium) and 4.4 m/s (fast) were set by varying the gears. As the tractor was run, the wheels were automatically turned in a time based sinusoid. The tractor's position was recorded in real time using the DGPS receiver. The tests were conducted in an open field without any trees or buildings blocking the signal to the DGPS receiver. The receiver communicated with the PC via RS232 serial communication. The steering turn frequency was varied from 0.07 Hz to 0.3 Hz; frequencies below 0.07 Hz caused the tractor to veer out of the test area, and frequencies above 0.3 Hz were thought to be detrimental to the tractor. It was observed that the vehicle's position information obtained using the DGPS receiver showed a sinusoidal response at the same frequency as the steering angle. The lateral displacement of the vehicle for a given change in steering angle was recorded as the gain, where the lateral displacement is the displacement of the vehicle in a direction perpendicular to the direction of initial motion. The lateral displacement of the vehicle was calculated from the latitude and longitude data obtained from the DGPS receiver using ArcView 3.3 (ESRI Inc., Redlands, CA).

The accuracy of ArcView 3.3 was verified against manually measured distances for a few test runs. The gain was plotted against frequency in a bode plot, as shown in Figure 5-8. The response was observed to be linear, and visually fit lines were used to model this response.
Figure 5-8. Open loop theoretical and actual frequency response (magnitude in dB) of the sinusoidal vehicle dynamics test (phase = -180 deg)
The sinusoidal vehicle dynamics data was used to calculate the transfer function between the steering angle and lateral displacement as

    G(s) = Lateral_displacement(s) / Steering_angle_change(s) = k / s^2   (Eq. 5-1)

The gain k was speed sensitive and was determined as 0.02 (1.8 m/s), 0.07 (3.1 m/s) and 0.13 (4.4 m/s). This model agreed with the model reported by Stombaugh et al. (1999). Based on the determined theoretical model, a PID control system was designed (Figure 5-9). The PID gains were tuned by simulation in Simulink, using the Ziegler-Nichols method of tuning for a closed loop system. Tuning was optimized for zero steady state error and low overshoot.
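To make the model concrete, the short program below simulates the k/s^2 plant of Eq. 5-1 under PID steering control with a simple Euler discretization, producing a 1 m step response of the kind shown in Figure 5-10. The PID gains here are arbitrary placeholders chosen only so the example settles, not the Ziegler-Nichols values used in the dissertation.

    #include <cstdio>

    // Euler simulation of the double integrator G(s) = k / s^2 with PID feedback.
    int main()
    {
        const double k  = 0.07;                      // plant gain at 3.1 m/s (Eq. 5-1)
        const double Kp = 14.0, Ki = 0.5, Kd = 26.0; // placeholder PID gains
        const double dt = 0.01, target = 1.0;        // 10 ms step, 1 m lateral step command

        double y = 0.0, v = 0.0;                     // lateral position (m) and its rate
        double integral = 0.0, prevErr = target;

        for (int i = 0; i <= 1500; ++i) {            // simulate 15 s
            double err = target - y;
            integral  += err * dt;
            double u   = Kp * err + Ki * integral + Kd * (err - prevErr) / dt;
            prevErr    = err;

            v += k * u * dt;                         // the command acts through the k/s^2 plant
            y += v * dt;

            if (i % 100 == 0)                        // print once per simulated second
                std::printf("t=%4.1f s  y=%.3f m\n", i * dt, y);
        }
        return 0;
    }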

Figure 5-10 shows the 1 m step response when simulated for the 3.3 m/s speed.
Figure 5-9. Simulated vehicle control system block diagram
Figure 5-10. Simulated 1 m step response of the vehicle at 3.3 m/s
Experimental Procedure
Typical citrus grove alleyways are practically straight rows. It was felt that testing the vehicle in straight and curved paths at different speeds would be a more rigorous test of the guidance system's performance than testing in straight grove alleyways. For this purpose, a flexible wall track was constructed using hay bales to form a test path. The hay bales provide a physically measurable barrier, which aids in operating the ladar guidance system, and a color contrast with the grass, which is useful for vision based guidance.

In addition, when testing a large tractor, only minor losses would occur if the tractor ran over a barrier of hay bales. The control algorithms used with machine vision or ladar based guidance are practically identical for either test track. The only modification in the vision algorithm was the threshold value for image segmentation; these threshold values were determined for both paths using the techniques described earlier. The PID controller and the encoder performance are independent of the physical environment around the vehicle. The hay bale width was 45 cm and the boundary height was 90 cm after stacking hay bales two deep. Two types of path were used, a straight path and a curved path. The length of the straight path was 22 m. A 10 m extension was added to the straight path to form the curved path. The radius of curvature was 17 m at the curve. The path width was 3.5 m throughout the length. The paths used in this study to evaluate the guidance system performance are shown in Figures 5-11(A) and 5-11(B).
Figure 5-11. Guidance system test path. A) Straight. B) Curved.
Experiments were conducted to check the robustness and accuracy of the vision and laser radar guidance systems in navigating the different test paths at different speeds. Experiments were conducted at three different speeds: 1.8 m/s, 3.1 m/s and 4.4 m/s. These speeds corresponded to the perceived slow, medium and fast speeds for a tractor operating in a grove.

The speeds were measured using an ultrasonic ground speed sensor. The vehicle was manually driven to the starting position, resulting in different initial offset and heading angles for each repetition. Once positioned, the vehicle was started under autonomous control and allowed to navigate down the path. Three repetitions were performed for each experiment. A rotating blade was attached to the draw bar, which marked a line on the ground as the vehicle moved. This line represented the path center traveled by the tractor. The errors were manually measured after each run using the distance from the marked line to the hay bale boundary. The error measurements were accurate to 0.1 cm and were taken at regular intervals of 30 cm. Due to the unpredictable nature of the starting and ending data, the first 5% and last 5% of each data record were discarded for statistical analysis. However, the plots show the full data record, which illustrates the random nature of the vehicle starting position. Data points were taken from each run as described above in order to calculate the path root mean square error, standard deviation, maximum error and average error. The RMS error equation is shown in Equation 5-2:

    RMS error = sqrt( (1/N) * sum over i of e_i^2 )   (Eq. 5-2)

where e_i is the path error and N is the total number of data points. These measures were used to compare the performance of the two guidance sensors, the two different path configurations and the three speeds.
After achieving satisfactory performance in the test path, trials were conducted in an alleyway of the citrus grove. For the initial grove trials, only machine vision was used, because machine vision adapts easily to grove conditions, whereas ladar requires that additional data filters be designed to compensate for noise due to the random orientation of the tree leaves.

Future development is planned to test the ladar in the grove. The machine vision guidance system was used to guide the tractor through an alleyway of a citrus grove on the University of Florida campus, as shown in Figure 5-12. The average path width was 4.6 m. The tractor was driven to the grove and started in the automatic mode at the beginning of an alleyway. The guidance system was allowed to guide the tractor through the entire length of the alleyway and the performance was visually observed.
Figure 5-12. Vehicle in the citrus grove alleyway
Results and Discussion
Tests were conducted for each of the sensor and track conditions to observe the behavior of the vehicle at various speeds. During each trial run, the path position and error were recorded manually using a tape measure. The RMS error, average error, standard deviation of the error and the maximum error were calculated for each experiment by averaging the data over the three repetitions of each condition. The results were then compared for the different conditions. The path error, which is the deviation from the centerline of the alleyway, was plotted over the length of the test path for the different conditions. Path errors were adjusted for local variation in the hay bales. The performance measures are shown in Table 5-1 for the straight path and Table 5-2 for the curved path.

It should be noted that the Table 5-2 data are broken down into straight, curved and overall path errors.
Table 5-1. Performance measures of the vehicle's guidance system obtained from the experiments conducted in the straight test path

    Path      Sensor   Speed (m/s)   Average Error (cm)   Maximum Error (cm)   Standard Deviation (cm)   RMS Error (cm)
    Straight  Vision   1.8           2.0                  3.9                  0.8                       2.2
                       3.1           2.8                  5.9                  1.2                       3.1
                       4.4           2.8                  6.1                  1.5                       3.2
              Ladar    1.8           1.6                  3.3                  0.7                       1.8
                       3.1           1.6                  3.9                  0.8                       1.8
                       4.4           3.1                  6.1                  1.6                       3.5

Figure 5-13 shows the performance of the machine vision guidance system at the three different test speeds. The average error was less than 3 cm for all speeds, as noted in Table 5-1. The performance was better at 1.8 m/s than at the two higher speeds. As shown in Figure 5-13, the PID control attempts to drive the error to zero. The RMS error is 3.2 cm at the fastest speed and demonstrates good control performance, visually similar to human driving. The machine vision guidance performed well in the straight path, and the performance was better at lower speeds.

Figure 5-13. Performance of the machine vision guidance in the straight path (path error in cm versus travel distance in cm). A) 1.8 m/s. B) 3.1 m/s. C) 4.4 m/s.
Figure 5-14 shows the performance of the laser radar guidance system at the three different speeds. Once again, the PID control keeps the error around zero. The average error at the lowest speed (1.6 cm) is almost half the error at the highest speed (3.1 cm). Laser radar is a more accurate measuring device than machine vision, resulting in lower tracking errors at the two lower speeds. However, the ladar average error at 4.4 m/s, at 3.1 cm, is 10.7% higher than that of machine vision. The machine vision algorithm was run at 20 Hz, whereas the laser radar refresh rate was only 6 Hz at half degree resolution when using a standard serial port communicating at 115200 baud. The slower ladar refresh rate means that less current data is available to the laser radar guidance system than to the machine vision guidance system in a given time interval. At high speeds of 4.4 m/s, this factor reduces the performance of the laser radar guidance system, whereas the machine vision guidance system is not severely affected. By using a high speed serial communication card (500 kBaud), the laser radar refresh rate can be increased and therefore its performance at speeds of 4.4 m/s and higher can be improved. Nonetheless, the RMS error was only 3.5 cm at the fastest speed.

Figure 5-14. Performance of the laser radar guidance in the straight path (path error in cm versus travel distance in cm). A) 1.8 m/s. B) 3.1 m/s. C) 4.4 m/s.
Figure 5-15 shows the performance of the vision and laser radar guidance systems at a speed of 3.1 m/s in the curved path. The performance of the two systems was visually as good as a human driving the tractor. The vision guidance had an average error over the entire track of 2.8 cm, whereas the laser radar guidance system had an average error of 2.5 cm, as shown in Table 5-2. The maximum error, RMS error and standard deviation of the machine vision based guidance, when run at 3.1 m/s in the curved path, appear better than in the straight path. It is to be noted that the curved path consists of a straight section followed by a curved section. The camera's field of view is narrower than that of the ladar, and therefore when the vehicle gets to the curved section, only one side of the path boundary is visible in the field of view. The machine vision then guides the vehicle at a constant distance from the visible boundary. This can be observed in Figure 5-15A (vision in the curved path at 3.1 m/s). Since the curve starts somewhere around 2100 cm of travel distance, Figure 5-15A shows that the vision-based PID control does not attempt to drive the error to the center of the path once the tractor enters the curve. However, the ladar-based PID control in Figure 5-15B does continue to cross the zero error path center in the curve section. The maximum error, RMS error and standard deviation of error in this curved section are lower than those for the straight section. When averaged over the entire path, the error parameters appear lower for the curved path than for the straight path. The laser radar guidance system's overall accuracy was slightly better than that of the vision guidance when considering average error.

As before, it performed better in the straight section of the path, but was less accurate in the curve than the vision system. It also seems to exhibit larger swings about zero error than the vision system, as evidenced by the larger maximum error; this may be due to some natural dampening that occurs in the less accurate vision measurement algorithm, since the laser radar is more accurate at detecting the path center than the vision guidance.
Figure 5-15. Performance in the curved path at 3.1 m/s (path error in cm versus travel distance in cm). A) Machine vision guidance. B) Laser radar guidance.
Table 5-2. Performance measures of the vehicle's guidance system obtained from the experiments conducted in the curved test path at 3.1 m/s

    Sensor   Path                      Average Error (cm)   Maximum Error (cm)   Standard Deviation (cm)   RMS Error (cm)
    Vision   Straight section (22 m)   2.8                  5.1                  1.2                       3.1
             Curved section (10 m)     2.7                  3.9                  0.4                       2.7
             Overall (32 m)            2.8                  5.1                  0.9                       2.9
    Ladar    Straight section (22 m)   1.6                  3.9                  0.8                       1.8
             Curved section (10 m)     4.1                  6.1                  2.9                       4.9
             Overall (32 m)            2.5                  6.1                  1.6                       2.9

The good performance in the hay bale track encouraged trials in the citrus grove alleyway. The guidance system successfully guided the tractor through the alleyway. Initial trials using machine vision for guidance showed a lot of promise, and the machine vision algorithm clearly segmented the path to be traversed. However, the non-uniformity of the canopy surface caused the controller to overcompensate for the undulation in the canopy surface, resulting in a more zigzag transit of the alleyway.

Control enhancements are planned to further improve the guidance system's performance in the grove.
Conclusions
Machine vision and laser radar based guidance systems were developed to navigate a tractor through the alleyway of a citrus grove. A PID controller was developed and tested to control the tractor using the information from the machine vision system and the laser radar. A flexible curvature track was built using hay bales as boundaries and the vehicle was tested in straight path and curved path configurations. Tests were conducted at three different speeds: 1.8 m/s, 3.1 m/s and 4.4 m/s. Path tracking comparisons were made between the vision-based control and the ladar-based control. The traversed path was marked on the ground and measured manually. Error comparisons were made using the RMS error and the mean instantaneous error. It was found that the ladar-based guidance was the better guidance sensor for straight and curved paths at speeds of up to 3.1 m/s. The communication speed between the laser radar and the computer was a limiting factor at higher speeds. This problem can be solved with a faster communication speed using a high speed serial communication card in the PC; this card was not available during this research. Machine vision based guidance showed acceptable performance at all speeds and conditions. The average errors were below 3 cm in most cases and the maximum error did not exceed 6.1 cm in any test run. These experiments demonstrated the accuracy of the guidance system under test path conditions, and the system successfully guided the tractor in a citrus grove alleyway. However, additional testing is needed to improve the performance in the citrus grove. It was proposed that a control scheme which uses both machine vision and laser radar may provide more robust guidance, as well as obstacle detection capability.

CHAPTER 6
SENSOR FUSION USING FUZZY LOGIC ENHANCED KALMAN FILTER FOR AUTONOMOUS VEHICLE GUIDANCE
Introduction
There is a current need in the Florida citrus industry to automate citrus grove operations. This need is due to a reduction in the availability of labor, rising labor costs and potential immigration challenges. Autonomous vehicles would be an important part of an automated citrus grove. Autonomous vehicles can operate accurately for a longer duration than a human driver. In a scenario where an automated citrus harvester is in operation, the operator can monitor both the harvester and the vehicle instead of being solely the driver. An autonomous vehicle with the ability to navigate in the middle of the citrus alleyways with an accuracy of 15 cm deviation from the center would be beneficial.
Numerous autonomous vehicles for agricultural applications have been described in the literature. The guidance system developed by Noguchi et al. (2002) was for operation in paddy fields for transplanting rice, whereas the guidance system developed by Nagasaka et al. (2004) was tested in soybean fields. The development of an autonomous vehicle guidance system for citrus grove navigation using machine vision and laser radar (ladar) based guidance systems is discussed in Subramanian et al. (2006). Both machine vision and ladar based guidance performed with a maximum error of 6.1 cm in test paths. The machine vision based guidance system had also guided a tractor through an alleyway of a citrus grove, keeping the vehicle visually in the center of the path with no collisions with the trees. It was found that ladar based guidance was more accurate than vision based guidance in well defined paths at speeds up to 3.1 m/s, whereas the vision based guidance was able to keep the vehicle in the middle of the path in several types of paths but was less accurate.
The uneven tree canopy in citrus grove alleyways poses problems for guidance when using vision and ladar separately. With the vision based guidance configuration used in this research, for example, with a forward facing camera, branches closer to the vehicle than the camera's field of view are not accounted for in the vision based guidance, and navigating locations near the end of the row is not feasible with the vision system configuration, as the last trees go out of the field of view of the camera.

The ladar mounting configuration used in this research, with a wider field of view and close range, is useful for guidance in these situations. However, in a given alleyway it is not uncommon to find a few trees absent, and in such cases ladar based guidance gives erroneous paths. These two guidance sensors therefore provide complementary information for guidance. This chapter presents the research undertaken to fuse the information from these sensors, to make the composite information more robust and the overall guidance system more reliable. The development of the Kalman filter for fusion is discussed first, followed by a description of the fuzzy logic system used to augment the filter. Finally, the experiments conducted to test the performance of the fused guidance system are described.
Machine vision is a useful sensing methodology for guidance. One particular implementation has been used in this research (Subramanian et al., 2006). The method is easily adaptable from a custom designed test track to the citrus grove environment. Its long range aids in faster decision making before a hard turn is to be performed and in faster obstacle detection, and it is quite robust to minor variations in the path boundaries. However, it lacks the centimeter level accuracy of laser radar in detecting the distance to close range path boundaries or obstacles. This accuracy is useful in keeping the vehicle accurately positioned in the path; however, it also makes the information noisy if there are minor variations in the path boundaries. This was a drawback in adapting the ladar based guidance to the citrus grove environment, where the foliage caused large variations in the path boundary data obtained from the ladar.

Fusing the complementary information from these sensors, namely the long range information from vision and the short range accuracy of the ladar, would be beneficial. Several methods for fusion have been reported in the literature. These include, but are not limited to, neural networks (Davis and Stentz, 1995; Rasmussen, 2002), variations of the Kalman filter (Paul and Wan, 2005), statistical methods (Wu et al., 2002), behavior based methods (Gage and Murphy, 2000), voting, fuzzy logic (Runkler et al., 1998) and combinations of these (Mobus and Kolbe, 2004). The process of fusion might be performed at various levels, ranging from the sensor level to the decision level (Klein, 1999). In this research, the tree canopy causes the sensor measurements to be fairly noisy, so the choice of method was narrowed down to one that could help in reducing the noise and also aid in fusing the information from the sensors. Methods such as neural networks, behavior based methods, fuzzy logic and voting are not widely used primarily for reducing noise. Kalman filtering is a widely used method for eliminating noisy measurements from sensor data and also for sensor fusion. The Kalman filter can be considered a subset of the statistical methods because of its use of statistical models for noise. Paul and Wan (2005) used two Kalman filters for accurate state estimation and for terrain mapping for navigating a vehicle through unknown environments. The state estimation process fused the information from three onboard sensors to estimate the vehicle location. Simulated results showed the feasibility of the method. Han et al. (2002) used a Kalman filter to filter DGPS (Differential Global Positioning System) data for improving positioning accuracy for parallel tracking applications. The Kalman filter smoothed the data and reduced the cross tracking error. Based on the good results obtained in previous research for fusion and noise reduction, a Kalman filter was selected as the method to perform the fusion and filter the noise in the sensor measurements.

The use of a Kalman filter with fixed parameters has drawbacks. Divergence of the estimates is a problem which is sometimes encountered with a Kalman filter, wherein the filter continually tries to fit a wrong process. Divergence may be attributed to system modeling errors, noise variances, ignored bias and computational round off errors (Fitzgerald, 1971). Another problem in using a simple Kalman filter in this research is that the reliability of the information from the sensors depends on the type of path in the grove. For example, while navigating in the grove, if at a given position a tree is missing on either side of the path, the information from the ladar is no longer useful and guidance has to rely on the information from vision alone. Therefore, if a white noise model is assumed for the process and measurements, the reliability factor of a sensor in the Kalman filter has to be constantly updated. Abdelnour et al. (1993) used fuzzy logic for detecting and correcting the divergence. Sasladek and Wang (1999) used fuzzy logic with an extended Kalman filter to tackle the problem of divergence for an autonomous ground vehicle. The extended Kalman filter reduced the position and velocity error when the filter diverged, and the use of fuzzy logic also allowed a lower order state model to be used. In this research, fuzzy logic is used in addition to Kalman filtering to overcome divergence and also to update the reliability parameters in the filter. Sasladek and Wang (1999) used fuzzy logic to reduce divergence in the Kalman filter; in the present work, apart from using fuzzy logic to reduce divergence, fuzzy logic is also used to update the usability of the sensors in different environmental conditions, as discussed in the sensor supervisor sub-section. Sasladek and Wang (1999) presented simulation results of their method, whereas this research implements similar methods on a real world system for a citrus grove navigation application, and performance results are presented.
The main objective of this research is to fuse the information from machine vision and laser radar using a fuzzy logic enhanced Kalman filter, to make the autonomous vehicle guidance system more reliable in a citrus grove than when using the sensors individually. The following specific objectives were chosen:

Develop a fuzzy logic enhanced Kalman filter fusion method.
Confirm operation of the method by simulation.
Implement the method on a tractor fitted with the required sensors.
Test the navigation performance on a custom designed test track.
Test the navigation performance in citrus grove alleyways.
Materials and Methods
The vehicle developed by Subramanian et al. (2006) used machine vision and laser radar as the main guidance sensors. A common PC with a 2.4 GHz P4 processor running the Windows XP operating system processed the information from the sensors and sent the error to a microcontroller (Tern, CA, USA). The microcontroller controlled the steering using a PID control system; details of this controller are given in Subramanian et al. (2006). An encoder fed steering angle measurements back to the microcontroller. A tri-axial inertial measurement unit (IMU) (3DM, Microstrain, VT, USA) and an ultrasonic speed sensor (Trak-Star, Micro-Trak Systems, Inc., MN, USA) were added to the existing system to aid the guidance and sensor fusion. The IMU provides the vehicle heading and the speed sensor provides the vehicle speed; these help in estimating the error using the Kalman filter. Communication with both of these sensors is through RS232 serial communication. The IMU was mounted inside the tractor cabin on the roof, just below the ladar and camera. The speed sensor was mounted below the tractor. A special RS422 serial communication port was added to the PC to enable a ladar refresh rate of 30 Hz. The architecture of the major components involved in the fusion based guidance system is shown in Figure 6-1.

Figure 6-1. Fusion based guidance system architecture (vision, ladar, IMU and speed sensor feeding the fuzzy logic supervisor and fuzzy Kalman filter, whose output drives the steering)
The vision and ladar algorithms are described in detail in Subramanian et al. (2006). The vision algorithm uses color based segmentation of trees and ground, followed by morphological operations to clean the segmented image. The path center is determined as the center between the tree boundaries. The error is calculated as the difference between the center of the image and the path center identified in the image. The error is converted to a real world distance using a prior pixel to distance calibration. To calculate the required heading of the vehicle, a line is fit to the path center in the image representing the entire alleyway. The angle between this line and the image center line is taken as the required heading for the vehicle to navigate the alleyway with low lateral error. The ladar algorithm uses a distance threshold to differentiate objects from the ground. An object of a minimum width is identified as a tree. The midpoint between the trees identified on either side is determined as the path center. The errors measured using vision and ladar are adjusted for tilt using the tilt information from the IMU. The information from the sensors is used in the fusion algorithm, and the resulting error in lateral position is passed to a PID control system which controls the steering angle to reduce the error to zero.

Kalman Filter
A Kalman filter is used to fuse the information from the vision, ladar, IMU and speed sensors. The models and implementation equations for the Kalman filter (Zarchan, 2005) are as follows.
Figure 6-2. Vehicle in the path with the state vector variables (lateral error d, heading θ, required heading θR and velocity v)
State transition model
The state transition model predicts the coordinates of the state vector x at time k+1 based on the information available at time k:

    x(k+1) = A x(k) + w(k)   (Eq. 6-1)

where k denotes a time instant and x(k) ∈ R^4 denotes a state vector composed of the lateral position error of the vehicle in the path, which is the difference between the desired lateral position and the actual lateral position, denoted by d(k) ∈ R; the current vehicle heading, denoted by θ(k) ∈ R; the required vehicle heading, denoted by θR(k) ∈ R; and the vehicle linear velocity, denoted by v(k) ∈ R. It is to be noted that only the change in heading is used for estimating the position error, which affects the overall guidance. The absolute heading is included in the filter to remove noise from the IMU measurements. Figure 6-2 shows the vehicle in the path with the state vector variables.

PAGE 199

x(k) = [d(k) (k) R(k) v(k)]T The state transition matrix, A(k) R4x4 is defined as A= 100001000010*001Sint where t denotes a time step and w denotes the process noise which is assumed to be a Gaussian with zero mean. denotes the change in heading and this value is calculated from the filtered IMU measurements for each time step k. The use of a variable sensor input such as in the state transition matrix is consistent with the method described in Kobayashi et al. (1998). The state estimate error covariance matrix P R4x4 gives a measure of the estimated accuracy of the state estimate x(k+1). P(k+1) = AP(k)AT + Q(k) (Eq. 6-2) where Q R4x4 denotes the covariance matrix for process noise w The estimation process is difficult to directly observe, hence the process noise covariance is uncertain. Therefore the initial variance values, which are the diagonal elements in the matrix, were assumed as follows: variance for process position noise 2cm, variance for process current heading noise 0.01degree, variance for required heading noise 0.01degree and variance for process velocity noise 10-4 m/s. These values were intuitively assumed to be the amount of noise variance that the process could add. As discussed further in this paper, the process noise covariance matrix Q, is updated at each step of the operation of the Kalman filter. Therefore, the initial assumptions of the matrix values do not significantly affect the overall guidance of the vehicle. These sensor measurements were assumed to be independent of the other sensors. This is done for simplicity and later it is observed from simulations and experiments that the guidance 199

PAGE 200

of the vehicle, with this assumption, is accurate enough to keep the vehicle close to the alleyway center. Hence the covariance values, which are the off-diagonal elements, are zero. Q == VXq000000000000qRCqq 10000001.0000001.0000024Measurement model This model defines the relationship between the state vector, X and the measurements processed by the filter, z. z (k) = H x(k)+u(k) (Eq. 6-3) where z(k) R5 denotes a state vector composed of measured values of the lateral position error of the vehicle in the path from vision algorithm, denoted by xC(k) R, the lateral position error of the vehicle in the path from ladar algorithm, denoted by xL(k) R, the required heading of the vehicle determined from vision algorithm, denoted by C(k) R, the current heading of the vehicle measured using IMU, denoted by IMU(k) R and the vehicle linear velocity measured using the speed sensor, denoted by v(k) R. z(k) = [xC(k) xL(k) C(k) IMU(k) v(k)]T The observation matrix, H(k) is defined as H = T10000010000010000011 The observation matrix relates the state vector x with the measurement vector z. The first two rows of H relate the lateral position errors from vision and ladar, the third and fourth rows relate the angles from vision and IMU and the last row relates the velocity. The measurement 200

PAGE 201

noise, denoted by u(k) is assumed to be a Gaussian with zero mean with standard deviations estimated experimentally with the different sensors. Filter gain The filter gain G is the factor used to minimize the posteriori error covariance P. G(k) = P(k+1) HT ( H P(k+1) )HT + R )-1 (Eq. 6-4) where R = measurement noise covariance matrix To determine R, the vehicle was set stationary in the middle of the path, each sensor (except the speed sensor) mounted on the vehicle was turned on independently and the information from the sensors was collected for 30 seconds. For the camera, the vision algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error and the heading angle determined was recorded. For the ladar, the ladar path segmenting algorithm (Subramanian et al., 2006) was run for 30 seconds and the path error measurements were recorded. For the IMU, the sensor was run for 30 seconds and the heading angle measured by the sensor was recorded. For the speed sensor, the vehicle was run at 3 different constant speeds and the speed sensor measurement was collected for 30 seconds. The measurement noise covariance matrix, denoted by R, was determined from these recorded measurements and was found to be R == VimucXlXc00000000000000000000 0000000001.0000000017.00000015.00000007.1 where the variance of the error in lateral position of the vehicle, denoted by Xc was determined from the vision algorithm measurements, the variance of the error in lateral position of the vehicle, denoted by Xl was determined from ladar algorithm measurements, the variance 201

PAGE 202

of the error in heading, denoted by c was determined from vision algorithm, the variance of the error in heading, denoted by imu was determined from the IMU measurements and the variance of the error in velocity denoted by V was determined from the speed sensor measurements. xC and C are measurements from the same camera. However, the vision algorithm uses different regions of the image to estimate them. xC, xL, IMU and v are measurements from different sensors operating independent of each other. These measurements were assumed to be statistically independent. Therefore, the different sensor measurements were assumed to be statistically independent of each other. Hence the covariance values are zero. The variance of the velocity measurements was zero. This can be attributed to the low resolution (1mph) of the speed sensor. Predict Measurement update 1) G(k)=P(k+1)HT (H P(k) HT + R)-1 2) x(k+1)=x(k) +G(k)(Z(k)-H x(k)) 3) P(k+1)=(I-G(k) H) P(k+1) Time update 1) x(k+1)=A x(k) 2) P(k+1)=A P(k) AT + Q Initial x(k),P(k),Q Update Figure 6-3. Kalman filter operation Figure 6-3 shows the operation of the Kalman filter. The filter estimates the process state x at some time k+1 and estimates the covariance P of the error in the estimate as shown in the time update block. The filter then obtains the feedback from the measurement z. Using the filter gain G and the measurement z, it updates the state x and the error covariance P as shown in the measurement update block. This process is repeated as new measurements come in and the error in estimation is continuously reduced. 202

PAGE 203

Reliability Factor of Primary Guidance Sensors in the Kalman Filter and Fuzzy Logic Sensor Supervisor As discussed previously, ladar is accurate at short range for the given mounting arrangement and machine vision is good at providing overall heading of the vehicle. By experimentation in a citrus grove, the following observations were made: tree foliage is highly variable, trees can be absent on either or both sides of the path and some trees can be small enough for ladar to not recognize them as trees. In such a variable path condition, it is not feasible to have constant noise values for the ladar and vision. For example, when the ladar does not recognize trees on either side, guidance is more reliable when the process noise for the ladar is increased to a high value such that vision takes over as the only guidance sensor and for example when the vehicle gets to the end of a tree row, vision is not useful anymore and so ladar can be made the sole guidance sensor to cross the last tree. A fuzzy logic system was used to decide the reliability of the sensor. The reliability is changed in the Kalman filter by changing the measurement noise covariance matrix R. This is discussed in the following section. A fuzzy logic based supervisor is used to decide which sensor is more reliable at different locations in the grove alleyway. The input to the fuzzy logic supervisor is the horizontal distance of the vehicle center line from the trees on either side of the vehicle. Both vision and ladar input their corresponding distance values. Altogether, there are four input values; vision left tree distance, vision right tree distance, ladar left tree distance and ladar right tree distance. These inputs are divided into three linguistic variables: reasonable, unreasonable and zero (Figure 6-4). Inference Engine Vision Left A Fuzzifier Rule Base Defuzzifier Vision Vision Right Ladar Ladar Left Both Ladar Right 203

PAGE 204

Unreasonable Zero Reasonable Vision Ladar Both 1 1 0 0 1m 2m 3m B Figure 6-4. Fuzzy logic for sensor supervisor. A) Fuzzy logic architecture. B) Membership functions. The input membership function relates the input x to the linguistic variables reasonable, unreasonable and zero (Figure 6-4 B). The meaning of the linguistic variables is literally whether the distance from the tree row is reasonable or not. A zero membership for the ladar indicates the presence of obstacle in the path and for vision, the end of the row. x takes the value of zero for 0 to 1m, 1m to 3m for reasonable and any value greater than 2m for unreasonable. The ranges were intuitively chosen to be the possible values. Normalization of the linguistic variables was performed at a scale from 0 to 1. During the presence of an obstacle, the ladar takes precedence till the obstacle is avoided. The rule set for the supervisor is shown in Table 6-1. Each rule received a corresponding normalized membership value from the input membership function x. Based on the linguistic variables chosen from each x, the corresponding rules are fired. From the rules fired for a given instant, the output membership function is chosen. The membership value was used to find the crisp value for the output. The output membership function decision is shown in Figure 6-4 B, which relates the rules fired to the linguistic variables vision, ladar and both. These variables represent the decision made as to whether vision or ladar is given higher priority or if both should be given equal priority. Defuzzification was done using the center of gravity method. The crisp values of the decision, is taken as the measurement noise covariance value for that sensor. For example, if a decision of ladar is obtained with a crisp value of 1, Xl x 0 -3cm -1 2cm 0.2 Decision 204

PAGE 205

is update to 1 and Xc is chosen as 0. If a decision is made as both with a crisp value of -0.5, Xc is chosen as 0.5 and Xl is chosen as (-0.5/-1)*0.2 = 0.1. The positive or negative values only indicate whether the decision is vision or ladar. Only the absolute values are used to update R. The range of crisp values for the decision was chosen by performing simulations and determining what values provide good results. Table 6-1. Fuzzy logic rule set for sensor supervisor Vision Left/Right Ladar Left/Right Decision Reasonable Both Reasonable/Zero Ladar higher Zero Stop Unreasonable Vision Unreasonable/Zero Ladar Reasonable Reasonable/Unreasonable Vision higher Unreasonable Ladar Reasonable/ Zero Ladar higher Zero Ladar Unreasonable/ Zero Ladar Reasonable/Unreasonable Reasonable Both Divergence Detection and Fuzzy Logic Correction The innovation vector z(k) (Klein,1999), is defined as the difference between the measured vector and the estimation of the measurement vector given by equation 6-5. z(k)= z(k) H x(k) (Eq. 6-5) For an optimum Kalman filter, the innovation vector is a zero mean Gaussian white noise (Abdelnour et al., 1993). Therefore, the performance of the Kalman filter can be monitored, using the value of the innovation vector. Deviation of z from zero by more than 5% was taken as the indication of reduction in performance of the filter leading towards divergence. Divergence could be corrected by updating the process noise covariance matrix Q, depending on the innovation sequence. Fuzzy logic (Passino and Yurkovich ,1998; Subramanian 205

PAGE 206

et al., 2005) was used to perform this for two parameters qX and q R. The fuzzy logic architecture and membership functions are shown in Figure 6-5. A B Figure 6-5. Fuzzy logic for correcting divergence. A) Fuzzy logic architecture. B) Membership functions for updating Q The xc, xL and c parameters of the innovation vector z are the input to the fuzzy logic system. The ranges of the input are from -15% to 15%. These values were chosen by trial and error by performing simulation. Normalization of the linguistic variables was performed at a scale from 0 to 1. Based on the above argument for correction, a set of if-then rules were developed by experimentation. Each rule received a corresponding normalized membership value from the input membership function z. Based on the linguistic variables chosen from each z, the corresponding rules are fired. The rule set is shown in Table 6-2. For example, consider the second row in the Table where zX is negative and z c is positive, this indicates that the error in position has reduced compared to the previous time step but the vehicle is heading in the wrong direction, then the process noise for position qX is zero and the process noise for heading q R is positive. In the rule set, a single measurement value for position, zX is given instead of z(Xc/ XL, c ) Negative Positive 0 1 -15% 15% 0% 5% -5% Fuzzifier Inference Engine Rule Base Defuzzifier ~Q ~ Z Z Xc q X ZXL q R Z c Positive Zero 1 0 X=0 X=4 =0 =0.1 q (X, R) 206

PAGE 207

the two different values, z Xc and z XL This is because, the sensor supervisor described earlier, picks one of the sensors for use in the filter when the other is unreasonable. In the cases where one sensor is given higher priority, the position measurement of that sensor alone is used in the rule set and in the cases where both sensors are selected by the supervisor; the position measurement of ladar alone is used as it has lower measurement noise. This was done to reduce the complexity of the rule set. Table 6-2. Fuzzy logic rule set for divergence correction zX z c qX q R Negative Negative Zero Zero Negative Positive Zero Positive Positive Negative Positive Zero Positive Positive Positive Positive From the rules fired for a given instant, the output membership function is chosen. The membership value was used to find the crisp value for the output. Defuzzification was done using the center of gravity method (Passino and Yurkovich, 1998). The crisp values of the process noise covariance are then updated in the process noise covariance matrix Q. The range of crisp values was chosen by trial and error by performing simulations and determining what values provide good results. For the parameters, q c and qV, a linear relationship was followed such that q c and qV were increased proportional to zz' of C and v respectively. The speed of the vehicle obtained from the speed sensor was used to scale the output to the steering control. Higher speed requires more turn and vice versa. The transfer function between the required lateral displacement of the vehicle, to reduce the error to zero as determined from the fusion algorithm, and the steering angle is given by equation (6) and the gains were determined experimentally for speeds 1.8m/s, 3.1m/s and 4.4m/s (Subramanian et al., 2006). 207

PAGE 208

G(s) = 2sk (Eq. 6-6) To determine the steering angle at different speeds, the steering angle at 4.4m/s was scaled using equation (7). The scaling of the gain, G(s), for 4.4m/s produced the least error at different speeds compared to the gains for1.8m/s and 3.1m/s.. Steering angle = Gain Steering angle at 4.4m/s (Eq. 6-7) where gain = ( smSpeedCurrent/4.4_ ). Simulation The developed fuzzy-Kalman filter algorithm was simulated in MATLAB to check the functionality and accuracy. The data required for simulation was obtained by manually driving the vehicle through a citrus grove alleyway and collecting all the sensor information. Figure 6 shows the simulated result of fusing the error obtained from vision and ladar algorithms. The error is the distance between the center of the vehicle in the path and the path center. The result is shown for a section of the path from 16.2m to 16.8m of travel for clarity. It is observed that from 16.2m to 16.35m, only vision provides reasonable error and ladar does not. This may be attributed to the fact that ladar sweep at that location might have been in a gap between two consecutive trees. Hence, for the first 15cm distance, the fused data seems to have relied mainly on vision. For the 15.35m to 15.55m section, error from both vision and ladar algorithms seem reasonable. Hence, fusion takes both data into account. However, the fused data relies more on ladar information as the process noise is less for ladar. For the last 25cm of the section from 16.55m to 16.8m, both vision and ladar determine the same amount of error. 208

PAGE 209

Figure 6-6. Simulation result of fusion of error obtained from vision and ladar algorithms Experimental Procedure Typical citrus grove alleyways are practically straight rows. It was felt that testing the vehicle on curved paths at different speeds would be a more rigorous test of the guidance systems performance than directly testing in straight grove alleyways. This test represents the ability of the guidance system to navigate more difficult conditions than the ones likely to be encountered in the grove. For this purpose, a flexible wall track was constructed using hay bales to form an S-shaped test path. This shape provides both straight and curved conditions. The path width varied from 3m to 4.5m and the total path length was approximately 53m. A gap of approximately 1m was left between successive hay bales. The gaps mimic missing trees in a citrus grove alleyway. The profile of the hay bale track is shown in Figure 6-7. The hay bales provide a physically measurable barrier, which aids in operating the ladar guidance system and color contrast with grass which is useful for vision based guidance. In addition, when testing a large tractor, minor losses would occur if the tractor runs over a barrier of hay bales. 209

PAGE 210

A B Figure 6-7. Hay bale track profile. A) Dimensions. B) Photo. Experiments were conducted to check the robustness and accuracy of the fused guidance system in the test path at 1.8m/s and 3.1m/s. The vehicle was manually navigated to the starting position at one end of the test path. Once positioned, the vehicle was started in autonomous control and allowed to navigate down the path to the other end thereby navigating a distance of 53m. Three repetitions were performed for each vehicle speed to ensure replication resulting in a total of six trial runs. The path traveled was marked on the soil by a marking wheel attached to the rear of the tractor. The error measurements, along the vehicle centerline, were taken manually using a tape measure at regular intervals of 1m. This interval was chosen to get enough samples to correctly estimate the path error. Measurements at smaller intervals were redundant. After achieving satisfactory performance, wherein the vehicle stayed close to the center of the path without hitting the hay bales, in the test path, tests were conducted in orange grove alleyways at the Pine Acres experimental station of the University of Florida. The average alleyway path width was approximately 3.5m and the length of the alleyway was approximately 110m with about 30 trees on either side. A typical alleyway is shown in Figure 6-8, where the approximate tree height is 1.5m to 2m. Figure 6-9 shows the field of view of the camera and the ladar when navigating the citrus grove alleyway. 3.5m 20m 15m 15m 110o 110o 210

PAGE 211

Figure 6-8. Citrus grove alleyways A B Ladar FOV Ca m e ra Lad a r FOV FOV Ca m era FOV 3.5 30m Figure 6-9. Field of Vi ew (FOV) of cam era and ladar. A) Side view. B) Top/Front view. Experim e nts were conducted at speeds of 1.8m /s and 3.1m /s along an alleyway. Three repetitions runs were co nducted for each speed. Fo r m easuring the error, m a rkers were laid on the ground at the path center at approxim ately 4m intervals along the al leyway. The test path w ith harde r turns requ ire d a sam p ling inte rval of 1m w h ereas the grove alleyw ay w a s a rela tive l y straight path and theref ore a sam p ling interval of 4m was sufficien t. A rear facing cam era was mounted on top of the tractor cab in. The cam eras field of view included the area just behind the rear of the tractor. P i xel to hor izon tal distan ce inf o rm ation w a s calibra ted. W h ile cond ucting the experim e nts, the video of the ground behind the tractor was collected using this cam e ra. The error in trav ersing the alleyway was determ ined based on the position of the m a rkers at different 211

PAGE 212

instances in the video. The boundary of the path, for determining the path center in both test track and in the grove, was manually done by visual approximation of the boundary. Results and Discussion From the experiments, RMS error, average error, maximum error and standard deviation of error were calculated over the entire path by averaging the data over the three repetitions at each speed. The path error, which is the deviation from the center line of the alleyway, was plotted over the length of the test path for the different speeds. The positive and negative values of the error in the plot are only an indication of the error being on the right side or left side of the path center. For calculating the performance measures, the absolute values of the error were used. Performance measures obtained from the experiments conducted in the test track are shown in Table 6-3 and the corresponding path error is shown in Figure 6-10. Visually, the vehicle navigated in the middle of the path as well as a human would. The average error in both cases is less than 2cm from the path center. The measures are lower for the lower speed of 1.8 m/s than at 3.1m/s. The maximum error at both speeds is less than 4cm in the path width of at least 3m. This is an error of only 1% of the path width. From Figure 10 A, it can be observed that most of the time, the error is less than 2cm and only in few instances, the maximum error goes over 2cm. The results of running vision and ladar based guidance separately in similar but not identical test tracks are reported in Subramanian and Burks (2005). However, the path used in the present research is more complex. At speeds of 3.1m/s an average error of 2.5 cm was reported while using ladar based guidance, whereas guidance based on fusion has an average error of 1.9 cm at the same speed. The maximum error is also reduced from 5 cm reported for vision based guidance at 3.1 m/s to 4 cm using fusion based guidance. Compared to those methods of individual sensor guidance, navigation using sensor fusion resulted in better performance in staying closer to the path center. 212

PAGE 213

As directly observed from the simulations, the reduction of noise also significantly played a role in the improvement of performance. This is in agreement with what is reported in Han et al. (2002), that use of Kalman filter reduced the noise and improved tracking performance for a DGPS based parallel tracking. Kobayashi et al. (1998) report high accuracy by using a fuzzy logic Kalman filter based fusion method using DGPS, in accordance with the results reported in this paper. Sasladek and Wong (1999) used fuzzy logic for tuning the Kalman filter with DGPS receiver as the primary sensor and report that the simulation results showed prevention of divergence and improved accuracy in position and velocity. Comparatively, this research used vision and ladar as sensors and prevented divergence using fuzzy logic for tuning. In addition, fuzzy logic was also used for picking the more accurate sensor and improved performance was experimentally verified for a citrus grove application. Compared to many such similar research reported in literature, this research has significant improvements and differences. Whereas many researches use DGPS receiver as the primary sensor for their application, this research uses vision and ladar as primary sensors for guidance as well as for fusion. Secondly, this research uses fuzzy logic for tuning or divergence correction as well as for adaptively choosing what sensor is to be relied upon for fusion at different locations in the path whereas most previous research uses fuzzy logic exclusively for tuning or divergence detection. This method allows for switching between important sensors useful in specific areas. Further, previous researches in agriculture have mainly been for crop applications, whereas this research is for citrus grove operations. 213

PAGE 214

Table 6-3. Performance measures obtained from the experiments conducted in test track Speed (m/s) Average Error (cm) Standard Deviation (cm) Maximum Error (cm) RMS Error (cm) 1.8 1.5 0.7 3 1.6 3.1 1.9 1 4 2.1 -25-20-15-10-5051015202516111621263136414651Distance (m)Error (cm) A -25-20-15-10-5051015202516111621263136414651Distance (m)Error (cm) B Figure 6-10. Path navigation error of the vehicle in the test track. A) 1.8m/s. B) 3.1m/s. Performance measures obtained from the experiments conducted in the citrus grove alleyway is shown in Table 6-4 and the error plot is shown in Figure 6-11. The average error at the two speeds is less than 10cm and the maximum error is less than 22cm for an average 214

PAGE 215

alleyway width of 400cm. This is about 6% of the alleyway width. From the error plot, it can be observed that the error is less than 15cm most of the time. Based on the performance measures, the guidance seems less accurate in the relatively straight grove than in the complex test path. However, it should be noted that the path boundary conditions in the grove are much more variable than in the test path. In the test track, the path boundary is relatively even without much variation in the location between successive hay bales. In the citrus grove, the tree canopy size varies significantly from one tree to the next, the variation being as low as 5 cm to as high as 50cm. The path boundary which is the tree canopy edge, used for error measurement, was decided by the person conducting the experiment and it is not practical to accurately state a tree canopy boundary. The larger errors in the citrus grove can be largely attributed to this boundary variability rather than the navigation performance of the vehicle. These high errors are still acceptable to navigate sufficiently in the middle of the alleyway without hitting any tree branches. Visually, the navigation in the alleyway was as good as human driving and as good as navigation in the test track. Overall, assuming a human driver as a reference of performance, navigation in the citrus grove was as good as human driving with reasonable errors and confirms the good performance of fusion based guidance. Table 6-4. Performance measures obtained from the experiments conducted in grove alleyway Speed (m/s) Average Error (cm) Standard Deviation (cm) Maximum Error (cm) RMS Error (cm) 1.8 7.6 4.1 18 8.6 3.1 9.4 4.4 22 10.3 215

PAGE 216

-25-20-15-10-5051015202542036526884100Distance (m)Error (cm) A -25-20-15-10-5051015202542036526884100Distance (m)Error (cm) B Figure 6-11. Path navigation error of the vehicle in the grove alleyway. A) 1.8m/s. B) 3.1m/s. Conclusions A sensor fusion system using fuzzy logic enhanced Kalman filter was designed to fuse the information from machine vision, ladar, IMU and speedometer. Simulations were performed to confirm the operation of the method. The method was then implemented on a commercial tractor. The fusion based guidance system was first tested in custom designed test tracks. Tests were conducted for two different speeds of 1.8m/s and 3.1m/s. An average error of 1.9cm at 216

PAGE 217

3.1m/s and an average error of 1.5cm at 1.8m/s were observed in the tests. The maximum error was not more than 4cm at both speeds. Tests were then conducted at the same speeds in a citrus grove alleyway. The average error was less than 10cm for both speeds. The guidance systems ability to navigate was verified in citrus grove alleyways and was found to perform well with the vehicle remaining close to the center of the alleyway at all times. The developed fusion based guidance system was found to be more reliable than individual sensor based guidance systems for navigating citrus grove alleyways. The present system is able to reliably traverse through an alleyway of a citrus grove. Additional work is necessary before the system can be ready for production. Some of the future work would include adding the ability to navigate the headlands and the ability to optimally navigate the entire grove. 217

PAGE 218

CHAPTER 7 HEADLAND TURNING MANEUVER OF AN AUTONOMOUS VEHICLE NAVIGATING A CITRUS GROVE USING MACHINE VISION AND SWEEPING LADAR Introduction There is a need to automate grove operations in the Florida citrus industry and autonomous vehicles form an integral part of this automation process. An autonomous vehicle can be used for applications such as scouting, mowing, spraying, disease detection and harvesting. The development of autonomous vehicle guidance systems using machine vision and laser radar for navigating citrus grove alleyways was discussed in Subramanian et al. (2006). This paper discusses the headland navigation algorithms. Complete navigation of a citrus grove requires the vehicle to turn at headlands in addition to navigating the alleyways. Headland in citrus groves is the space available at the end of each tree row. This space is generally used for turning the vehicles navigating the rows. Headlands are characterized by a ground space of at least 4.25m after the last tree in each row. The ground is usually covered with grass. Beyond this space, there may be trees planted to act as boundaries of the grove or there could be more open ground or some combination of both. An autonomous vehicle operating in citrus groves needs to navigate the headlands to turn and navigate other rows. Miller et al. (2004) have reported that that the space required to turn at the headland and the time spent in turning, significantly affect the field efficiency. Hague et al. used vision based guidance in combination with dead reckoning to guide a vehicle through crop rows. In this paper, turning maneuvers to navigate the headlands of a citrus grove are discussed. Materials and Methods The vehicle used for developing the headland turning algorithms was the John Deere eGator. The sensors used for developing the headland navigation algorithms are the camera, the ladar, the IMU and the wheel encoder. The details of these sensors and their mounting location 218

PAGE 219

were described earlier. The camera is mounted looking down at an angle of 10 degrees with the horizontal and the ladar is mounted looking down at an angle of 15 degrees with the horizontal. The camera, ladar and the IMU are interfaced with the 2.4GHz CPU running Windows operating system. This is the same computer that was used on the tractor. All high level algorithms for processing guidance sensor information and guidance algorithms are implemented in this shoebox computer. The speed control and the steering control are implemented in the PC104 computer described earlier. The steering control system used in the autonomous steering of the tractor was the PID control, in which lateral position error of the vehicle was given as input to the controller and steering was suitably controlled to minimize the error. In alleyway navigation, the specification of the lateral position of the vehicle in the path can be described as error of the vehicle from the path center. For headland navigation, there is no specified path that the vehicle must take. Therefore it is not practical to specify a path error in all conditions. Also, the development of classical controllers such as PID, a mathematical model of the system is useful for tuning the controller parameters. For the tractor, this model was experimentally determined. Development of a similar model for the eGator is time consuming. Therefore, a new type of steering controller was explored which does not require error specification or mathematical model, which was the Pure Pursuit steering. Pure Pursuit Steering Pure Pursuit steering mechanism has been used for many years in the Carnegie Mellon NavLab autonomous vehicle projects and has been reported to be robust. The method is simple, intuitive, easy to implement and robust. It is analogous to human driving in that humans look a certain distance ahead of the vehicle and steer such that the vehicle would reach a point at the look-ahead distance. The Pure Pursuit method is similar to the human way of steering. Pure Pursuit is a method for geometrically calculating the arc necessary for getting a vehicle onto a 219

PAGE 220

path (Coulter, 1992). Consider the illustration of the vehicle with the X-Y coordinate system shown in the Figure 7-1. Y (p,q) Figure 7-1. Pure Pursuit steering method Let the origin of the coordinate system be the center of the vehicle. Let (p, q) be a point some distance ahead of the vehicle. Let an arc join the origin and the point (p, q). Let L be the length of the cord of the arc connecting the origin to the point (p, q) and let R be the radius of curvature of the arc. Let R be expressed as a sum of distance p and some distance D. R = D + p => D = R p (Eq. 7-1) L, p and q form a right triangle. Using Pythagoras theorem for a right triangle p2+q2= L2 (Eq. 7-2) q, R and D for a right triangle. Using Pythagoras theorem, q2 + D2 = R2 (Eq. 7-3) => q2 + (R-p)2 = R2 Simplifying this equation, R = pL22 = 2p + pq22 (Eq. 7-4) R X L q D p Vehicle 220

PAGE 221

As this method is analogous to human driving, we can choose a predetermined look-ahead distance p. For this project, the look-ahead distance was chosen to be 2.89m from the front of the vehicle. For alleyway navigation, the error of the vehicle from the path center is q at the look-ahead distance p For non-alleyway navigation, q can be the required lateral movement of the vehicle at the look-ahead distance. This value of q is obtained from the machine vision and ladar algorithms for path segmentation. Using these values of p and q, R can be calculated from the above equation. R is the required radius of curvature of the vehicle to get the vehicle to the required point. In the CAN based steering system of the e-Gator, steering has to be specified as steering angles and not as radius. Therefore, R has to be converted to steering angle. A calibration experiment was conducted to find a relationship between the radius of curvature and the steering angle. In the experiment, the eGator was taken to an open area. Steering angle was kept constant by sending CAN commands to the steering controller. The vehicle was moved forward by manually stepping on the accelerator. For a constant steering angle, the vehicle moved in a constant circle. Markers were placed on the ground at the inner wheel radius at different locations. The diameter of the circle was manually measured using a tape measure. The steering angle was varied to obtain different radius of curvature. The illustration of the experiment is shown in the Figure 7-2 and the experimental results are shown in the Table 7-1. Vehicle R Figure 7-2. Illustration of experiment for steering angle calibration with radius of curvature 221

PAGE 222

Table 7-1. Calibration of radius of curvature of vehicle with steering angle Steering Angle (Degrees) 10 15 20 25 30 Left 10.6 9.9 7.46 5.8 4.4 Radius (m) Right 7.9 6.4 5.33 4.6 3.8 Experiments were conducted for steering angles from 10 degree to 30 degrees at 5 degree intervals. The lowest angle chosen was 10 degrees because smaller angle had much larger radius that caused the vehicle to travel out of the experimental area. The maximum angle of 30 degrees is the physical limit of the vehicles steering system. Experiments were conducted for both left and right turns. The experimental values were plotted in the chart shown below. y = -0.33x + 14.232y = -0.2x + 9.60602468101205101520253035Steering Angle (Degrees)Radius of Curvature (m) Left Right Linear (Left) Linear (Right) Figure 7-3. Calibration curve between radiuses of curvature of vehicle with the steering angle 222

PAGE 223

From the chart, linear equations were fit to calibrate steering angle with radius of curvature. The equations are: Left turn: Radius = -0.33* Steering angle + 14.232 (Eq. 7-5) Right turn: Radius = -0.2 Steering angle + 9.606 (Eq. 7-6) Rearranging the terms, Left turn: Steering angle = -3.03* Radius 43.12 (Eq. 7-7) Right turn: Steering angle = -5 Radius 48.03 (Eq. 7-8) Using these calibration equations, the steering angle required for various radius of curvature was determined. The steering angles were sent to the steering controller. It is observed that the minimum radius of curvature of the vehicle was 4m. This minimum value limits the smallest turn that the vehicle can execute in the headlands. The flowchart of the Pure Pursuit steering algorithm is shown in the Figure 7-4. Figure 7-4. Flowchart of Pure Pursuit steering algorithm Use look-ahead distance and offset to calculate radius of curvature Use calibration equations to determine steering angle from radius of curvature Obtain required vehicle offset from vision or ladar Send steering angle commands to stee rin g co n t r o ll er 223

PAGE 224

Headland Turning Maneuvers As mentioned before, the am ount of time spent and the area required by the vehicle to turn at theg h of the headland while navigating the alleyway is perfo headlands is important. A headland turning maneuver can be performed as a series of the following steps: 1) Determining the approach of the end of current row while navigating the alleyway. 2) Precisely establishing the end of the row. 3) Turning into the headland. 4) Turnininto the next row. 5) Getting to a position to navigate the next row. Determining the approach of the headland The first step of determining the approac rmed using machine vision. Consider the Figure 7-5 which shows the image obtained from the camera when the vehicle was in the middle of the alleyway (a) and the image obtained from the camera when the vehicle was close to the end of the alleyway (b),(c). A B C Figure 7-5. Images obtained from the camera when the vehicle is navigating the alleyway 224

PAGE 225

The segmentation algorithm used originally for path determination for headland navigation, segments the entire image for trees and pathway. When the segmentation algorithm is run on the above images, the resulting images are as shown below. A B C Figure 7-6. Segmented images It is observed from the segmented images that when the vehicle is close to the end of the path, the upper half of the segmented image is predominantly either white or black depending on whether trees are absent or present in the headland. For the case when the vehicle is in the middle of the alleyway, the upper half of the image is both black and white without any color being predominant. This difference in the upper half of the image for close to headland cases and middle of alleyway cases is used in the vision algorithm. 225

PAGE 226

The headland determination vision algorithm uses two fixed boxes, large/green and small/red as shown in the Figure 7-7. A B C Figure 7-7. Vision algorithm example for headland detection The two lines, joining the lower corners of the image to the lower corners of the lower box, indicate the path boundaries as obtained from path segmentation algorithm. For the case in the middle of the alleyway, the smaller box falls on the pathway whereas the bigger box encompasses random objects including the headland (A of Figure 7-7). When the vehicle is close to the headland, the two boxes fall on headland objects (B and C of Figure 7-8). As observed in the images, the headland often consists of either trees or open space. The use of the boxes is shown in the segmented images below. 226

PAGE 227

A B C Figure 7-8. Use of boxes in the segmented images Close to headland, the pixels in the two boxes are both classified as either trees or pathway whereas in the middle of the alleyway case, the two boxes are classified differently. Based on this difference in the boxes, the algorithm determines the approach of the headland. By trial and error, it was found that using this vision based method of detecting the approach to the headland, detection of the headland is always determined when the vehicle is at a distance of approximately 29m to 31m from the headland. As observed this distance is not accurate and the variation is from row to row. This is however a good estimate of the distance to the headland. Once the approach of the headland has been detected by the vision algorithm, this distance is used to dead reckon the vehicle through rest of the alleyway till the headland has been reached. A flag is sent to the PC104 computer indicating that the headland has been detected. In the PC104 computer, dead reckoning is performed using the information from the encoder. The equation for the dead reckoning process is given as Dc = Dc + de (Eq. 7-9) 227

PAGE 228

where Dc is the current distance and de is the change in the distance measured by the encoder. Also, when the headland is detected using the machine vision algorithm, the priority of ladar in the sensor fusion process is increased and the priority for vision is reduced to zero. This is because, close to the headland, the image from the camera primarily consists of headland objects. This is also due to the long field of view of the camera. The vision algorithm development was based on segmenting trees and path. Close to headlands, this segmentation is not feasible as the alleyway trees are out of the cameras field of view. On the other hand, ladar with the short field of view is able to detect the last few trees in the alleyway and perform the navigation. Therefore, navigation is switched entirely to ladar. Precisely establishing the end of the row As mentioned earlier, the vision based headland detection gives an estimate of where the headland is located and the dead reckoning allows the vehicle to get close to the headland. A midsize citrus tree with some leafy canopy has a width of a minimum of 2m. The estimated distance using vision followed by dead reckoning allows the vehicle to get to a position such that the last tree is to the side of the vehicle. Once the dead reckoning is completed and the vehicle has reached the estimated distance near the last tree, the vehicle is moved further forward while the ladar data is monitored. This progress is continued until the ladar algorithm does not detect any trees on either side. When this happens, the vehicle is assumed to have crossed the last tree. This ladar monitoring ensures that the vehicle crosses the last tree of the alleyway. Drawback of the vision based headland detection In the vision based headland approach detection algorithm described earlier, the assumption was that the headland consists either of uniform headland trees or open space. The 228

PAGE 229

caveat with the vision based headland detection is that the headland is not always composed entirely of trees or open space. A sample image is shown in the Figure 7-9. Figure 7-9. Sample image where the vision based headland detection can fail In the above image, a single tree is present in the headland along with an artificial object. In such a case, the vision based algorithm fails to precisely detect the headland. Presence of random objects makes the vision algorithm less accurate at determining the distance to the headland. In the previously described method, the vision based headland detection with dead reckoning helped the vehicle reach the last tree of the alleyway. With the failure modes of the vision based detection, the estimate of the headland may not be enough to reach the last tree or may take the vehicle beyond the last tree. To overcome this problem, a ladar based algorithm is described as follows. Precisely establishing end of the row using sweeping ladar The currently used mount angle of the ladar is such that it is able to detect only the closest tree to the vehicle. It was proposed that if the ladar is rotated through a range of angles, many more trees could be detected in front of the vehicle. The precise distance measuring capability of the ladar would help in establishing the end of the row. In this method, a digital servo motor 229

PAGE 230

(HiTec HSR-5995TG) was custom attached with the ladar. A custom designed power supply regulator was built to power the motor. A PIC microcontroller based serial communication interface was also custom built to interface with the motor. The motor-ladar mount is shown in the Figure 7-10. A B Ladar Ladar Motor Motor Figure 7-10. Ladar and motor assembly for sweeping. A) view 1. B) view 2. The motor was attached to the ladar body such that when the motor shaft rotated, the ladar was oscillated up and down about the horizontal plane. The ladar was swept from an angle of 15deg. above the horizontal to 30deg. below the horizontal at a speed of approx. 15deg/s (Figure 7-11). The range of angles was chosen such that the ladar sweep covered only the tree locations. Angles beyond 15 degrees above the horizontal resulted in the ladar sweeping over the trees and the angles beyond 30 degrees below the horizontal resulted in the ladar sweeping the ground. The angles were also limited by the range of the mechanism linking the motor to the ladar. The speed of rotation was a limit of the servo motor. 230

PAGE 231

15 o 30 o Figure 7-11. Illustrati on of the ladar sweep As m e ntioned earlier, the ladar ref r es h rate is app r oxim a tely 3 8 H z From the m o tor sweep speed and th e lada r ref r e s h rate, d a ta available per degree is approxim a tely 15 38 = 2.53 = 2 full sweeps per degree Therefore, a ladar s can is obtain e d every half a d e gree at th is sweep rate o f the m o tor. In this sw eep in g ladar algor ithm the m o tor is allow e d to execute one sweep of the ladar from up to down. The entire data obtained from the ladar duri ng this period is stored in an array. Using the calculations m e ntioned above, the total num ber of scans of the ladar is then m a tched with 0.5 degree positions of the sweep angle. One sweep of the lada r by the m o tor at 15deg/s takes approxim a tely 5s. Synchronization of the ladar a nd sweep angle consum es a longer tim e That is, if the m o tor was steeped to m ove to 0.5 degree in crem ents and then allowed to acqu i re a full ladar s can at each angle, the tim e consum ed to execute one fu ll sweep wo uld be m u ch higher. The overall allocation of scans to sw eep angles elim inates th e need to sy nchronize th e sweep angle positions with the ladar scan and thereby saves tim e Also, it was experim e ntally observed 231

PAGE 232

that the error introduced due to 0.5 degree allocation of scans is not significant to warrant synchronization. As mentioned in Chapter 3, the data obtained from the ladar is in polar coordinates with the ladar being the origin. For this algorithm, it is intuitive to use data in Cartesian coordinates. Consider the coordinate system shown in the Figure 7-12. Figure 7-12. Coordinate systems Let P be a data point at a radius R from the origin subtending angles m and n with the z and y axis respectively. To convert this to the Cartesian coordinate system, the following equations were used. Px = -R Cos( 2n ) Cos(m) (Eq. 7-10) Py = R Sin( 2n ) Cos(m) (Eq. 7-11) Pz = R Sin( 2n ) Sin(m) (Eq. 7-12) (Px, Py, Pz) is the data point in the Cartesian coordinate system. The data obtained by the scan is then filtered to keep the relevant data. It is known that the lateral distance of trees in the R P(R,m,n) z m y n x o 232

PAGE 233

row, fhe rom the ladar can range from a distance of 0.5m to a maximum of 5m. The ladar to the vehicles end is 0.5m. So, a tree limb is not expected to be closer than 0.5m during normal operation. Distances longer than 5m correspond to trees located in adjacent rows. Therefore, tfilter was implemented such that any data beyond 5m and data less than 0.5m were removedAfter filtering, a sample plot of the data in the Cartesian coordinates is shown in the Figure 7-13. Figure 7-13. Swept ladar data in Cartesian coordinates It is observed from the figure that the data corresponding to different trees are clustered rees in front of the vehicle can be detec together. It is also observed that only four consecutive t ted with confidence. Although the range of the ladar is 80m, the trees further away contribute very few data points to be clearly detected. 233

PAGE 234

An algorithm was developed to determine the tree locations from the data. The Figushows the data in 3-dimensional space. The search spac re 7-13 e for the algorithm, when analyzing data in 3-dal plane Figure 7-14. 3-dimensional ladar data projected on the horizontal plane It should be noted that multiple points at a location with different z varlapped the location of the trees, a clus imensions is large. This results in the consumption of a large amount of computer processing time. For real-time precise headland detection, this is significant. The aim is to locate the trees with respect to the vehicle in the x-y plane. The location of the data in the verticis not important. Without loss of useful information, the 3-dimensional data is projected on to the horizontal plane. The projected data is shown in the Figure 7-14. lues are ove in the Figure 7-14. The data space consists of 1cm x1cm bins. To locate tering algorithm is used. The clustering algorithm is as follows: Scan the data space using a 50cm x 50cm window in 50cm increments 234

PAGE 235

If the window in a location consists of more than 5 data points, note the location of the window ion indicates the presence of a tree or a part of a tree 250cm, the difference is due tond is The locat If two consecutive tree containing locations differ by more than the presence of headland. It is then determined that the headland starts just after the last tree location prior to this gap. Based on the location of the trees in the alleyway, the exact location of the headland is determined. The algorithm and clustering ensure that the headlafound precisely and the determination is accurate to less than 50cm beyond the last tree in the alleyway. The result of clustering and headland determination is shown in the Figure 7-15. Figure 7-15. Tree clustering and headland determination ping ladar based establishment of headland were en the Y Tree Cluste HeadlandHeadland object rs Vision and ladar based headland determination The vision based headland detection and swee described above. As mentioned, the sweeping ladar process takes approximately 5s to execute a sweep. It is not practical to execute the sweep while navigating the entire row. Therefore, sweep is started only when the vision based algorithm detects the headland. Thsweeping ladar algorithm takes over. The dead reckoning is no longer required. On further tests conducted in the grove using the above algorithms, it was also found that a stationary ladar mounted at an angle of 10 degree to the horizontal, is able to detect four trees similar to the 235

PAGE 236

sweeping ladar. This mounting angle increases the operating range of the ladar. This angle however, may not be suitable for close range obstacle detection. The stationary mount of theladar also obtains less data compared to the sweeping ladar. This however does not prevent thprecise determination of the headland and is as accurate as the sweeping ladar. The sweeping ladar is still a useful development for future operations in this project. Once the headland location is established, the vehicle speed is reduced to headland navigation speed. Navigating the headland and entering next row e by the vision algorithm and the exact locati tree f the vehicle using the yaw angle from the IMU. 2. pletely to one end such that the vehicle moves opposite to the direction of the required turn. This step is to make a wide u-turn. 3. end such that the vehicle turns in the required direction. Allow the vehicle to move forward. 5. es in front of the vehicle using the ladar. If an obstacle is detected, stop the vehicle. From the location of the headland determined on determined by the sweeping ladar, the turning maneuvers are started when the lastin the alleyway has been passed. Two turning maneuvers namely the u-turn and the switch back turn were developed. The determination of which maneuver is to be executed is made using the sweeping ladar algorithm. If an open space of more than 3m is available in the headland, the vehicle proceeds to do a u-turn and switch back turn otherwise. For the turning maneuvers, algorithms for steering similar to human operations were developed. For the development oturning algorithms, the distance between the left and right tree stems in a row is assumed to be approximately 6m. The U-turn maneuver is implemented as follows: 1. When the vehicle has crossed the last tree, note the heading of the Turn the steering com Move forward for approximately 0.5m. 4. Turn the steering completely to the other Monitor the heading angle. Monitor for obstacl 236

PAGE 237

6. When the vehicle has turned by more than 150 degrees, switch to alleyway navigation mode using vision and ladar. 7. The vehicle has now just entered the new row. Any error in entering the new row is corrected using the alleyway navigation algorithm. The switch-back turning maneuver is implemented as follows: 1. When the vehicle has crossed the last alleyway tree, note the heading of the vehicle using the yaw angle from the IMU. 2. Move forward a distance of 0.5m. This is done to avoid colliding with any long branches of the last alleyway tree. 3. Turn the steering wheel to one end such that vehicle turns in the required direction. 4. Monitor the vehicles heading and monitor for obstacles in front of the vehicle using the ladar. 5. When the vehicle has turned by an angle of 90degrees with respect to the initial heading angle before turning, stop the forward motion of the vehicle. 6. Turn the steering to center position and reverse the vehicle a distance of 0.5m. 7. Turn the steering to the end such that the vehicle turns in the required direction. 8. When the vehicle has turned by more than 150 degrees with respect to the initial heading angle before turning, switch the algorithm to alleyway navigation mode using vision and ladar. 9. Any lateral error in entering the new row is corrected using the alleyway navigation algorithm. It is to be noted that the turning maneuver algorithms are tuned for a grove with certain dimensions of alleyway width. It is operable in groves with dimensions that are different by approximately 1m. But the algorithm is not suitable for groves that differ more. It is difficult to navigate headlands using vision and ladar as headlands can consist of random objects and the vision and ladar could not be modified for all possibilities. During tests in a grove in Immokalee, Florida, it was found that the grove includes a large drainage pit on the ground at the end of each row. In such a case, the turning maneuvers failed as the vehicle ends in the pit. Therefore, more complex ladar based algorithms may be necessary to adapt the vehicle to such possibilities. 237

PAGE 238

Experimental Procedures Experiments in the Test Track All previous experiments with the autonomous vehicle guidance system were conducted using the tractor as the vehicle. In this stage of the project, the guidance system has been adapted for use with a John Deere eGator as the vehicle. The performance of the guidance system on the eGator had to be first tested. Therefore, it was decided that guidance performance tests would be conducted in a test track before testing in an actual grove. A test track was constructed in an open area near Lake Alice in the University of Florida campus. The track consisted of boundaries on either side of a path. The path boundaries were constructed using black flexible plastic cylindrical containers. The test track is shown in the Figure 7-16, with the eGator positioned in the path. Gaps were incorporated in the path boundaries to mimic gaps between citrus trees. Figure 7-16. Vehicle in the test track Although this test track does not ideally mimic the grove conditions, it is a way to ensure reasonable path navigation ability of the guidance system on the vehicle prior to testing in the grove. If testing was directly conducted in a citrus grove and the guidance system fails, damage to trees could occur. Testing in test tracks ensures reasonable operation of the guidance system and minimizes the probability of accidents in the grove. The boundaries of the test path are relatively inexpensive compared to trees. The choice of black objects for the boundary is to aid 238

PAGE 239

the vision algorithm. The threshold value used in the vision algorithm had to be modified to operate in the test track. The ladar algorithm also was modified to detect the test track boundaries. The pure pursuit steering control algorithm however did not require any change as it does not depend on the path. However, these modifications were minimal and not complex. The profile of the test track is shown in the Figure 7-17. Figure 7-17. Profile of the test track for testing the guidance system The length of the path was approximately 53m with an average path width of approximately 2.5m. The profile of the path was chosen such that the vehicle would have to make both a left turn and a right turn to completely navigate the path. This ensures that both left and right turn ability are operational. For the experiments, the vehicle was started in autonomous mode at one end of the path, positioned near the center of the path. The vehicle was manually driven to the starting location, resulting in different initial offset and heading angle for each repetition. Fusion based guidance was used to navigate the vehicle. Experiments were conducted at a vehicle speed of 3.1m/s. The vehicle was allowed to navigate the entire length of the path autonomously and after reaching the end of the path, the vehicle was stopped manually. Three repetitions of the test runs were conducted. Lines were drawn in the path using lime powder. Every time the vehicle navigated the path, the wheel marks were marked on the lines. The 2.5m 23m 15m 15m 110o 110o 239


The vehicle's location in the path was determined from these marks, and the path error of the vehicle was measured from them using a tape measure. The measurements were taken at regular intervals of about 3 m; measurement at more frequent intervals was expected to be redundant. After ensuring good performance of the guidance system in navigating the test track, the headland turning experiments were conducted in the citrus grove. The tests in the test track confirmed the proper adaptation of the guidance system to the eGator.

Experiments in the Citrus Grove

The turning maneuvers were tested in a citrus grove at the Pine Acres Experimental Station of the University of Florida in Citra, Florida. Experiments were first conducted to confirm the alleyway navigation performance of the eGator based guidance system. As with the test track, the vehicle was run in autonomous mode in an alleyway of the citrus grove. Lines were marked with lime, and the tire marks of the eGator on the lines were used to measure the path navigation error. Three repetitions were performed for the test run. The speed of navigation of the vehicle was 3.1 m/s. After ensuring good performance of the guidance system in navigating the alleyway, experiments were conducted to test the performance of the headland turning maneuvers. The path width of the alleyways of the citrus grove was about 3.5 m and the alleyway length was approximately 110 m, with about 30 trees on either side. The average tree height was approximately 2 m. Figure 7-18 shows one such grove alleyway.

Figure 7-18. Alleyway of the citrus grove where experiments were conducted


In the grove, alleyways were chosen for conducting the headland turning experiments. The alleyways were chosen such that the last set of trees in the rows had good canopy, which could be detected using the ladar. In a chosen alleyway, the vehicle was started approximately in the middle of the alleyway in autonomous mode. The type of turn, i.e. u-turn or switch-back turn, to be executed at the end of the alleyway and the direction of the turn, i.e. right or left, were specified in the program (Figure 7-19).

Figure 7-19. Headland turning maneuvers in the grove. A) u-turn. B) switch-back turn

The vehicle was allowed to navigate the rest of the alleyway using sensor fusion based steering, detect the headland using the ladar, execute the specified turning maneuver in the headland using the algorithm described above, get into the next row and start navigating the next row. The vehicle was manually stopped after navigating a distance of approximately 20 m in the new row. The path that the vehicle took in the headland was marked using markers, which were placed manually at equal angular intervals of 22.5 degrees with the tree stump as the center. This resulted in a total of 9 markers representing the path taken by the vehicle in the headland; the marker locations are numbered in the Figure 7-19. Two additional markers were used for the switch-back turn to mark the extreme


forward and backward locations of the vehicle before and after reversing, respectively. The vehicle's average speed was 1.8 m/s while executing the turns. This speed was chosen as a reasonable speed to safely navigate the headland and is visually similar to the speed at which a human would drive the vehicle. Experiments were conducted for both left and right turns for each maneuver. Therefore, four maneuvers were performed, namely the left u-turn, right u-turn, left switch-back turn and right switch-back turn. Three repetitions were performed for each of the runs. After each test run, the distance of the markers from the trunk of the tree about which the turn was executed was measured manually using a tape measure. From this measurement and the known vehicle width, the distance of the vehicle's inner wheel in the headland from the same tree stump was calculated.

Results and Discussion

For the path navigation experiments conducted in the test track and the grove alleyway, the average error, the rms error, the standard deviation of the error and the maximum error were calculated and are presented in the Table 7-2.

Table 7-2. Performance measures of the eGator based guidance system for path navigation
Path             Avg. Error (cm)   Std. Dev (cm)   Max Error (cm)   RMS Error (cm)
Test Track       1.8               1.2             3.5              2
Grove Alleyway   5                 3.5             9                8

Detailed discussion of the fusion based guidance system was presented in Chapter 3. As the path navigation experiments presented here were a confirmation of the adaptation of the guidance system to the new vehicle, only the results are presented. It is observed that the performance of the guidance system using the eGator in navigating the paths is similar to the performance when using the tractor.
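The error measures reported in Table 7-2 (average, standard deviation, maximum and RMS of the lateral offset) follow directly from the offsets measured along the lime lines. The following is a minimal illustrative sketch of that calculation; the sample offsets in main() are made-up numbers, not measured data from the experiments.

#include <cmath>
#include <cstdio>
#include <vector>

// Computes the performance measures of Table 7-2 from a set of lateral
// offsets (cm) measured between the wheel marks and the path center.
struct PathErrorStats { double avg, stddev, max, rms; };

PathErrorStats computeStats(const std::vector<double>& err) {
    PathErrorStats s{0.0, 0.0, 0.0, 0.0};
    if (err.empty()) return s;
    double sum = 0.0, sumSq = 0.0;
    for (double e : err) {
        double a = std::fabs(e);              // magnitude of the offset
        sum   += a;
        sumSq += a * a;
        if (a > s.max) s.max = a;
    }
    s.avg = sum / err.size();
    s.rms = std::sqrt(sumSq / err.size());
    double var = 0.0;
    for (double e : err) var += (std::fabs(e) - s.avg) * (std::fabs(e) - s.avg);
    s.stddev = std::sqrt(var / err.size());
    return s;
}

int main() {
    // Hypothetical offsets measured at roughly 3 m intervals along the track.
    std::vector<double> offsets = {1.0, 2.5, 0.5, 3.5, 1.5};
    PathErrorStats s = computeStats(offsets);
    std::printf("avg=%.1f std=%.1f max=%.1f rms=%.1f (cm)\n",
                s.avg, s.stddev, s.max, s.rms);
    return 0;
}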


In the experiments conducted for the headland turning maneuvers, the turns were executed around the last tree in the alleyway. The size of the headland is variable and depends on the boundary of the grove; therefore, it is not practical to specify an error measure for headland turning maneuvers. Miller et al. (2004) reported that the space required to turn at the headland and the time spent in turning significantly affect the field efficiency. If more time is spent turning in the headlands, the overall time for navigating the grove is increased and the efficiency of operation is reduced. The amount of space that can be used for turning in the headland depends on the amount of headland space available. Requiring more space for turning implies wasting useful space where more trees could have been planted. However, if the configuration of the grove is such that more space is available, then this space could be used for a faster turning maneuver. The amount of headland space used for turning can be related to the distance of the vehicle from the last tree in the alleyway while navigating the headland. Therefore, the average, maximum and minimum distances of the vehicle from the closest alleyway tree and the average time required to execute the turn are reported. The minimum and maximum distances are illustrated in the Figure 7-20.

Figure 7-20. Headland turning maneuver performance measures


The distances were measured and averaged over the three repetitions for each maneuver. The results are given in Tables 7-3 and 7-4.

Table 7-3. U-turn performance measures
Turn    Avg. Dist. (m)   Max Dist. (m)   Min Dist. (m)   Avg. Time (s)
Left    3.6              5.2             1.9             14
Right   3.8              5.3             1.8             14

Table 7-4. Switch back turn performance measures
Turn    Avg. Dist. (m)   Max Dist. (m)   Min Dist. (m)   Avg. Time (s)
Left    2.7              3.2             2               17
Right   2.7              3.6             1.9             17

From the results it can be observed that the average distance of the vehicle's inner wheel from the tree is around 3.7 m for the u-turn and 2.7 m for the switch-back turn. It can be inferred that less radial distance from the tree stump is required to execute a switch-back turn than a u-turn, which is as expected intuitively. The switch-back turn maneuver therefore allows the headland navigation to be executed within a tighter area than the u-turn. The performance of the vehicle in executing the turn depends on several factors, such as the minimum radius of curvature of the vehicle steering, the position of the vehicle when exiting the previous row and when entering the next row, the operating speed of the vehicle, the canopy size of the trees closest to the headland and the surface gradient of the headland. The difference in radial distance can be attributed to the minimum radius of curvature of the vehicle: because the minimum turning radius is larger than the radius of the desired turn, the vehicle has to take a wider u-turn to successfully complete the maneuver. The same cause explains the maximum radial distance of the vehicle, which is higher for the u-turn, around 5.3 m compared to 3.6 m for the switch-back turn. It is also observed that the minimum distance for both turning maneuvers is


similar, at around 2 m. This is because the minimum distance occurred at the location where the vehicle just exits the alleyway and begins to execute the headland maneuver. For both maneuvers, the vehicle exited the alleyway at a distance of approximately 2 m from the last tree stump. The average time for executing the u-turn was 14 s and for the switch-back turn 17 s, at an average speed of 1.8 m/s. For practical purposes, the times are similar for both maneuvers: the time taken to cover the longer distance of the wide u-turn is similar to the time spent reversing in the switch-back turn. The slightly longer time for the switch-back turn may be attributed to the time taken by the vehicle in stopping and then reversing from rest. Visually, the headland turning maneuvers were similar to human driving.

Conclusions

Headland turning maneuver algorithms were developed for an autonomous vehicle navigating a citrus grove. Algorithms using machine vision and sweeping ladar were developed to precisely detect the headland while the vehicle navigates the alleyways. U-turn and switch-back turning maneuvers were developed to navigate the grove headlands once the headland is detected and reached by the autonomous vehicle. The performance of the headland detection algorithm and the execution of the turning maneuvers were tested in a citrus grove. The average distance from the last alleyway tree was 3.8 m while executing the u-turn and 2.7 m while executing the switch-back turn. The switch-back turning maneuver was able to execute the turn within a smaller distance than the u-turn. However, the average time required to execute the u-turn was 14 s, compared to 17 s for the switch-back turn. With the addition of the headland detection algorithms and the turning maneuvers to an autonomous vehicle already capable of navigating the alleyways, the autonomous vehicle now possesses the ability to navigate an entire grove.


CHAPTER 8 OPEN FIELD NAVIGATION AND OBSTACLE DETECTION Introduction In the previous chapters, methods were described that allowed the autonomous vehicle to accurately navigate the alleyways of a citrus grove, detect headlands and turn at the headlands using u-turn and switch-back turn maneuvers and navigate subsequent alleyways. With these capabilities, the autonomous vehicle possesses the ability to navigate an entire grove autonomously. In large citrus producing farms, the groves can be arranged in the form of blocks separated by open fields or roadways. With the previously described capabilities, after the vehicle completes navigating a grove block, the vehicle has to be manually driven to another grove block. The vehicle would then autonomously navigate the new grove block. If the vehicle could autonomously navigate the area between grove blocks, the entire farm could be covered without human intervention. When the vehicle navigates the citrus grove alleyways or open fields, obstacles such as animals, humans or broken branches can be expected. When obstacles are present, the vehicle must be capable of detecting them and stopping the vehicle. If possible, the vehicle should also navigate around the obstacle. Within the grove alleyway, there is not much space to navigate around obstacles, whereas it might be possible in the open fields. The development of obstacle avoidance capability is beyond the scope of this project. This chapter discusses the development of open field navigation capability to navigate the open field between grove blocks and the development of obstacle detection capability. 246


Materials and Methods Open Field Navigation Open field refers to the open area available between grove blocks. The open field is expected to contain uneven terrain, random placement of trees and fences, and presence of objects such as cows. The vision based navigation and ladar based navigation methods described earlier were developed such that they could segment trees from the ground and navigate between trees. In open fields, there is no fixed placement of objects. The random nature of objects in open fields or lack of objects makes it difficult to develop vision or ladar based algorithms for navigating open fields and reach a different grove block. A global solution is required to solve this problem. GPS is an excellent global method for navigating open fields. Therefore, GPS based navigation for traversing open fields is proposed. A differential GPS (DGPS) receiver was available in the laboratory from a previous research on determining the tractor dynamics and is described in Chapter 3. GPS based navigation has been widely used by several researches for guiding autonomous vehicles through crop rows. The DGPS receiver available has a manufacturer specified accuracy of approximately 12inches. The receiver is also capable of operation at 5Hz with a DGPS fix i.e. enough satellite signals and the local station correction signal is available for high accuracy operation. For navigation based on GPS, the path that the vehicle must take should be specified as way points apriori. Using these points, the vehicle, with the aid of the GPS, can follow the points to reach the next grove block. The way points for this purpose would be the latitude and longitude coordinates of way points obtained using the DGPS receiver. When operating autonomously, the guidance system must be capable of using the latitude and longitude information to steer the vehicle to the required way point. 247


The important component of the algorithm consists of determining the steering angle of the vehicle to get the vehicle from the current location to a required location. The capability to steer to consecutive points can be implemented in the software. Consider the vehicle located at a point A with latitude latA and longitude lonA. Similarly, consider a way point B with coordinates latB and lonB. Suppose the vehicle is required to travel to B from A, it is necessary to determine the distance between the locations and the direction of location B with respect to location A. The direction can be specified with the geographic north-south direction as reference. To determine these, a navigation formula called Haversine formula, was used. The Haversine formula is an equation to determine great-circle distance between two points on a sphere. It is an adaptation of the more general spherical trigonometric formulas. To use the Haversine formula, considering the earth as a sphere, assumptions are important to obtain results with good accuracy. For example, when the points are situated at a large distance compared to the radius of curvature of the earth, the formula considering the earth as a sphere gives good accuracy; when the distance between the points is significantly small compared to the radius of the earth, the assumption of a flat earth can give accurate results and reduce computation; when the points are located on either side of hills or mountains, modifications have to be made to obtain accurate results. For this project, it is assumed that the way points would be located at a much smaller distance compared to the radius of the earth and no significant elevation different is present between the points. Therefore, the Haversine formula, modified for short distances was used. The Figure 8-1 shows the vehicle at A and a way point B where the vehicle must reach. 248


Figure 8-1. Illustration of vehicle navigating way points

The vehicle is heading towards a random point C with a heading angle m with respect to the geographic north. The straight line distance required to reach B is d, and a turn through an angle theta + m is required to reach B. The Haversine formula gives d and theta as

d = R * p (Eq. 8-1)

where R = radius of curvature of the earth = 6378140 m

p = 2 * tan^-1( sqrt(a) / sqrt(1 - a) ) (Eq. 8-2)

a = sin^2(dlat/2) + cos(latA) * cos(latB) * sin^2(dlon/2) (Eq. 8-3)

dlat = latB - latA (Eq. 8-4)

dlon = lonB - lonA (Eq. 8-5)

theta = mod[ atan2( sin(dlon) * cos(latB), cos(latA) * sin(latB) - sin(latA) * cos(latB) * cos(dlon) ), 2π ] (Eq. 8-6)

theta is positive for directions east of the geographic north and increases anti-clockwise from 0 to 2π in radians. The direction theta is the heading from location A to location B. To steer the vehicle, it is also necessary to determine the current heading of the vehicle with respect to the geographic north.
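A direct implementation of Eqs. 8-1 through 8-6 is sketched below. This is an illustrative reconstruction rather than the dissertation's navigation code; the function and variable names are chosen here for clarity, and the latitudes and longitudes are assumed to be in degrees as logged by the DGPS receiver.

#include <cmath>

// Great-circle distance (m) and bearing (rad) from point A to point B,
// following Eqs. 8-1 to 8-6. Inputs are in degrees.
struct RangeBearing { double distanceM; double thetaRad; };

static double deg2rad(double d) {
    const double PI = 3.14159265358979323846;
    return d * PI / 180.0;
}

RangeBearing haversine(double latA, double lonA, double latB, double lonB) {
    const double PI = 3.14159265358979323846;
    const double R  = 6378140.0;              // radius of curvature of the earth (m)
    double phiA = deg2rad(latA), phiB = deg2rad(latB);
    double dlat = deg2rad(latB - latA);       // Eq. 8-4
    double dlon = deg2rad(lonB - lonA);       // Eq. 8-5

    double a = std::sin(dlat / 2.0) * std::sin(dlat / 2.0)
             + std::cos(phiA) * std::cos(phiB)
             * std::sin(dlon / 2.0) * std::sin(dlon / 2.0);       // Eq. 8-3
    // atan2(sqrt(a), sqrt(1-a)) is equivalent to the tan^-1 form of Eq. 8-2
    double p = 2.0 * std::atan2(std::sqrt(a), std::sqrt(1.0 - a));

    double theta = std::atan2(std::sin(dlon) * std::cos(phiB),
                              std::cos(phiA) * std::sin(phiB)
                            - std::sin(phiA) * std::cos(phiB) * std::cos(dlon)); // Eq. 8-6
    theta = std::fmod(theta + 2.0 * PI, 2.0 * PI);                // wrap into [0, 2*pi)

    return RangeBearing{R * p, theta};        // Eq. 8-1
}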


This information was obtained from the IMU already mounted on the vehicle for the other algorithms; the IMU gives the heading of the vehicle with respect to the geographic north as m. The vehicle should be steered such that the angle m becomes the same as the angle theta (see the sketch below). It is to be noted that the DGPS receiver operates at a rate of 5 Hz and the IMU at a rate of approximately 40 Hz. An assumption in developing the algorithm is that the vehicle operates at a speed of less than 4 m/s. This speed limitation is based on the 5 Hz refresh rate of the DGPS receiver and was determined experimentally by trial and error. As the vehicle is steered in the proper direction, the distance between A and B reduces until the vehicle reaches location B.

The steering angle obtained from the above formulas is in radians. In the development of the pure pursuit steering described in Chapter 3, a protocol was used in which the PC sent the error of the vehicle in pixel coordinates to the PC104 computer through serial communication, and the error was converted to a steering angle by the pure pursuit algorithm in the PC104 computer. Therefore, to use the same protocol, the steering angle obtained above has to be converted to pixel coordinates before being sent to the Pure Pursuit steering controller. The physical range of the steering angles of the vehicle is approximately -30 degrees to +30 degrees from left to right, with 0 degrees as the center. From the above equations, the required turn angle theta + m has a range of 0 to π. It should be noted that the mathematical calculations can provide a range of 0 to 2π. Considering that a turn of 2π is the same as a turn of 0, and that, for example, a turn of 1.5π is the same as a turn of 0.5π in the opposite direction, the effective range can be reduced to 0 to π for steering.
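As a minimal sketch of the heading comparison just described, the required turn can be obtained by wrapping the difference between the bearing to the way point and the IMU heading into the -π to +π range. The helper below is illustrative only; the sign convention of the result must match the conventions actually used for theta and m.

#include <cmath>

// Wraps an angle (rad) into (-pi, pi] so that the shortest turn is chosen.
double wrapToPi(double angle) {
    const double PI = 3.14159265358979323846;
    while (angle >  PI) angle -= 2.0 * PI;
    while (angle <= -PI) angle += 2.0 * PI;
    return angle;
}

// Required turn (rad): the vehicle is steered until the IMU heading m
// matches the bearing theta to the current way point.
double requiredTurn(double thetaToWaypoint, double imuHeading) {
    return wrapToPi(thetaToWaypoint - imuHeading);
}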


In degrees, this is the same as 0 to 180 degrees, which can also be expressed as -90 degrees to +90 degrees. Comparing the vehicle's physical steering angle range and the required turn angle range, the conversion is from 0 to 180 degrees with 90 as the center to 0 to 60 degrees with 30 as the center. This is illustrated in the Figure 8-2.

Figure 8-2. Angle ranges of steering and vehicle turn

A linear relationship is formed between the ranges such that -30 to +30 degrees of the required vehicle turn is mapped one to one to -30 to +30 degrees of the physical steering angle of the vehicle. With this one-to-one relationship, the ranges from -90 to -31 degrees and from +31 to +90 degrees of the vehicle turn must also be mapped: the range of -90 to -31 degrees of the vehicle turn is mapped to the single value of -30 degrees of steering angle, and the range of +31 to +90 degrees is mapped to the single value of +30 degrees of steering angle. This saturation can be observed in the Figure 8-2, and a small code sketch of the mapping is given below.
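The following sketch illustrates the mapping just described, from the required vehicle turn (degrees, -90 to +90) to a commanded steering angle saturated at the physical limits of approximately ±30 degrees. The function name and constants are written out here for illustration; the guidance software performs this step inside its way point navigation routine.

// Maps a required vehicle turn (deg, -90..+90) to a steering command (deg),
// saturating at the physical steering limits of approximately +/-30 degrees.
double turnToSteeringDeg(double requiredTurnDeg) {
    const double STEER_LIMIT_DEG = 30.0;
    if (requiredTurnDeg >  STEER_LIMIT_DEG) return  STEER_LIMIT_DEG;
    if (requiredTurnDeg < -STEER_LIMIT_DEG) return -STEER_LIMIT_DEG;
    return requiredTurnDeg;   // one-to-one inside -30..+30 degrees
}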


From the previous sections, it is known that the image width used for vision is 400 pixels. To maintain the communication protocol with the PC104 computer algorithm, the steering range of -30 degrees to +30 degrees is mapped to 0 to 400 pixels of image coordinates:

Pixel value = 400 * (steering angle / 60) (Eq. 8-7)

where the steering angle is expressed in the equivalent 0 to 60 degree range with 30 degrees as the center. This pixel value of the required steering is sent to the Pure Pursuit steering control. As described above, when suitable steering is executed, the distance between the vehicle and the way point reduces until the vehicle reaches the way point.

It is known that the accuracy of the DGPS receiver is 12 inches. During the initial development of the algorithm, the vehicle was considered to have reached the way point when the distance between the vehicle and the way point was zero. It was observed that uncertainties in the accuracy of the receiver, terrain variation and small inaccuracies in steering often result in the vehicle reaching a point very close to the way point, of the order of centimeters away. In such cases, if the algorithm requires the distance to be zero, then to translate the vehicle by the last few centimeters the vehicle has to execute circles or spirals around the way point. This is illustrated in the Figure 8-3.

Figure 8-3. Path taken for translation of the vehicle to the way point
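Combining the saturation step with Eq. 8-7 gives the value actually sent over the serial link. The sketch below shifts the steering command into the 0 to 60 degree range before scaling; the constants mirror those stated in the text, and the function name is illustrative only, not the dissertation's code.

// Converts a saturated steering command (deg, -30..+30) into the 0..400 pixel
// range expected by the Pure Pursuit controller on the PC104 computer (Eq. 8-7).
int steeringToPixel(double steeringDeg) {
    double shifted = steeringDeg + 30.0;        // express in 0..60 deg, center at 30
    int pixel = static_cast<int>(400.0 * shifted / 60.0 + 0.5);
    if (pixel < 0)   pixel = 0;                 // clamp to the image width
    if (pixel > 400) pixel = 400;
    return pixel;                               // 200 corresponds to straight ahead
}

For example, a centered steering command of 0 degrees maps to pixel 200, which matches the center pixel value used by the alleyway navigation code in Appendix C.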


An accuracy of exactly reaching the way point is not practical in the physical system. Therefore, by trial and error in tests, it was chosen that if the vehicle reaches a point within a distance of 50cm from the way point, the way point is determined to have been reached by the vehicle. Obstacle Detection The obstacle detection capability should allow the vehicle to detect obstacles such as humans, animals and location variable objects such as vehicles, while navigating open fields. For the scope of this dissertation, a simple obstacle detection method using ladar was developed. It is assumed that the way points for GPS based navigation were chosen such that the vehicle would not be expected to get tree limbs or fixed objects such as thin poles in its path. For the development of the algorithm for detecting obstacles, the first step was to analyze the data from the ladar while navigating the open field. As mentioned earlier, the ladar scans a plane from 0 degree to 180 degrees and obtains radial distance to objects in front of the ladar. The radial data is converted to Cartesian coordinates using the geometrical formula Frontal distance = (radial distance)(Sin (scan angle)) (Eq. 8-8) Lateral distance = (radial distance)(Cos (scan angle)) (Eq. 8-9) The same formula was also used in the algorithm for ladar based grove alleyway navigation. The Figure 8-4 shows the plot of the frontal distance obtained from the ladar against the scan angle when there is no obstacle in front of the vehicle. From the chart, it is seen that the closest distance in front of the vehicle at 90 degree angle corresponds to the ground directly in front of the vehicle. The ground corresponds to a distance of approximately 6m frontal distance from the ladar. For scan angles away from the central angle of 90 degrees, the distance corresponds to ground locations away from the center of the vehicle. 253
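Equations 8-8 and 8-9 convert each ladar return from polar to Cartesian form. A minimal sketch of this conversion over a 0 to 180 degree scan is given below; the half-degree resolution (361 readings) is assumed from the ladar code in Appendix C, and the names used here are illustrative.

#include <cmath>
#include <cstddef>
#include <vector>

struct ScanPoint { double frontalCm; double lateralCm; };

// Converts raw radial ladar readings (cm) into frontal and lateral distances
// (Eqs. 8-8 and 8-9). A 0-180 degree scan at 0.5 degree steps is assumed.
std::vector<ScanPoint> toCartesian(const std::vector<double>& radialCm) {
    const double PI = 3.14159265358979323846;
    std::vector<ScanPoint> pts(radialCm.size());
    for (std::size_t i = 0; i < radialCm.size(); ++i) {
        double angleRad = (0.5 * i) * PI / 180.0;             // scan angle in radians
        pts[i].frontalCm = radialCm[i] * std::sin(angleRad);  // Eq. 8-8
        pts[i].lateralCm = radialCm[i] * std::cos(angleRad);  // Eq. 8-9
    }
    return pts;
}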


Figure 8-4. Ladar data when an obstacle is not present in front of the vehicle (radial distance in cm versus scan angle in degrees)

The Figure 8-5 shows the scan data from the ladar when an obstacle is present directly in front of the vehicle. From the data, it is observed that for most of the angles the ladar data is identical to the previous scan, and this corresponds to open field. However, at scan angles of 85 degrees to 95 degrees, the frontal distance drops to approximately 450 cm from the ground reading, which was approximately 6 m in the previous scan shown above. This large change in the data corresponds to the presence of an obstacle in front of the vehicle. It can also be interpreted as the obstacle cutting the laser beam before it reaches the ground, thereby returning a smaller distance (Figure 8-6).

Figure 8-5. Ladar scan data with an obstacle present in front of the vehicle (radial distance in cm versus scan angle in degrees)


Figure 8-6. Illustration of the obstacle detection using ladar

The obstacle detection algorithm is based on the analysis of the data described above. The algorithm is as follows (a code sketch of this check is given below, after the flowchart):

1. Prior to operation in autonomous mode, when the vehicle is stationary, obtain the ladar radial distance at the 90 degree scan angle and take this as the ground distance from the ladar.
2. Operate the vehicle in autonomous open field navigation mode.
3. Monitor the ladar data continuously and look for sudden drops of more than 50 cm from the ground distance.
4. If a drop is encountered at an angle, move a data window of length 5 through the 5 subsequent readings. If all of these readings are also more than 50 cm below the ground distance, declare the presence of an obstacle.
5. If an obstacle is declared, stop the vehicle.
6. Continue monitoring the scan data for the presence of the obstacle. If no obstacle is found, continue navigating using GPS based navigation.

The stop signal is sent to the PC104 computer as a special stop command to reduce the speed to zero; no change is made to the steering control. The algorithm uses a data window of 5 data points to detect obstacles, which corresponds to an angle of 3 degrees. It was determined experimentally, by trial and error, that an angle of 3 degrees translates to sufficient width to detect an obstacle such as a human; wider obstacles such as cows are


also expected to be detected. The threshold distance of 50 cm was chosen so that small terrain elevation changes are not detected as obstacles. The angular width of 3 degrees may not be sufficient to detect objects such as poles; however, this is useful in preventing tall grass from being detected as an obstacle. The flowchart of the algorithm is shown in the Figure 8-7.

Figure 8-7. Flowchart of the obstacle detection algorithm
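The windowed check at the core of the algorithm can be sketched as follows. This is an illustrative reconstruction of the logic described above, not the code used on the vehicle; the scan is assumed to contain 361 half-degree readings as in Appendix C, and the ground distance is assumed to have been captured while the vehicle was stationary.

#include <vector>

// Returns true if an obstacle is detected in one ladar scan.
// 'radialCm' holds a 0-180 degree scan; 'groundCm' is the radial distance to
// the ground at the 90 degree angle, recorded before autonomous operation.
bool obstacleDetected(const std::vector<double>& radialCm, double groundCm) {
    const double DROP_CM = 50.0;   // threshold against small terrain changes
    const int    WINDOW  = 5;      // 5 consecutive readings (~3 degrees per the text)
    int consecutive = 0;
    for (double r : radialCm) {
        if (r < groundCm - DROP_CM) {
            if (++consecutive >= WINDOW) return true;   // obstacle confirmed
        } else {
            consecutive = 0;       // drop was not sustained; keep scanning
        }
    }
    return false;
}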


Experimental Methods

GPS Based Navigation

Experiments were conducted to test the performance of the GPS based guidance in navigating open fields. For the experiments, an open area was chosen near Fifield Hall on the University of Florida campus. The open field was uneven terrain covered with grass. In navigating the open areas between groves, it is expected that the GPS coordinates would be provided a priori as way points for the vehicle to follow. The guidance system was developed to navigate citrus grove alleyways only, and the vision and ladar algorithms are not capable of navigating unknown regions in open fields. Therefore, the distances between way points should be such that only straight line travel is necessary between them. Accordingly, the way points were chosen approximately 13 m to 16 m apart, representative of distances requiring only straight line travel between way points; the exact distance between the way points was not controlled. Marker flags were manually placed on the ground representing the approximate locations of the way points. The way point locations were chosen such that the vehicle would be required to travel in different directions to complete the navigation. This way, direction dependent errors from the IMU and the GPS could be included in the tests. The location of the way point flags for the experiment is shown in the Figure 8-8. The flags are indicated by the small circles in the figure.

Figure 8-8. Waypoint locations and the vehicle's path for the GPS based navigation


The way point flags are numbered based on the sequence in which the vehicle is required to navigate the way points. The path taken by the vehicle is illustrated using dashed arrow lines in the figure. It is to be noted that the way points are located beside the way point flags and not exactly at the location of the flags; this was done to prevent the vehicle from driving over the flags. The location of the geographic east is also shown in the figure. For the test runs, the start point was located some random distance west of the last, or 8th, way point. The experimental procedure was as follows:

1. Prior to the start of the experiments, the GPS receiver was turned on for approximately 15 min until it acquired enough satellites to obtain a DGPS fix and provide accurate data.

2. Collect way points in the form of GPS latitude and longitude coordinates: The DGPS receiver in the vehicle was turned on to receive the GPS coordinates using the software. The vehicle was manually driven to a random location beside the first way point flag and stopped, and the software was used to save the latitude and longitude coordinates of the point in a text file. The vehicle was then driven to random locations near the subsequent way point flags, stopped, and the latitude and longitude were saved to the same text file. Sample coordinates saved in the text file are shown below.

29.6444675,82.3460763 (way point 1)
29.6444933,82.3464116 (way point 2)
29.6444341,82.3467433 (way point 3)
29.6446468,82.3467340 (way point 4)
29.6446616,82.3464736 (way point 5)
29.6446500,82.3461150 (way point 6)

3. The coordinates were comma delimited, and the saved coordinates were in degrees as converted by the software. At each way point saving location, the center of the vehicle's width, directly below the DGPS receiver, was marked on the ground using a colored marker. This represented the exact physical location of the way point.

4. Provide the way points in a text file for the guidance system to read: The text file with the latitude and longitude coordinates of the way points was given as input to the navigation software.

5. Pick a start point and start the vehicle in autonomous GPS based navigation mode: The vehicle was manually driven to a random location a few meters west of the last way point.


The vehicle was stopped, the autonomous GPS based navigation algorithm was run in the software, and the vehicle was switched to autonomous mode and allowed to navigate.

6. Allow the vehicle to follow the way points until it reaches the last way point: The vehicle was allowed to navigate autonomously through the way points in the numbered sequence mentioned earlier. As the vehicle reached each way point, the location of the center line of the vehicle's width, at the point closest to the marked way point, was marked on the ground using a colored marker.

7. The vehicle was programmed to stop after reaching the last way point.

8. The distance between the marked way point and the newly marked navigated point was measured manually using a tape measure. This distance represents the error of the vehicle in reaching the way point. This was performed for all the way points.

9. After taking the measurements, the markers were removed.

Three test runs were conducted for repeatability, and the same way points were used for the three repetitions. The average speed of the vehicle during the test runs was 1.8 m/s.

Obstacle Detection

Experiments were conducted to verify the obstacle detection algorithm's ability to detect obstacles. Two different types of obstacles were chosen: a human of height approximately 1.8 m and a cubical box of height approximately 0.9 m. In these experiments, the vehicle was started in autonomous mode from a stationary position and allowed to travel using GPS based navigation from one point to another. In one experiment, the box was placed stationary in the path of the vehicle, and the vehicle was allowed to navigate the path, detect the obstacle and stop until the box was removed. In a second experiment, the vehicle was allowed to navigate as in the first experiment while a human walked slowly into the path of the vehicle; the vehicle was allowed to navigate, detect the human obstacle and stop until the walking human moved out of the range of the vehicle's path. The purpose of the experiments was to confirm the obstacle detection capability, and the distance of the vehicle from the obstacle when stopping was observed. Three repetitions were performed by placing the obstacles at random locations in the path of the vehicle.


Results and Discussion

GPS Based Navigation

From the experiments, the distance of the center of the vehicle's width from the way point was reported as the error in reaching the way point. The distances were averaged over the three repetitions and over all the way points and reported as the average error of the GPS based navigation for the experiment. The results are provided in the Table 8-1.

Table 8-1. Error of the vehicle for GPS based navigation experiments
Distance from way point (cm)
Way point    1    2    3    4    5    6    7    8
Trial 1      31   55   39   21   38   18   46   12
Trial 2      45   43   15   6    17   23   54   26
Trial 3      27   34   24   43   40   31   35   29

From the results, it can be seen that the error is distributed among all the way points. The average error over all way points was 31.75 cm. The errors cannot be attributed entirely to the navigation of any particular way point or to a terrain change near one way point. As mentioned earlier, the DGPS receiver is accurate to 12 inches, or approximately 30 cm, as specified by the manufacturer. Also, the GPS based navigation algorithm considers a way point as reached if the vehicle is less than 50 cm away from it. With the accuracy of the receiver, accuracy better than 30 cm cannot be expected for all test runs. Considering the accuracy of the DGPS receiver and the allowable error in the algorithm, the average error of 31.75 cm seems to be of good accuracy. However, it should be noted that the navigation accuracy depends on several factors, such as the number of satellite signals being received by the DGPS receiver, errors in the DGPS measurements due to motion of the vehicle, errors due to uneven terrain and


presence of trees in the vicinity blocking the signal. Some of the low errors of 6cm or some of the high errors of 54cm may be attributed to the vehicle navigating a subsequent way point in a direction of motion resulting in error or to DGPS accuracy at a particular instant of time. Therefore, the individual errors and average error are only an indication of the validity of using the DGPS receiver to navigate the open fields and may not be taken as a measure of accuracy of the guidance system. The errors also indicate the amount of permissible error that is expected to be available for the vehicle to navigate the open fields. While navigating fields between citrus groves, the way points should be chosen to allow sufficient error or if higher accuracy is required, ladar or vision based navigation could be used to augment the guidance performance. Obstacle Detection From the experiments conducted for the obstacle detection, it was observed that the guidance system was able to detect both the obstacles during every test run. Both the obstacles were detected at a distance of approximately 5m in front of the vehicle. The mounting angle of the ladar was such that the ladar scan could detect the ground at approximately 6m. Therefore, when the obstacles crossed the ladar scan prior to the ground, they could be detected. It was also observed that the vehicle remained stopped until the obstacles were removed. These observations confirm the ability of the obstacle detection algorithm to detect obstacles of the size of humans or boxes. It is expected that that animals such as cows or goats would also be detected due to their size. However, the obstacle detection system is not expected to detect tall grasses or thin poles, due to the algorithm specifications. Extensive tests using various types of obstacles are required before the ability of the system to detect different type of obstacles can be confirmed. As the obstacle detection algorithm is based only on ladar, it is uncertain what kind of objects the laser beam can reflect off of. A more robust obstacle detection system could include vision in combination with the ladar. However, addition of vision could cost significant processing time. 261


Conclusions GPS based navigation method was developed to navigate open fields between grove blocks. The method used way points to navigate the vehicle in small distances. A DGPS receiver was used to collect way point coordinates. The way points were provided as latitude and longitude coordinates. The coordinates were used to steer the vehicle to each of the way points. Experiments were conducted to test the performance of the open field navigation system. The vehicle was able to reach all of the way points with an average error of 31.75cm from the way point location. This accuracy was comparable to the accuracy of the DGPS receiver. An obstacle detection algorithm was developed to detect obstacles while navigating open fields. The algorithm used the ladar for detecting obstacles. Tests were conducted to verify the obstacle detection ability. The algorithm was successfully able to detect human and box obstacles. The algorithm is expected to detect obstacles of the size of cows and animals of similar size. 262


CHAPTER 9 RESEARCH SYNTHESIS AND OVERVIEW Cost and limited labor availability project robotics and automation as promising methods of operating citrus groves. Autonomous vehicles form an integral part of such an initiative. The development of guidance system for an autonomous vehicle operating in citrus groves was the subject of this dissertation. Previous efforts at developing autonomous vehicles were discussed in Chapter 2. Most of the efforts have been for developing autonomous vehicles for use in crop row navigation. Current commercial systems also target crop rows. This research furthers the development of autonomous vehicles for use in citrus groves. This research was started with a tractor as the base vehicle for the development of the guidance system. The tractor was readily available and suited for the application. The tractor was retro-fitted with an electro-hydraulic valve to enable it to be controlled using electric signals. The retrofit ensured retaining the manual operation of the vehicle. The electro-hydraulic valve was calibrated to determine the voltage and time required to change the steering angle. During the calibration, viscosity of the hydraulic fluid was found to play an important role in the steering time. A rotary encoder was mounted on the wheel shaft and calibrated to determine the steering angle. This provided closed loop control of the steering. The dynamics of the vehicle steering system was determined experimentally. A DGPS receiver was primarily used for the experiments to determine the position of the vehicle. The dynamics of the vehicle provided a mathematical expression for simulating the vehicle for developing the steering control system. PID based steering control system was developed by simulation using the vehicle dynamics model. The steering control was implemented in a microcontroller using C++ for software development. For path determination, machine vision and ladar were chosen as the primary sensors. Previous efforts have used GPS receiver as the primary sensor for navigation. Presence of tall 263


trees in or around the grove does not encourage the use of GPS based systems in the grove. A video camera was used as sensor for machine vision development. The two primary sensors were mounted on top of the tractor cabin looking ahead of the vehicle. To develop the machine vision algorithms, first a database of images was obtained from the grove. The images were obtained throughout the day. The images accounted for varying cloud conditions and shadow conditions in a grove alleyway over a day. The image database was used to develop the machine vision algorithm. The algorithm used color based thresholding followed by morphological operations to segment trees and the path. An important constraint during the development of the algorithm was the processing time available in the PC. The PC was expected to run the ladar, the IMU, the GPS and all of the high level algorithms. All these processes running in a single processor limited the processor time available for each algorithm. A ladar based path segmentation algorithm was also developed. The algorithm was based on distance of objects in reference to the vehicle. The two methods were able to segment the path and the tree. The developed path determination algorithms together with the steering control system were implemented in software. Experiments were conducted in custom designed test tracks using hay bales as path boundaries. The main aim of the experiments in the test track was to ensure proper operation of the various components of the guidance system including sensor interface, algorithms and the software interface between the components. Experiments were then conducted in citrus grove alleyways to verify navigation of the alleyways by the guidance system. The experiments also helped determine the accuracy of the guidance system in navigating the vehicle at the center of the alleyway. Vision based navigation was found to navigate the path well. The ladar based algorithms developed in the test track were not directly 264


adaptable to the grove conditions. Filtering of the ladar data was required due to uneven tree canopy. The experiments using independent machine vision and ladar for navigating the grove alleyways provided insights into the capability of these sensors. Machine vision had a long field of view and navigation was not as accurate as the ladar. At the end of the alleyways, machine vision capability was lost due to the longer field of view beyond the alleyway. Ladar had shorter and more accurate field of view but was inaccurate for short durations in the absence of a tree on the alleyway. To utilize the advantages of the sensors, fusion algorithms were developed to fuse the information from these two sensors. Kalman filter was chosen as the primary fusion method. Fuzzy logic was used to select the appropriate sensor for the end of the alleyway and while missing trees are encountered. Inertial measurement unit and a speed sensor were added to aid the fusion process. Experiments were conducted in test tracks to ensure proper operation of the guidance system. Experiments were then conducted in citrus grove alleyways. Fusion based guidance was found to perform better than individual sensor based guidance. The navigation of the center of the path was also more accurate. From the experiments using the tractor, it was found that transporting the tractor to the grove for testing was expensive. To further the abilities of the vehicle, speed control, reversing and start-stop capability was important. A golf cart type vehicle, eGator was donated to the research by John Deere. The eGator was an electric vehicle. The vehicle was suited for easy control of the steering, speed and adding star-stop capability. It was also convenient to transport the vehicle. However, documentation on the vehicle was minimal. The vehicle was reverse engineered to be used for autonomous operation. The entire guidance system was adapted for the 265


eGator. Pure Pursuit steering control algorithm was adopted for steering the eGator. The method does not require the knowledge of the vehicle dynamics. It was also intuitive to develop. After navigating the alleyway of a citrus grove, the vehicle should be able to enter subsequent alleyways by turning at the headlands. This would provide the vehicle with the ability to completely navigate the grove. Algorithms were developed to turn the vehicle at the headlands of the grove. The algorithm used machine vision for detecting the approach of the headland and sweeping ladar to accurately determine the location of the headland. U-turn and switchback turning algorithms were developed to turn at the headlands. Experiments were conducted to verify the ability to turn at the headlands. The switch-back turn is capable of turning within a smaller area but took slightly more time to execute. The headland turning ability allowed the vehicle to completely navigate a grove block. Often large citrus groves are separated into smaller grove blocks. An autonomous vehicle capable of moving from one grove block to another would be an added capability. The area between the grove blocks is expected to be an open field or roadway. GPS based open field navigation algorithm were developed. The algorithm used latitude and longitude coordinates of way points to navigate the vehicle. In open fields, obstacles are expected. Therefore, obstacle detection algorithms using ladar was developed. The vehicle possesses the ability to stop in the presence of obstacles. Experiments were conducted to verify the ability of the vehicle to navigate open fields using way points and to test the obstacle detection capability. Open field navigation together with obstacle detection enables the vehicle to traverse the area between one grove block to another. The vehicle now possesses the ability to completely navigate a citrus grove. 266


CHAPTER 10 CONCLUSION AND FUTURE WORK Conclusions The development of the autonomous vehicle guidance for navigating citrus groves contributed to the global body of knowledge on autonomous vehicles both for agricultural and non-agricultural applications. The research specifically contributes to the research efforts in utilizing robotics for citrus grove operations. The development of the autonomous vehicle guidance shows machine vision and ladar based systems as promising methods for grove operations as an important alternative to GPS based guidance systems for crops. The presence of shadows is a problem in many outdoor applications using machine vision. The adaptive machine vision thresholding technique was found to be a method for dealing with the presence of shadows. High accuracy in navigating the center of the grove alleyways was achieved using sensor fusion based guidance. Utilizing the reliability of different sensors for varying environmental conditions, while fusing the sensor information, was an important milestone in improving the reliability of the overall guidance system. Headland turning ability gives the ability to completely navigate the entire grove block. Sweeping ladar based detection of headlands is an important contribution in this effort. GPS based open field navigation and obstacle detection abilities enable the autonomous vehicle to navigate the area between grove blocks. All of the experiments for this development were conducted in one grove. Further efforts are needed in testing the vehicle in a variety of groves and tuning the algorithms to ensure reliability of the autonomous vehicle guidance in all groves. All the objectives of this research were successfully completed. The vehicle currently possesses the ability to completely navigate grove blocks and travel from one grove block to another. 267


Future Work The development of the autonomous vehicle guidance for navigating citrus groves is presented in this dissertation. Although all the objectives of this research were completed, significant future work is required before the guidance system can be reliably used and commercialized. This research on vehicle guidance encompassed a vast area and includes numerous sub components for the operation of the overall guidance system. Each component of the system invites significant further research. The choice of the vehicle in this research was guided by the availability of the vehicle. Investigation is required to determine the suitability of a vehicle for the application. The choice of control systems namely PID and Pure Pursuit were based on previous research. Other control systems for navigating groves could be explored. The machine vision algorithm presented in this work utilizes color based segmentation of trees and path. The development of the algorithm was limited by the computer processing time available when running multiple algorithms on a PC. Other machine vision methods such as texture and shape of objects would make the segmentation more reliable. This choice is largely guided by the processing power available and optimal programming techniques, which were not the focus of this work. Improved software utilization such as parallel processing could be investigated for this purpose. Use of multiple processors is a suitable option for long term improvement of the guidance system. Windows operating system used in this work runs numerous non-essential operations in the background. This reduces the processing time available for real time operation of the guidance system. A leaner real time operating system should be explored. Obstacle avoidance ability was not pursued in this research. Within the grove alleyways, the amount of space available for the vehicle to navigate is small. It is to be determined whether obstacle avoidance is feasible within the alleyway. For navigating open fields, obstacle detection 268


capability was included. The system requires the obstacle to be moved from the path for the vehicle to continue navigation. In autonomous operation, movement of obstacles such as animals cannot be expected. Obstacle avoidance ability could be incorporated to navigate around the obstacle. The avoidance algorithm may be required to account for moving obstacles such as moving animals. The variations in the planting of trees differ from grove to grove. The headlands also differ. For example, one particular grove where the autonomous vehicle was tested consisted of deep pits at the headlands for drainage. The headland turning algorithms developed in this work were not capable of maneuvering the pits and the navigation failed. Therefore, all the varying grove conditions need to be captured for improving the algorithms. Rigorous testing in several groves is also required for ensuring reliability. 269


APPENDIX A CODE SEGMENT FOR PID BASED STEERING CONTROL The relevant sections of the code used in the software for the implementation of the PID based steering controller are presented here. /********************************************************************* PID based steering control for steering the tractor Inputs: Error from the PC through serial communication Outputs: Voltage to the electro-hydraulic valve *********************************************************************/ // e=error,eo=previous error,de=change in error int Kp,Kd,Ki,e,eo,de,errorinteg,Gain; void main(void) { // Microcontroller 586E initialization sc_init(); //Data initialization for(i=0;i<2001;i++) { angle_data[i]=0; time_data[i]=0; } // Decoder Variables initialized clkt_sel(5); // 36.864 MHz hp1=0; hp2=0; //Initialize serial communication c1 = &ser1_com; baud = 9; /* 38400 baud for SER1 */ isize=MAXISIZE; osize=MAXOSIZE; s1_init(baud,ser1_in_buf,isize,ser1_out_buf,osize,c1); delay_ms1(200); // Analog output initialize // Center point of steering dac=38000; 270


// Various useful voltage values output of the AD converter in microcontroller //0 = 0V, 10000= 0.383V,20000=0.764 //30000= 1.145V, 40000=1.526V,50000= 1.907V //60000=2.288V, 65520=2.499V // Voltage output of amplifier // tern 0 = 3.46 V = 0, tern 1.25= 6.54V = 37620, tern 2.5 = 9.66V = 65520 // Send values out to the ports outport(0x18E0,dac); outport(0x18E2,dac); outport(0x18E4,dac); outport(0x18E6,dac); // decoder operation initialized hp2=inportb(HP2+2)+(inportb(HP2)<<8); // Initial decoder Output value position_center=hp2; // saving the value as center actual_position_current=0; actual_position_previous=0; position_current=hp2; // Initialize PID variables e=0; eo=0; de=0; errorinteg=0; Gain=0; // Start the control loop while(1) { j=0;j1=0;disp=0; ptm=0; Gain=0; hp2=inportb(HP2+2)+(inportb(HP2)<<8); // Decoder Output if(hp2>=200 || hp2<=-200) // Limits of decoder { hp2=hp2prev; } hp2prev=hp2; // Serial data received from PC if(serhit1(c1)) { 271


m=getser1(c1); // Get the start of data character if(m==112) { if( serhit1(c1) ) { // sent by PC j = getser1(c1); // get the 1st character } if( serhit1(c1) ) { // hit by PC1 j1 = getser1(c1); // get the 2nd character } if(serhit1(c1)) { j2=getser1(c1); // get the 3rd character } if(serhit1(c1)) { j3=getser1(c1); // get the 4th character } // Reforming the data from the characters received if(j!=45 && m==112) // Right turn steering { if(j2!=0) { disp=(j-48)*100+(j1-48)*10+j2-48; } else if(j1!=0) { disp=(j-48)*10+j1-48; } else { disp=j-48; } } else if(m==112) // left turn steering { if(j3!=0) 272


{ disp=-((j1-48)*100+(j2-48)*10+j3-48); } else if(j2!=0) { disp=-((j1-48)*10+j2-48); } else { disp=-(j1-48); } } } } delay_ms1(30); //30ms delay // PID control if(disp!=0) // non-zero error disp { //if disp is +, move right, if -, move left e=disp; Kp=1; // proportional gain Kd=4; // Derivative gain Ki=0.09; // Integral gain de=e-eo; // change in error eo=e; // update previous error errorinteg=errorinteg+e; // integral error Gain=Kp*e+Kd*de+Ki*errorinteg; // PID gain ptm =0.5*Gain *140/20; //Angle to Pulse conversion: 20 degrees = 140pulses ptm=ptm+hp2; // change in steering angle //Limits for steering if(ptm>180) { ptm=180; } else if(ptm<-180) { ptm=-180; } 273


//Angle to voltage conversion if(ptm>0 && ptm<50) { dac=45000+100*ptm; if(dac>50000) { dac=50000; } } else if(ptm>50) { dac=50000+30*ptm; } else if(ptm<0 && ptm>-50) { dac=29000+100*ptm; if(dac<24000) { dac=24000; } } else if(ptm<-50) { dac=24000+30*ptm; // Output voltage to Valve } else { dac=38000; } } // if no steering is required else if(ptm==0) { dac=38000; // straighten steering } //Voltage Limiter---------------------------------// Voltage is limited to maximum values for safe operation if(dac>55000) {dac=55000;} else if(dac< 19000) {dac=19000;} 274


//Output voltage to Tern---------outport(0x18E0,dac); outport(0x18E2,dac); outport(0x18E4,dac); outport(0x18E6,dac); } // Reset steering to center before quitting dac=38000; outport(0x18E0,dac); outport(0x18E2,dac); outport(0x18E4,dac); outport(0x18E6,dac); } 275


APPENDIX B CODE SEGMENT FOR VISION BASED PATH SEGMENTATION /* Code segment for vision based path segmentation Function: Imagproc() Input: Pointer to image data in memory Output: Lateral error of vehicle in the path Calling Function: OnPaint() Variables: ProcDib: Memory pointer for image data PW: Window size */ //************************************************ // Thresholding assuming presence of shadows //************************************************ for(int i=1;i<=240;i++) for( int j=1;j<=640;j++) if( *(ProcDib+41+i*1920+j*3)<=45) { *(ProcDib+41+i*1920+j*3)=150; *(ProcDib+41+i*1920+j*3+1)=0; *(ProcDib+41+i*1920+j*3+2)=0; if(j<320) left_count=left_count+1; else right_count=right_count+1; } //******************************************** // If non-shadow image, new thresholding //******************************************** if(left_count<7000 && right_count<7000) { for(int i=1;i<=240;i++) for( int j=1;j<=640;j++) if( *(ProcDib+41+i*1920+j*3)<=80) { *(ProcDib+41+i*1920+j*3)=150;//G *(ProcDib+41+i*1920+j*3+1)=0;//R *(ProcDib+41+i*1920+j*3+2)=0;//B } } 276


//******************************************* // Moving a 20*20 window for painting regions //****************************************** for(int i=1;i<240;i+=20) for(int j=1;j<640;j+=20) { int countg=0; for(int k=0;k<20;k++) for(int l=0;l<20;l++) { if(*(ProcDib+41+(i+k)*1920+(j+l)*3)==150 && *(ProcDib+41+(i+k)*1920+(j+l)*3+1)==0 && *(ProcDib+41+(i+k)*1920+(j+l)*3+2)==0) countg=countg+1; } if(countg>40) { for(int k=0;k<20;k++) for(int l=0;l<20;l++) { *(ProcDib+41+(i+k)*1920+(j+l)*3)=150; *(ProcDib+41+(i+k)*1920+(j+l)*3+1)=0; *(ProcDib+41+(i+k)*1920+(j+l)*3+2)=0; } } } //******************************************** //Segmentation //Painting regions to remove salt and pepper noise //and can be used for determining image with lot of shadow //******************************************** #define PW 15 //window size 15 for(int h=MyHeight;h>=0;h-=PW) { for(int w=0;w

for(int l=0;l50) { for(int k=0;k

fitter[left][h]=0; // clear left fitting start=0;finish=0; for(int j=MyWidth/2;j>=0;j--)//middle to left end { u = ((h*MyWidth)+j)*3; //if you see white if(start==0 && *(ProcDdb+u+blu)==255) { start=j; } // if you see black after seeing white if(start!=0 && *(ProcDdb+u+blu)==0 && finish==0 ) { finish=j-1; fitter[left][h]=finish; if(left_startx==-1) {left_startx=finish;left_starty=h;} if(left_endx!=-1) {left_endx=-1;} } } if(left_starty!=-1 && fitter[left][h]==0 && left_endx==-1) { left_endx=fitter[left][h-1];left_endy=h-1;} } //----Right side wall--------------------------------for(int h=MyHeight/2-100;h<=MyHeight-100;h++) { fitter[right][h]=MyWidth; // clear right fitting start=0;finish=MyWidth; for(int j=MyWidth/2;j<=MyWidth;j++) { u = ((h*MyWidth)+j)*3; if(start==0 && *(ProcDdb+u+blu)==255) { start=j; } if(start!=0 && *(ProcDdb+u+blu)==0 && finish==MyWidth) 279

      {
        finish=j;
        fitter[right][h]=finish;
        if(right_startx==-1) { right_startx=finish; right_starty=h; }
        if(right_endx!=-1)   { right_endx=-1; }
      }
    }
    if(right_starty!=-1 && fitter[right][h]==MyWidth && right_endx==-1)
    { right_endx=fitter[right][h-1]; right_endy=h-1; }
    if(left_endy
  if(fitter[left][i]==0 && fitter[left][i-1]!=0)
  { fitter[left][i]=fitter[left][i-1]; }
  if(fitter[right][i]==400 && fitter[right][i-1]!=400)
  { fitter[right][i]=fitter[right][i-1]; }
}
// goto skipper;
GetBoundary(left);    // Line fitting left boundary
GetBoundary(right);   // Line fitting right boundary
for(int h=MyHeight/2-100;h<=MyHeight-100;h++)
{
  x[left]=al[left]+bl[left]*h;
  wall[left][h]=x[left]-50;
  if(x[left]>0 && x[left]<400)
  {
    if(brick[left][0]==-1) { brick[left][0]=wall[left][h]; }
    else                   { brick[left][1]=wall[left][h]; }
  }
  x[right]=al[right]+bl[right]*h;
  wall[right][h]=x[right]+60;
  if(x[right]>0 && x[right]=0 && wall[right][h]<=MyWidth)

{
  fitter[middle][h]=(wall[right][h]-wall[left][h])/2;      // distance between walls
  fitter[middle][h]=wall[left][h]+fitter[middle][h]+40;    // centered between left & right walls
  if(wall[left][h]==0 && wall[right][h]==MyWidth)
  { fitter[middle][h]=fitter[middle][h-1]; }
  //if only one side wall is visible, travel at 4ft=108pix from the visible wall
  else
  {
    if(wall[left][h]>0 && wall[right][h]>=MyWidth)
    { fitter[middle][h]=wall[left][h]+208; }
    else if(wall[left][h]<=0 && wall[right][h]0)
    {
      if(brick[middle][0]==-1) { brick[middle][0]=x[middle]; }
      else                     { brick[middle][1]=x[middle]; }
    }
  }
// Line for true center------------
int w=0;
w=(brick[middle][0]+brick[middle][1])/2;
ppp=w;

for(int h=(MyHeight-100-20);h<=(MyHeight-100+20);h++)
{
  // used for error calculation
  u = ((h*MyWidth)+w)*3;
  *(ProcDdb+u+blu)=200;   // blue
  *(ProcDdb+u+grn)=200;   // green
  *(ProcDdb+u+red)=0;     // red
}
skipper:
return 0;
}

/* Function: GetBoundary
   Calling function: Imagproc()
   Input: Array of numbers
   Output: Line fitting for the numbers
*/
void GetBoundary(linpos lor)
{
  float sqxl,sxl,pxyl,scl,nral,nrbl,dr;
  sqxl=0; sxl=0; pxyl=0; noxl=0; scl=0;
  for(int i=MyHeight/2-100;i<=MyHeight-100;i++)
  {
    if(fitter[lor][i]!=0 && fitter[lor][i]!=MyWidth)
    {
      noxl++;
      sqxl=sqxl+i*i;
      sxl+=i;
      pxyl=pxyl+i*fitter[lor][i];
    }
  }
  for(int i=MyHeight/2-100;i<=MyHeight-100;i++)
    scl=scl+fitter[lor][i];
  nral=scl*sqxl-sxl*pxyl;
  dr=noxl*sqxl-sxl*sxl;
  nrbl=noxl*pxyl-sxl*scl;
  al[lor]= (float)nral/dr;
  bl[lor]= (float)nrbl/dr;
  brick[lor][0]=-1;
  brick[lor][1]=0;
}
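For reference, GetBoundary() fits a straight line x = al + bl*h to the detected boundary column positions by ordinary least squares. Writing n = noxl, sxl = Σh, sqxl = Σh², scl = Σx and pxyl = Σhx for the accumulated sums, the quantities nral, nrbl and dr are the numerators and denominator of the standard closed-form estimates:

\[
bl \;=\; \frac{n\sum h_i x_i - \sum h_i \sum x_i}{\,n\sum h_i^{2} - \bigl(\sum h_i\bigr)^{2}\,},
\qquad
al \;=\; \frac{\sum x_i \sum h_i^{2} - \sum h_i \sum h_i x_i}{\,n\sum h_i^{2} - \bigl(\sum h_i\bigr)^{2}\,}.
\]

The fitted left and right boundary lines are then offset outward by 50 and 60 pixels, respectively, to form the wall estimates from which the path center is computed.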

APPENDIX C
CODE SEGMENT FOR LADAR BASED PATH NAVIGATION

/* Function: Ladar_Guidance()
   Calling function: OnLadarData()
   Input: Pointer to raw data location
   Output: Lateral error of vehicle in the path
*/
void Ladar_Guidance()
{
  ladarflag=true;
  //Find horizontal and vertical distance of environment
  // cosine gives the lateral distance
  // sine gives the frontal distance
  for(int i=1;i0+10;i--)
  {
    if(y_dist[i]
      tree_left_sighted=true;
      //Check if the inference is true
      //Check if 5 successive points agree with the inference
      for(int k=1;k<=5;k++)
      {
        if(y_dist[i-k]>grnd-50)
        {
          tree_left_sighted=false;   //disagree
          break;
        }
      }
      //If tree found, stop checking left
      if(tree_left_sighted==true)    //decided
      {
        left_wall_position=i;
        break;
      }
    }
  }
  //Scan for right wall
  grnd=y_dist[180]; tree_right_sighted=false; right_wall_position=180;
  for(int i=180;i<361-10;i++)
  {
    if(y_dist[i]grnd-20)
    {
      tree_right_sighted=false;      //disagree
      break;
    }
  }
  //If tree found, stop checking right

      if(tree_left_sighted==true)    //decided
      {
        right_wall_position=i;
        break;
      }
    }
  }
  //Distance from the two walls
  //note: right wall is actual left wall, since scan is reverse in ladar
  angle= (float)left_wall_position/2;
  distance_from_right_wall=fabs(SIC.MVA[left_wall_position]*cos(float(angle)*3.1428/180));
  angle= (float)right_wall_position/2;
  distance_from_left_wall=fabs(SIC.MVA[right_wall_position]*cos(float(angle)*3.1428/180));
  // Lateral error finding
  lateral_error = distance_from_left_wall-distance_from_right_wall;
  ppl=200-0.74*lateral_error;   //74.07=180pix/2.43m to convert to pixels for pure pursuit
  //error will be negative if right turn is to be made and vice versa
  //When there are no trees on both sides or on one side
  if(distance_from_left_wall>300 && distance_from_right_wall>300)        //no tree on either side
  { ppl=0; }
  else if(distance_from_left_wall<300 && distance_from_right_wall>300)   //no tree right side
  { lateral_error=distance_from_left_wall-225;  ppl=200-0.74*lateral_error; }
  else if(distance_from_left_wall>300 && distance_from_right_wall<300)   //no tree left side
  { lateral_error=225-distance_from_right_wall; ppl=200-0.74*lateral_error; }
  else if(distance_from_left_wall<80 && distance_from_right_wall<80)
    ErrorL=lateral_error/2;
}
}
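Each ladar return above is indexed at 0.5 degree per step, so beam i lies at an angle of i/2 degrees; its lateral (cross-row) component is the range times the cosine of that angle, and the difference between the left and right wall distances gives the lateral error, which is scaled by roughly 0.74 px/cm (180 pixels per 2.43 m, as the comment notes) into the image frame used by the pure pursuit steering. A minimal sketch of these two steps is given here for illustration only; the helper names are hypothetical, and the assumption that ranges are in centimeters follows the thresholds used in the listing.

#include <cmath>

// Lateral (cross-row) component of one ladar return, with 0.5 deg per beam index.
double beamLateralDistance(double range_cm, int beam_index)
{
    const double kPi = 3.14159265358979;
    const double deg = beam_index / 2.0;                       // 0.5 degree angular resolution
    return std::fabs(range_cm * std::cos(deg * kPi / 180.0));  // distance to the row, perpendicular to travel
}

// Signed lateral error from the two wall distances, mapped into the 400-pixel
// image frame used by the pure pursuit code (200 = path center).
double lateralErrorToPixels(double d_left_cm, double d_right_cm)
{
    const double error_cm = d_left_cm - d_right_cm;            // offset of the vehicle from the path center
    return 200.0 - 0.74 * error_cm;                            // 0.74 px/cm = 180 px per 243 cm
}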

APPENDIX D
CODE SEGMENT FOR SENSOR FUSION BASED ALGORITHMS

/* Code segment for sensor fusion
   Function: Fusion()
   Calling function: OnPaint(), if both vision and ladar are operational
   Input: Measurement values
   Output: Lateral error of vehicle in pixel coordinates, ErrorF
   Variables:
     ErrorL: Error from ladar
     ErrorI: Error from vision
     Theta_imu: Change in heading
     Yaw: Heading angle measurement
     m_Z: Measurement vector Z
     m_R: Measurement noise covariance matrix R
     m_A: State transition matrix A
     T: Time
     m_Q: Process noise covariance matrix
     m_X: State vector
     m_H: Observation matrix
     m_temp: temporary matrix for storing data
     m_K: Gain
     m_P: State estimate error covariance matrix
*/
void Fusion()
{
  //Initialization of heading angle
  if(ErrorL==0 && ErrorI==0 && theta_imu_init==0)
  { theta_imu_init=0; }
  if(theta_imu_init==0)
  {
    theta_imu_init=yaw;   //theta_init = initial yaw angle = initial vehicle angle
  }
  //change in heading
  theta_imu=yaw-theta_imu_init;   //actual yaw w.r.t initial angle
  //Account for negative heading angles
  if(theta_imu<0) {theta_imu=90+theta_imu;}
  else            {theta_imu=90-theta_imu;}

  //Z array measurement vector inputs
  m_Z.m_Value[0][0]=ErrorI;
  m_Z.m_Value[1][0]=ErrorL;
  m_Z.m_Value[2][0]=atan2(float(heading2-heading1),60);
  m_Z.m_Value[3][0]=theta_imu*pi/180;
  m_Z.m_Value[4][0]=speed;
  if(m_Z.m_Value[1][0]==0)
  { m_R.m_Value[0][0]=0.05; }
  //Reliability factor of sensor from fuzzy logic implementation
  m_R.m_Value[0][0] = f_vision;
  m_R.m_Value[1][1] = f_ladar;
  if(abs(m_Z.m_Value[1][0])>50 && abs(m_Z.m_Value[0][0])<50)
  {
    m_R.m_Value[1][1]=55;     //high ladar measurement noise
  }
  else                        //otherwise
  {
    m_R.m_Value[1][1]=0.15;   //low ladar measurement noise
  }
  m_A.m_Value[0][2]= T*sin(m_Z.m_Value[3][0]);
  // Time update equations
  //Xe=A*X;
  m_Xe=m_A*m_X;
  //Pe=A*P*A.'+Q;
  m_temp=m_A;
  m_At.MatTranspose(m_A);
  m_temp=m_A*m_P;
  m_temp=m_temp*m_At;
  m_Pe=m_temp+m_Q;
  //Measurement update equations
  //K=Pe*(H.')*pinv(H*Pe*(H.')+R);
  m_temp1=m_H*m_Pe;
  // H transpose
  for(int r=0;r<3;r++)

    for(int c=0;c<5;c++)
      m_Ht.m_Value[r][c]=m_H.m_Value[c][r];
  //m_temp2=m_temp1*m_Ht;
  //Asymmetric matrix multiplication
  for(int r=0;r<5;r++)
    for(int c=0;c<5;c++)
    {
      m_temp2.m_Value[r][c]=0;
      for(int i=0;i<3;i++)
        m_temp2.m_Value[r][c]+=m_temp1.m_Value[r][i]*m_Ht.m_Value[i][c];
    }
  m_temp2=m_temp2+m_R;
  m_temp2=m_temp2^-1;
  //m_temp3=m_Pe*m_Ht;
  for(int r=0;r<3;r++)
    for(int c=0;c<5;c++)
    {
      m_temp3.m_Value[r][c]=0;
      for(int i=0;i<3;i++)
        m_temp3.m_Value[r][c]+=m_Pe.m_Value[r][i]*m_Ht.m_Value[i][c];
    }
  m_K=m_temp3*m_temp2;
  //X=Xe+K*(Z-H*Xe);
  m_temp4=m_H*m_Xe;
  m_temp4=m_Z-m_temp4;
  m_temp5=m_K*m_temp4;
  m_X=m_Xe+m_temp5;
  //P=(I-K*H)*Pe;
  m_temp=m_K*m_H;
  m_temp=m_I-m_temp;
  m_P=m_temp*m_Pe;
  ErrorF=-m_X.m_Value[0][0];   // New fusion based error
  // Innovation vector
  // m_Z = m_Z - m_H*m_X
  m_temp6 = m_H*m_X;
  m_Z = m_Z - m_temp6;
  //Update process noise
  m_Q[0][0] = f_x;
  m_Q[2][2] = f_thetaR;
}
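For reference, the time- and measurement-update statements above follow the standard discrete Kalman filter equations; in the notation of the header comment (state vector X, estimate error covariance P, transition matrix A, observation matrix H, process and measurement noise covariances Q and R, and measurement vector Z):

\[
\hat{X}^{-}_{k} = A\,\hat{X}_{k-1},
\qquad
P^{-}_{k} = A\,P_{k-1}A^{\mathsf T} + Q,
\]
\[
K_{k} = P^{-}_{k} H^{\mathsf T}\bigl(H P^{-}_{k} H^{\mathsf T} + R\bigr)^{-1},
\qquad
\hat{X}_{k} = \hat{X}^{-}_{k} + K_{k}\bigl(Z_{k} - H\hat{X}^{-}_{k}\bigr),
\qquad
P_{k} = \bigl(I - K_{k} H\bigr)P^{-}_{k}.
\]

The fused lateral error ErrorF is then taken, with a sign change, from the first component of the updated state, while the vision and ladar measurement noise entries are adjusted each cycle by the fuzzy logic reliability factors f_vision and f_ladar.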

APPENDIX E
CODE SEGMENT FOR PURE PURSUIT STEERING

/* Code segment for Pure Pursuit steering
   Input: Lateral error of vehicle in pixels
   Output: Steering angle to the CAN controller
*/
// Right turn command
// sthi: Lateral error in pixel coordinates
if(sthi>200 && sthi!=402 && sthi!=404)
{
  // Amount of turn
  x_pix=sthi-200;
  //Pixel to distance coordinates in m
  x=0.0135*x_pix;
  // Compute radius of curvature
  r=(x/2)+(0.186/x);
  // Compute steering angle
  steer=(int)(-3.27*r+44.38);
  if(steer<=0) { steer=0; }
  steer=-steer;
}
// Left turn command
else if(sthi!=402 && sthi!=404)
{
  x_pix=200-sthi;
  x=0.0135*x_pix;              //0.0135=2.43/180
  r=(x/2)+(0.186/x);
  steer=(int)(-5.8*r+52);
  if(steer<=0) { steer=0; }
}
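For context, the radius expression above matches the usual pure pursuit geometry: if the goal point lies a forward distance l ahead of the vehicle and a lateral offset x to the side, the circular arc through the vehicle and the goal point has radius

\[
r \;=\; \frac{x^{2} + \ell^{2}}{2x} \;=\; \frac{x}{2} + \frac{\ell^{2}}{2x},
\]

so the constant 0.186 in r=(x/2)+(0.186/x) would correspond to l^2/2, i.e. a look-ahead of roughly 0.61 m. This reading is an inference from the code rather than something stated in the listing, and the linear maps from r to a steering command (-3.27r+44.38 for right turns, -5.8r+52 for left turns) are vehicle-specific calibrations.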

APPENDIX F
CODE SEGMENT FOR HEADLAND TURNING

/* Code segment for headland turning
   Functions: treedladarprocess(), HeadlandU(), Headland3pt()
   Output: Turning information in pixel error
*/
// Sweeping ladar to determine end of alleyway
// Input: ladar data
// Output: Distance to headland
void treedladarprocess()
{
  //conversion from 3d polar coordinates to 3d cartesian coordinates
  // (a standalone sketch of this conversion is given at the end of this appendix)
  int k=1;
  for(int i=0;i200) {las_x[k]=0;las_y[k]=0;las_z[k]=0;}
  else if(las_y[k]>3000) {las_x[k]=0;las_y[k]=0;las_z[k]=0;}
  else if(las_z[k]<-150) {las_x[k]=0;las_y[k]=0;las_z[k]=0;}
  else if(las_x[k]>500)  {las_x[k]=0;las_y[k]=0;las_z[k]=0;}
  else if(las_x[k]<-500) {las_x[k]=0;las_y[k]=0;las_z[k]=0;}

  k=k+1;
  }
  //3d cartesian coordinates to 2d cartesian grid
  int a,b;
  int (*location)[3005];
  location=new int[1005][3005];
  for(int i=1;i<=1000;i++)
    for(int j=1;j<=3000;j++)
      location[i][j]=0;
  for(int i=1;i-499 && las_y[i]>=1)
  {
    a=las_x[i]+500;
    b=las_y[i];
    location[a][b]=location[a][b]+1;
  }
  }
  //finding location of trees
  int m=0; int count;
  int* meanlocationx= new int[3000];
  int* meanlocationy= new int[3000];
  for(int i=1;i<=2950;i+=20)
    for(int j=1;j<=950;j+=20)
    {
      count=0;
      for(int k=1;k<=50;k++)
        for(int l=1;l<=50;l++)
          if(location[j+1][i+k]>0) {count=count+location[j+l][i+k];}
      if(count>5)
      {
        m=m+1;
        meanlocationx[m]=j;
        meanlocationy[m]=i;
      }
    }
  //finding path for vehicle
  int i=1; k=0;
  int leftmax;
  int rightmax;
  for(int j=250;j<=3000;j+=250)

  {
    if(i>1)
    {
      if((meanlocationy[i]-meanlocationy[i-1])>=250)
      {
        turn=meanlocationy[i-1]+51;
        break;
      }
    }
    leftmax=500; rightmax=500;
    while(meanlocationy[i]+50<=j)
    {
      if(meanlocationx[i]+50<500)   //left side tree
      {
      if(leftmax<500 && leftmax500 && meanlocationx[i]500)   //right side
      { rightmax=meanlocationx[i]; }
      else if(rightmax==500 && meanlocationx[i]>500)
      { rightmax=meanlocationx[i]; }
      if(turn==0 && i0)
      { break;}
      if(leftmax==0 && rightmax==1000)
      {
        k=k+1;
        if(j==250) { turn=j-245;}
      }
      else if(i==m)
      {

        k=k+1;
        turn=meanlocationy[m]+55;
        break;
      }
    }
  //turn
  if(turn==0) {turn=1;}
  //delete large arrays
  delete location;
  delete meanlocationx;
  delete meanlocationy;
}

// U-turn at headland
// Input: yaw angle
// Output: steering angle in pixel coordinates
void HeadlandU()
{
  //headland algorithm is taking over from vision
  if(MchnVsnAct) {MchnVsnAct=false;}
  if(headlandangle==400)
  {
    OutputDebugString("headland\n");
    headlandangle=yaw;
    correction1=0; correction2=0;
    flag1=false; flag2=false;
    delayflag=false;
    //correcting headlandangle for anomalies
    //due to pyr sensor having range 0 to 360
    //causing addition and subtraction problems
    //left turn
    if(headlandmode1==1 && headlandangle>=335)
    { correction1 = (headlandangle+25)-360; }
    else if(headlandmode1==1 && headlandangle<200)
    { correction2 = 360+(headlandangle-200); }
    //right turns
    else if(headlandmode1==2 && headlandangle<15)
    { correction1 = 360-(15-headlandangle); }

else if(headlandmode1==2 && headlandangle>=200) //200+160=360 { correction2 = (200+headlandangle)-360;} } if(headlandmode1==1)//left turn { estr6h[0]='k'; // start of transmission data Gator.Write((BYTE *)estr6h,1); //right turn---------------------------------------------------//turn right till headlandangle>headlandangle+30 if(correction1!=0 && flag1==false) //extremes { OutputDebugString("cor right\n"); if(yaw=340) { if(delayflag==true) {Sleep(500);delayflag=false;} estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } } else if(yaw<(headlandangle+20) && flag1==false) //regular { if(delayflag==true) {Sleep(500);delayflag=false;} OutputDebugString("reg right\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } //left turn---------------------------------------------//full turn left until headlandangle-180 if(correction2!=0) //extremes { if(yaw>(headlandangle+20) || yaw>correction2-10) //>correction2 { OutputDebugString("cor left\n"); flag1=true; estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; 295

Gator.Write((BYTE *)estr7h,1); } else //u turn completed { //get out to alleywaymode OutputDebugString("cor exit"); pathend=false; headlandflag=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } else //regular { //OutputDebugString("regular left\n"); if(yaw>(headlandangle+20) && flag1==false) { OutputDebugString("1\n"); flag1=true; } else if(flag1==true && yaw>(headlandangle-150)) // { OutputDebugString("reg left\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } else if(yaw<(headlandangle-150)) //<(headlandangle-200) //u turn completed { //get out to alleywaymode OutputDebugString("reg exit"); headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); flag1=false; } } estr6h[0]='m'; //end of transmission data Gator.Write((BYTE *)estr6h,1); 296

} else if(headlandmode1==2)//right turn { sprintf(textv,"head:%4.2f\n",headlandangle); OutputDebugString(textv); estr6h[0]='k'; // start of transmission data Gator.Write((BYTE *)estr6h,1); //left turn---------------------------------------------------//turn left till headlandanglecorrection1 || yaw<15) { if(delayflag==true) { Sleep(500);delayflag=false;} estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } } else if(yaw>(headlandangle-15) && flag1==false) //regular { if(delayflag==true) {Sleep(500);delayflag=false;} OutputDebugString("reg left\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } //right turn---------------------------------------------//full turn right until headlandangle+200 if(correction2!=0) //extremes { if(yaw<(headlandangle-15) && flag1==false) { flag1=true;} else if(flag1==true && (yaw>=180 || yaw<(correction2-10))) { OutputDebugString("cor rit\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; 297

Gator.Write((BYTE *)estr7h,1); } else if(flag1==true)//u turn completed { //get out to alleywaymode OutputDebugString("cor exit"); headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } else //regular { if(yaw<(headlandangle-15) && flag1==false) { flag1=true; } else if(flag1==true && yaw<(headlandangle+165)) { estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } else if(yaw>(headlandangle+165)) //(headlandangle+180) //u turn completed { //get out to alleywaymode headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); flag1=false; } } estr6h[0]='m'; //end of transmission data Gator.Write((BYTE *)estr6h,1); } } void Headland3pt() 298

{ //headland algo is taking over if(MchnVsnAct) {MchnVsnAct=false;} if(flag1==true && flag2==true && delayflag==true) { Sleep(4000); delayflag=false;} if(headlandangle==400) { headlandangle=yaw; correction1=0; correction2=0; flag1=false; flag2=false; delayflag=true; delayflag1=false; //correcting headlandangle for anomalies //due to pyr sensor having range 0 to 360 //causing addition and subtraction problems //left turn if(headlandmode1==1 && headlandangle<=90) //left { correction1 = 360-(90-headlandangle);} //right turns else if(headlandmode1==2 && headlandangle>=270)//right { correction1 = (headlandangle+90)-360;} if(headlandmode1==1 && headlandangle<=200) //left { correction2 = 360+(headlandangle-200);} else if(headlandmode1==2 && headlandangle>=160)//right { correction2 = (headlandangle+200)-360;} } if(headlandmode1==1)//left turn { estr6h[0]='k'; // start of transmission data Gator.Write((BYTE *)estr6h,1); //left turn---------------------------------------------------//turn left till headlandangle(correction1+10) || yaw<90) { 299

estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } else {flag1=true;} } else if(correction1==0 && yaw>(headlandangle-90) && flag1==false) //regular { if(delayflag1==false) {Sleep(2000);delayflag1=true;} estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } else { flag1=true;} //90deg turn completed //left turn---------------------------------------------//full turn left until headlandangle-180 if(flag1==true) { if(flag2==false) //reverse { estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=203; Gator.Write((BYTE *)estr7h,1); {flag2=true;} } else //reverse over now turn { if(correction2!=0) //extreme { if(yaw>(correction2+50) || yaw<110) //200-90=110 { estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } else //completed so exit to alleyway mode { 300

headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } else //regular { if(yaw>(headlandangle-150)) { estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=202; Gator.Write((BYTE *)estr7h,1); } else { headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } } //rev over } estr6h[0]='m'; //end of transmission data Gator.Write((BYTE *)estr6h,1); } else if(headlandmode1==2)//right turn { estr6h[0]='k'; // start of transmission data Gator.Write((BYTE *)estr6h,1); //rit turn---------------------------------------------------//turn rit till yaw>yaw+90 if(correction1!=0 && flag1==false) //extremes { if(delayflag1==false) {Sleep(3000);delayflag1=true;} OutputDebugString("cor rit 90\n"); if(yaw<(correction1-20) || yaw>270) 301

{ estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } } else if(yaw<(headlandangle+70) && flag1==false) //regular { if(delayflag1==false) {Sleep(3000);delayflag1=true;} OutputDebugString("reg rit 90\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } else { flag1=true;} //90deg turn completed //left turn---------------------------------------------//full turn left until headlandangle-180 if(flag1==true) //extremes { if(flag2==false) //reverse { OutputDebugString("rev"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=203; Gator.Write((BYTE *)estr7h,1); {flag2=true;} } else //reverse over now turn { if(correction2!=0) //extreme { if(yaw160) //160+200=360 { OutputDebugString("cor rit 180\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } else //completed so exit to alleyway mode 302

{ OutputDebugString("cor exit"); headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } else //regular { if(yaw<(headlandangle+160)) { OutputDebugString("reg rit 180\n"); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); estr7h[0]=201; Gator.Write((BYTE *)estr7h,1); } else { OutputDebugString("reg exit"); headlandflag=false; pathend=false; estr7h[0]=100; Gator.Write((BYTE *)estr7h,1); estr7h[0]=0; Gator.Write((BYTE *)estr7h,1); } } } //rev over } estr6h[0]='m'; //end of transmission data Gator.Write((BYTE *)estr6h,1); } } 303
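The beginning of treedladarprocess() above converts each return of the tilting (swept) ladar from polar form to 3-D Cartesian coordinates (las_x, las_y, las_z), rejects points outside the alleyway volume, and accumulates the rest into a 1000 x 3000 occupancy grid from which tree clusters and the end of the alleyway are located. A standalone sketch of one such polar-to-Cartesian conversion is given below for orientation only; the function and parameter names are hypothetical, and the axis convention (tilt about the lateral axis) is an assumption rather than a reproduction of the listing.

#include <cmath>

// Convert one return of a tilting 2-D ladar into 3-D Cartesian coordinates.
// range      : measured distance along the beam (output has the same units)
// beam_index : index across the scan plane, 0.5 degree per step over 0-180 degrees
// tilt_deg   : current tilt of the scan plane above the horizontal
// Outputs: x lateral (across the row), y forward along the row, z height.
void sweepToCartesian(double range, int beam_index, double tilt_deg,
                      double& x, double& y, double& z)
{
    const double kPi  = 3.14159265358979;
    const double beam = (beam_index / 2.0) * kPi / 180.0;   // in-plane beam angle
    const double tilt = tilt_deg * kPi / 180.0;              // sweep (tilt) angle
    x = range * std::cos(beam);                              // lateral offset, unchanged by the tilt
    const double v = range * std::sin(beam);                 // forward distance within the scan plane
    y = v * std::cos(tilt);                                  // forward (ground-plane) distance
    z = v * std::sin(tilt);                                  // height relative to the sensor
}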

APPENDIX G
CODE SEGMENT FOR OPEN FIELD NAVIGATION AND OBSTACLE DETECTION

//Code for GPS based open field navigation
// Read the way point coordinates
void ReadGpsCoord()
{
  GPSdistance=2;
  printflag=false;
  Steer=200;
  //open file and read GPS coordinates for navigation
  fpGPS = fopen("gps.txt", "r+");
  fpGPSdebug = fopen("gpsdebug.txt","w+");
  if(fpGPS==NULL)
  {
    OutputDebugString("Error: can't open file.\n");
  }
  else
  {
    OutputDebugString("gps coordinates file opened\n");
    i = 0;
    while(!feof(fpGPS))
    {
      i++;
      /* loop through and store the coordinates into the array */
      fscanf(fpGPS,"%lf",&numbers[i-1]);
      n=(i-1)/2;
    }
  }
  fclose(fpGPS);
  OutputDebugString("gps coordinates file closed\n");
  sprintf((char*) gpstext,"%d,%d\n",i-1,n);
  OutputDebugString(gpstext);
  i=1;
}

void GPSnavigate()
{
  if(i<=2*n)            // Loop until all way points are reached
  {
    if(GPSdistance<1)   //reached current coordinate

    {
      i=i+2;
      OutputDebugString("next coord\n");
    }                   //go to next coordinate
    if(i>2*n)           //finished all
    {
      sprintf((char*) gpstext,"%d,%d\n",i,n);
      OutputDebugString(gpstext);
      OutputDebugString("GPS navigation completed\n");
      estr4[0]='k';     // start of transmission data
      Gator.Write((BYTE *)estr4,1);
      //Center the steering before quitting
      estr5[0]=100;     //hi
      Gator.Write((BYTE *)estr5,1);
      estr5[0]=0;       //lo
      Gator.Write((BYTE *)estr5,1);
      //end of transmission data
      estr4[0]='m';
      Gator.Write((BYTE *)estr4,1);
      //quit navigation
      estr4[0]='q';
      Gator.Write((BYTE *)estr4,1);
    }
    else
    {
      // Great-circle (haversine) distance and initial bearing to the current way point
      // (see the note at the end of this appendix)
      dlat=numbers[i-1]-lati;
      dlon=numbers[i]-longi;
      a=sin((dlat/2)*pi/180)*sin((dlat/2)*pi/180)
        +cos(lati*pi/180)*cos(numbers[i-1]*pi/180)*sin((dlon/2)*pi/180)*sin((dlon/2)*pi/180);
      c=2*atan2(sqrt(a),sqrt(1-a));
      GPSdistance=6378140*c;
      GPSbearing=fmod(atan2(sin(dlon*pi/180)*cos(numbers[i-1]*pi/180),
                            cos(lati*pi/180)*sin(numbers[i-1]*pi/180)
                            -sin(lati*pi/180)*cos(numbers[i-1]*pi/180)*cos(dlon*pi/180)), 2*pi);
      // Shift bearing from (0 to pi & -pi to 0) to 0 to 2pi
      GPSbearing=pi-GPSbearing;        // this shift helps in matching the bearing range to the yaw range values from the IMU
      GPSbearingD=GPSbearing*180/pi;   //convert gps bearing to degrees
      //start steering

      //0.33 = 30/90 for steering range = steer angle of 0-30 for a yaw range of 0-90
      //Determine steering from the bearing and yaw
      if(yaw>GPSbearingD)
      {
        if(yaw-GPSbearingD>180)
        { steerangle=(GPSbearingD+360-yaw)*0.333; }   //6.29=2pi
        else
        { steerangle=(GPSbearingD-yaw)*0.333; }
      }
      else if(yaw<GPSbearingD)      // symmetric case; the comparisons on this line and the next
      {                             // were lost in conversion and reconstructed from context
        if(GPSbearingD-yaw>180)
        { steerangle=-(yaw+360-GPSbearingD)*0.333; }
        else
        { steerangle=(GPSbearingD-yaw)*0.333; }
      }
      if(steerangle>30)        { steerangle=30; }
      else if(steerangle<-30)  { steerangle=-30; }
      steer=(steerangle+30)*6.66;   //400/60=6.66 convert 0to60 to 0to400
      //steering is in opposite direction?
      //steer=400-steer;
      //steer=200+steer;
      //Median filter data to reduce noise, 20 data points
      for(int i=20;i>=1;i--)
        arrg[i]=arrg[i-1];          // move data in the array after removing the oldest

      arrg[0]=steer;                // get the new data into the array
      for(int i=0;i<=20;i++)
        arrg1[i]=arrg[i];           //make a copy of the array
      for(int j=0;j<=21;j++)        // bubble-sort passes over more than half the data; the median is then in place
      {
        for(int i=0;i<20-j;i++)
        {
          if(arrg1[i]>arrg1[i+1])   //sort data
          {
            tempg=arrg1[i+1];
            arrg1[i+1]=arrg1[i];
            arrg1[i]=tempg;
          }
        }
      }
      steer=arrg1[10];
      steer=200;
      // Send steering data to PC104 computer
      estr4[0]='k';                 // start of transmission data
      Gator.Write((BYTE *)estr4,1);
      estr5[0]=steer/2;             //hi
      Gator.Write((BYTE *)estr5,1);
      estr5[0]=0;                   //lo
      Gator.Write((BYTE *)estr5,1);
      estr4[0]='m';                 //end of transmission data
      Gator.Write((BYTE *)estr4,1);
    }
  }
}

void LadarObstacleDetection()
{
  obstacle_sighted=false;
  obstacleflag=false;
  for(int i=0;i
  }
  //initialize ground distance to ladar
  if(obstgrnd==0) {obstgrnd=y_dist[180];}
  for(int i=220;i>140;i--)        // scan
  {
    if(y_dist[i]obstgrnd-50)
    {
      obstacle_sighted=false;     //disagree
      break;
    }
    }
    //If obstacle found, stop checking left
    if(obstacle_sighted==true)    //decided
    {
      obstacleflag=true;
      estr4[0]='k';               // start of transmission data
      Gator.Write((BYTE *)estr4,1);
      estr4[0]=0;
      Gator.Write((BYTE *)estr4,1);
      estr4[0]=204;
      Gator.Write((BYTE *)estr4,1);
      estr4[0]='m';               //end of transmission data
      Gator.Write((BYTE *)estr4,1);
      break;
    }
  }
}
}
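For reference, GPSdistance and GPSbearing in GPSnavigate() are the standard haversine great-circle distance and initial bearing from the current position (latitude and longitude of the vehicle) to the next way point, scaled by an Earth radius of R = 6378140 m as used in the listing:

\[
a \;=\; \sin^{2}\!\frac{\Delta\varphi}{2} \;+\; \cos\varphi_{1}\,\cos\varphi_{2}\,\sin^{2}\!\frac{\Delta\lambda}{2},
\qquad
d \;=\; 2R\,\operatorname{atan2}\!\bigl(\sqrt{a},\,\sqrt{1-a}\bigr),
\]
\[
\theta \;=\; \operatorname{atan2}\!\bigl(\sin\Delta\lambda\,\cos\varphi_{2},\;
\cos\varphi_{1}\sin\varphi_{2} - \sin\varphi_{1}\cos\varphi_{2}\cos\Delta\lambda\bigr),
\]

where the subscripts 1 and 2 denote the current position and the way point. The listing then remaps the bearing into the 0 to 360 degree convention of the IMU yaw and converts the yaw-bearing difference into a steering command limited to plus or minus 30 degrees.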


BIOGRAPHICAL SKETCH

Vijay Subramanian completed his high school education at Kendriya Vidyalaya in 1998. He received his Bachelor of Engineering degree in electrical and electronics engineering from Annamalai University in 2002. He received a Master of Science degree in electrical and computer engineering in 2004 and a Master of Science degree in agricultural and biological engineering, specializing in robotics, in 2005, both from the University of Florida.