
Citrus Yield Mapping System Using Machine Vision



CITRUS YIELD MAPPING SYSTEM USING MACHINE VISION

By

PALANIAPPAN ANNAMALAI

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2004


Copyright 2004 by Palaniappan Annamalai


I would like to dedicate this work to my parents for whom my education meant everything.


ACKNOWLEDGMENTS

Earning my master's degree has been a lengthy process, in which many people have offered invaluable assistance. I would like to thank my advisor, Dr. Won Suk Daniel Lee, who instructed me and always supported me. I am privileged to have collaborated with him on this project. His advice and ideas have been the pillars that have held this work to a higher standard. My appreciation goes to committee members Dr. Thomas F. Burks and Dr. James J. Ferguson for their encouragement and guidance in completing this work. I also wish to thank Dr. Kelly T. Morgan and Mr. Jae Hyuk Choi for their assistance during field operations. Thanks go to my parents and family members for their continuous prayers, concern, and support. Finally, I would like to thank all of those not explicitly mentioned here who have aided my intellectual and social growth throughout my academic career.


TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION

2 LITERATURE REVIEW
   2.1 Citrus Production in Florida
   2.2 Citrus Harvesting
   2.3 Precision Agriculture and Yield Mapping
   2.4 Citrus Yield Mapping
   2.5 Image Segmentation

3 MATERIALS AND METHODS
   3.1 Overview of Experiment
   3.2 System Hardware
      3.2.1 Color Vision Hardware
      3.2.2 DGPS Receiver and Encoder
   3.3 Image Acquisition
   3.4 Image Analysis using HSI Color Model
   3.5 Development of the Fruit Counting Algorithm
   3.6 Image Processing Time
   3.7 Experimental Procedure
   3.7 Prediction of Number of Fruits/Plot
   3.8 Yield Prediction Model
   3.9 Performance of the Fruit Counting Algorithm
   3.10 Performance of the Yield Prediction Model
   3.11 Yield Variability based on Rootstock Variety


4 RESULTS AND DISCUSSION
   4.1 Binarization
   4.2 Preprocessing
   4.3 Execution Time for the Algorithm
   4.4 Encoder Calibration
   4.5 Prediction of Number of Fruits/Plot
   4.6 Yield Prediction Model
   4.7 Yield Variability based on Rootstock Variety

5 SUMMARY AND CONCLUSIONS
   5.1 Summary and Conclusions
   5.2 Future Work

LIST OF REFERENCES

BIOGRAPHICAL SKETCH


LIST OF TABLES

4-1. Number of images used in calibration and validation data sets
4-2. Pixel distribution for citrus, leaf, and background classes for 25 images in HSI color plane
4-3. Threshold for identification of fruit clusters
4-4. Performance comparison for 329 validation images
4-5. Execution time for each image processing step
4-6. Encoder calibration. The number of pulses is shown for different trials
4-7. Actual harvested yield data
4-8. Number of plots in calibration and validation data sets to develop prediction models
4-9. Performance comparison of the yield prediction model for 22 plots
4-10. Yield category for 22 plots
4-11. Fruits/plot for 48 plots grouped using means test


LIST OF FIGURES

3-1. Mapping of trees in the citrus grove. Two trees designated for hand harvesting are shown inside the red rectangular box for each plot
3-2. Experimental setup
3-3. Components of the imaging board
3-4. Encoder attached to the wheel of the metal frame
3-5. Image processing steps of the fruit counting algorithm
3-6. Schematic diagram of the overall citrus yield mapping system
3-7. Algorithm for field-testing of the machine vision system
4-1. Histogram of 25 calibration images in hue plane (* indicates 3898 pixels of a pixel value of 0 in citrus class, + indicates 5606 pixels of a pixel value of 0 in background class)
4-2. Histogram of 25 calibration images in saturation plane (* indicates 3872 pixels of a pixel value of 0 in citrus class, + indicates 5604 pixels of a pixel value of 0 in background class, # indicates 2736 pixels of a pixel value of 254 in citrus class)
4-3. Histogram of 25 calibration images in luminance plane (* indicates 3356 pixels of a pixel value of 252 in citrus class)
4-4. Histogram of 25 calibration images in red plane
4-5. Histogram of 25 calibration images in green plane (* indicates 3330 pixels of a pixel value of 252 in citrus class)
4-6. Histogram of 25 calibration images in blue plane (* indicates 3319 pixels of a pixel value of 252 in citrus class)
4-7. Pixel distribution in 25 calibration images in hue-saturation plane
4-8. Pixel distribution in 25 calibration images in hue-luminance plane
4-9. Pixel distribution in 25 calibration images in red-green plane


4-10. Image processing steps of a typical citrus grove scene
4-11. Regression analysis between the number of fruits counted by human observation and the number of fruits counted by the fruit counting algorithm
4-12. Regression analysis for encoder calibration
4-13. Regression analysis between NA and NP-fruits
4-14. Regression analysis between NA and NP-pixels
4-15. Regression analysis between NA and NP-fruits-pixels
4-16. Regression analysis between yield prediction model and the actual harvested yield
4-17. Performance of yield prediction model (fruits/m²)
4-18. Yield mapping for citrus fruits (fruits/m²)
4-19. Yield based on number of citrus fruits in an image (fruits/m²)
4-20. Yield variability based on rootstock variety


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

CITRUS YIELD MAPPING SYSTEM USING MACHINE VISION

By

Palaniappan Annamalai

August 2004

Chair: Won Suk Daniel Lee
Major Department: Agricultural and Biological Engineering

A machine vision system utilizing color vision was investigated as a means to identify citrus fruits and to estimate yield information of a citrus grove in real time. The yield mapping system was calibrated and tested in a commercial citrus grove, and the yield estimated by the system was compared with the yield determined by hand harvesting. This study focused on three major tasks:

1. Development of a hardware system consisting of a color CCD camera, an imaging board, an encoder, and a DGPS receiver;

2. Development of an algorithm to take non-overlapping images of the citrus grove and an image-processing algorithm to identify and count the number of citrus fruits in an image; and

3. Development of a yield estimation model to predict the number of fruits per tree based on images of the tree.

Images were acquired for 98 citrus trees in a grove located near Orlando, Florida. The trees were evenly distributed over 48 plots. Images were taken in stationary mode using a machine vision system consisting of a color analog camera, a DGPS receiver, and an encoder. Non-overlapping images of the citrus trees were taken by determining the


field of view of the camera and using an encoder to measure the traveled distance to locate the next position for acquiring an image. The encoder was calibrated in the grove before field-testing of the system. Images of the citrus trees were analyzed, and histograms and pixel distributions of the various classes (citrus fruit, leaf, and background) were developed in RGB and HSI color space. The threshold for segmenting the images to recognize citrus fruits was estimated from the pixel distribution in the HSI color plane. A computer vision algorithm to enhance and extract information from the images was developed. Preprocessing steps for removing noise and properly identifying the number of citrus fruits were carried out using a threshold and a combination of erosion and dilation. The total time for processing an image was 119.5 ms, excluding image acquisition time. The image processing algorithm was tested on 329 validation images, and the R² value between the number of fruits counted by the fruit counting algorithm and the average number of fruits counted manually was 0.79.

Images belonging to the same plot were grouped together, and the numbers of fruits estimated by the fruit counting algorithm were summed to give a fruits/plot estimate. Leaving out outliers and incomplete data, the remaining 44 plots were divided into calibration and validation data sets, and a model was developed for citrus yield using the calibration data set. The R² value between the number of fruits/plot predicted by the yield prediction model and the number of fruits/plot determined by hand harvesting for the validation data set was 0.46. Although this study was conducted for citrus fruits, the concept could easily be applied with little modification to estimate yield of most fruits that differ in color from the foliage.
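The thesis does not reproduce its source code; purely as an illustration of the pipeline described above (binarization by a color threshold, erosion and dilation to remove noise, then counting the remaining blobs as fruit), here is a minimal NumPy sketch. The hue range (10-40) and the toy image contents are invented for the example, not taken from the thesis:

```python
import numpy as np

def hue_threshold(hue, lo, hi):
    # Binarization: pixels whose hue lies in the assumed fruit range become 1.
    return ((hue >= lo) & (hue <= hi)).astype(np.uint8)

def erode(b):
    # 3x3 erosion: a pixel survives only if its whole neighborhood is set.
    p = np.pad(b, 1)
    out = np.ones_like(b)
    for dy in range(3):
        for dx in range(3):
            out &= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def dilate(b):
    # 3x3 dilation: a pixel is set if any pixel in its neighborhood was set.
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    for dy in range(3):
        for dx in range(3):
            out |= p[dy:dy + b.shape[0], dx:dx + b.shape[1]]
    return out

def count_fruits(binary):
    # Count 4-connected blobs with an explicit-stack flood fill.
    b = binary.copy()
    n = 0
    for i in range(b.shape[0]):
        for j in range(b.shape[1]):
            if b[i, j]:
                n += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < b.shape[0] and 0 <= x < b.shape[1] and b[y, x]:
                        b[y, x] = 0
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return n

# Toy 8x10 "hue plane" with two fruit-colored patches.
hue = np.zeros((8, 10))
hue[1:4, 1:4] = 20
hue[4:7, 6:9] = 30
mask = dilate(erode(hue_threshold(hue, 10, 40)))
print(count_fruits(mask))  # 2
```

Erosion followed by dilation (morphological opening) removes isolated noise pixels while restoring the size of surviving regions, which is why the two patches are still counted as two fruits.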


CHAPTER 1
INTRODUCTION

Competitive farmers strive to increase crop yields while minimizing costs. With the advent of mechanization of agriculture and a trend towards larger equipment, farmers were able to cultivate very large areas, but many continued to treat their larger fields as a single management unit, thus ignoring variability found within a specific field. Precision farming, sometimes called site-specific farming, is an emerging technology that allows farmers to reduce costs through efficient and effective application of crop inputs matched to within-field variability in characteristics like soil fertility and weed populations.

One of the major agricultural products in Florida is citrus. Florida's citrus industry produced about 12.4 million metric tons of citrus in the 2000 season, accounting for 76 percent of all citrus produced in the U.S. (Florida Agricultural Statistics Service, 2001). Citrus fruits, including oranges, grapefruit, tangelos, tangerines, limes, and other specialty fruits, are the state's largest agricultural commodities. Currently, citrus groves are managed as blocks, and the variability found within this unit is not generally considered for grove management. Citrus trees with sufficient water and nutrients grow stronger, better tolerate pests and stresses, yield more consistently, and produce better quality fruit than trees that receive excessive or deficient irrigation or fertilization.

To a citrus grower who deals with thousands of trees in many blocks, site-specific management or precision agriculture provides the ability to apply technology and to manage inputs as closely as required within a given area. This management of field variability could improve fruit yield, quality, and income and limit negative impacts on


sensitive environments. Among precision technologies, yield mapping is the first step toward developing site-specific crop management. Yield data, together with other associated field characteristics, would help growers manage their groves better, evaluate the entire citrus grove graphically, and thus prepare them to make efficient decisions.

Currently, a commercial citrus yield mapping system named Goat (GeoFocus, LLC, Gainesville, FL) is the only available citrus yield mapping system. This system is attached to a goat truck used primarily in citrus harvesting operations. In this system, the goat truck operator is required to push a button to record the location of every tub, which may be forgotten and often becomes a major source of error. Yield information is available only after the fruits are harvested, and this system gathers yield data from multiple trees rather than from each individual tree.

The overall goal of this research is to develop a real-time yield mapping system using machine vision that provides the yield of a grove on-the-go when mounted on a truck and driven between the rows. The system will identify citrus fruits in images using color information in real time. The system will estimate citrus yield for a single tree, while citrus yields are currently determined for a whole block or grove. More specifically, the objectives of this research are to:

1. Develop a hardware system consisting of a color CCD camera, an imaging board, an encoder, and a DGPS receiver;

2. Develop an algorithm to take non-overlapping images of the citrus grove and an image-processing algorithm to identify and count the number of citrus fruits in an image; and

3. Develop a yield estimation model that will predict the number of fruits per tree based on images of the tree.

A main advantage of the proposed system is that it would provide single-tree yield and could estimate citrus yield before the actual harvesting schedule.
Estimated yield information could then be used for deciding various citrus management practices such as


the amount of irrigation, the application of herbicide to the plants, and finally for scheduling grove equipment and pickers.


CHAPTER 2
LITERATURE REVIEW

This chapter begins with a brief introduction to citrus production in Florida, precision agriculture in general, and the importance of yield mapping for site-specific crop management. A review of common citrus harvesting procedures was included to evaluate current harvesting trends in the citrus industry. Current citrus yield mapping systems were analyzed to identify the desired features for the design of a better yield mapping system. Finally, research involving image segmentation and various image processing techniques was reviewed to form a basis for a color vision system.

2.1 Citrus Production in Florida

Citrus trees are small evergreen trees that are grown in tropical and subtropical climates. As this perennial crop grows in regions with warmer climates (typically between 12.78 and 37.78 °C), citrus trees are usually cultivated at latitudes between 40° north and 40° south. Citrus cultivation requires periodic fertilization (3-4 times per year) and irrigation (every 7 to 28 days depending on season and soil type), as well as pruning (depending on tree planting density). Citrus trees usually start producing fruit 3-5 years after planting, although economic yields start from the fifth year and the trees may take 8-10 years to achieve full productivity. Fruits can be harvested 5-6 months after flowering, depending on the variety and the environment. Citrus trees are usually planted in long parallel rows in a grove, and they are usually managed as a block instead of as individual trees. A "block" is usually considered to be either a unit of land area separated by ditches or roads from adjacent planted area or a group of trees of the same


variety and original rootstock. There is no specific size for a block; blocks could range from 5 acres to 500 acres.

The orange is a favorite fruit among Americans. It has consistently ranked as the third most consumed fresh fruit behind bananas and apples, and as a juice it ranks number one (Pollack et al., 2003). According to the USDA National Agricultural Statistics Service, in 2002 citrus was grown on over 322,658 ha in Florida. Of that total area, 81.4 percent was committed to orange production, followed by grapefruit production (13.2 percent); specialty fruit (e.g., tangerines, tangelos, lemons, limes, etc.) made up the final 5.4 percent (Florida Agricultural Statistics Service, 2002).

Many researchers have published data about citrus producing regions in Florida. Hodges et al. (2001) state that:

Citrus has been produced commercially in Florida since the mid-1800s. It is produced across the southern two-thirds of the Florida peninsula, where there is a low probability of damaging winter freeze events, from Putnam County in the north to Miami-Dade County in the south. The four major citrus producing regions are the East Coast, Lower Interior, Upper Interior, and West Coast Districts. In 1957 citrus production was centered in the Upper Interior District, with 40 percent of total citrus production, followed by the Lower Interior (30%), West Coast (17%), and the East Coast (12%). By 1999, the geographical distribution had shifted towards the Lower Interior District (61%), followed by the East Coast (24%), West Coast (8%), and the Upper Interior (6%). The southward migration of citrus production was a response to a series of freezes in the north central region of the state in the 1980s. (Hodges et al., 2001, p. 2)

Citrus fruits are not climacteric and do not ripen further once they are detached from the tree. Hence, care should be taken to harvest the fruits at the right stage of maturity.
Citrus quality is measured using the following standards: Brix, sugar/acid ratio, and percent juice. Once harvested, the fruit has to be graded, sorted, and washed before being packed for the fresh market or a fruit processing plant.
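As a small worked example of one of these standards, the sugar/acid ratio is simply the degrees Brix (percent soluble solids) divided by the titratable acidity. The sample values below are invented for illustration, not taken from the thesis:

```python
def sugar_acid_ratio(brix: float, acid_pct: float) -> float:
    # Degrees Brix (% soluble solids) divided by titratable acidity (% citric acid).
    return brix / acid_pct

# Hypothetical juice sample: 10.5 Brix at 0.75% acid.
print(sugar_acid_ratio(10.5, 0.75))  # 14.0
```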


Forecasting citrus production for Florida is an annual task that helps in planning operations, marketing, and policy making, which are especially important for a crop harvested over several months and sold year round. The U.S. Department of Agriculture first made forecasts of Florida citrus production in 1918, based on surveyed opinions of crop observers and statisticians. The current system is based on objective data, including an early-season limb-count survey to establish actual fruit set, supplemented with monthly in-season measurements of fruit size and observations of fruit droppage. The forecast is based on estimates and projections from actual counts and measurements, avoiding observations based on opinion. The essential features used in the forecast are (1) the number of bearing trees, (2) the number of fruit per tree, (3) fruit size, and (4) fruit loss from droppage.

2.2 Citrus Harvesting

Citrus was harvested on 322,657 ha in Florida during 2002 (Florida Agricultural Statistics Service, 2002). About 95% of the harvested fruit was processed into juice, and the remainder was used in the fresh market. Citrus harvesting is a labor-intensive process involving a large number of workers, depending upon the size of the grove. Although extensive research has been conducted to automate citrus harvesting operations using robots and other mechanical harvesting techniques (Whitney et al., 1998; Whitney et al., 1999), approximately 99% of citrus crops in Florida are harvested manually.

Custom manual harvesting is carried out with crews of 30 to 50 persons working with ladders and picking bags. Fruits are gathered from trees and placed into containers (bins or tubs) that usually have a capacity of approximately 0.7 m³. The containers are moved from the grove and finally loaded onto trucks. Due to a decrease in the supply of


workers and an increase in labor cost, alternative methods are being assessed by growers to compete in the international citrus market. Mechanization of citrus fruit harvesting is expected to improve the efficiency of harvesting schedules and to minimize dependence on labor in the future.

Whitney and Sumner (1977) developed shakers for removing citrus fruits from the tree. Coppock and Donhaiser (1981) developed a conical scan air shaker for removing citrus fruits. The shaker, along with the application of an abscission chemical, removed fruits from trees at a rate of 170 trees/hr with an average removal efficiency of 97 percent. Brown (2002) described various aspects of eight mechanical harvesting systems for citrus. The main disadvantage of these methods is the damage caused to the fruit by bruising as it falls from the tree. Sometimes these bruises are too severe for fruits that are intended for processing.

Schertz and Brown (1968) reviewed the basic principles of citrus harvesting systems utilizing robots to pick fruits from trees. Harrell et al. (1990) developed a mobile robotic grove lab to study the use of robotic technology for picking oranges under natural grove conditions. The picking robot consisted of a simple three-degree-of-freedom manipulator actuated with servo-hydraulic drives. Fruit detection and positional information were accomplished with a color CCD video camera and an ultrasonic sensor.

2.3 Precision Agriculture and Yield Mapping

Reducing input costs, minimizing work time, increasing yield, and improving crop quality to boost profit margins are the basic goals of any agricultural firm competing in domestic and global markets. With high levels of mechanization in crop production, crops are managed in units such as blocks rather than as individual plants. Precision agriculture is a management philosophy that responds to spatial variability


found on agricultural landscapes. Steps in precision agriculture include determining yield variability in a field, determining its cause, deciding on possible solutions based on economic justification, implementing new techniques, and repeating the procedure in a cyclic approach. Precision agriculture techniques could be used to improve economic and environmental sustainability in crop production.

Global positioning systems (GPS), geographical information systems (GIS), remote sensing (RS), variable rate technology (VRT), yield mapping, and advances in sensor and information technology have enabled the farmer to visualize the entire field in a way that could help manage agricultural operations efficiently and improve overall productivity. With precision agriculture technologies, the farmer could effectively manage the crop throughout its life cycle, from preparing soil, sowing seeds, and applying fertilizers and pesticides to estimating yield during harvesting, based on each individual plant, thus reducing the waste of resources due to in-field variability.

Among precision agriculture technologies, yield mapping is the first step in implementing site-specific crop management on a specific field. A yield mapping system measures and records the amount of crop harvested at any point in the field along with the position of the harvesting system. The collected data can then be used to produce a yield map using mapping software. Yield maps are useful resources for identifying variability within a field. Variability in an agricultural field is due to man-made or natural sources. Natural variability may be due to seasonal changes in weather patterns or rainfall over several years. Examples of man-made variability include improper distribution of irrigation/drainage facilities in a field and excessive or deficient application of farm inputs.
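To make the idea concrete, a yield map is essentially a set of position-tagged harvest records binned into grid cells. The following sketch illustrates the concept only; the record fields, cell size, and coordinates are invented, not part of any system described in this thesis:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class YieldPoint:
    lat: float      # degrees
    lon: float      # degrees
    mass_kg: float  # crop harvested at this point

def grid_yield(points, cell_deg=0.0001):
    # Accumulate harvested mass into lat/lon grid cells (~11 m at this cell size).
    grid = defaultdict(float)
    for p in points:
        key = (round(p.lat / cell_deg), round(p.lon / cell_deg))
        grid[key] += p.mass_kg
    return dict(grid)

points = [
    YieldPoint(28.50001, -81.40001, 200.0),
    YieldPoint(28.50002, -81.40002, 150.0),  # falls in the same cell as the first
    YieldPoint(28.60000, -81.40001, 100.0),  # a different cell
]
yield_map = grid_yield(points)
print(len(yield_map))  # 2
```

Each cell's accumulated mass is what a mapping package would then color to visualize within-field variability.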


Numerous yield monitoring and yield mapping systems have been widely researched and commercialized for various crops over the last one and a half decades. Yield mapping during grain harvesting (Schueller and Bae, 1987; Searcy et al., 1989) has been extensively studied and adopted. Examples of yield mapping for other crops include cotton (Wilkerson et al., 1994; Roades et al., 2000), potatoes (Campbell et al., 1994), tomatoes (Pelletier and Upadhyaya, 1999), and silage (Lee et al., 2002). Being able to evaluate the entire farm graphically, in an encapsulated picture, with respect to yield and other associated field characteristics would tremendously help farmers know their fields more intimately and thus help them make important decisions efficiently.

2.4 Citrus Yield Mapping

The preliminary on-tree value of all citrus for the 2000-01 season in Florida was $760 million. In spite of the widespread economic importance of the citrus industry, currently the Goat system is the only commercial yield mapping system for citrus. Citrus yield monitoring systems have been under development for several years. The first yield monitor for citrus was developed by Whitney et al. (1998 and 1999) and Schueller et al. (1999). In the Goat yield mapping system, yield is measured by mapping the location of each tub as it is picked up by a truck. Citrus harvest is a busy operation, and therefore one of the prime goals of this system is to measure and map yield without interfering with any of the current harvesting procedures. One advantage of this system is that no change is needed in the harvesting practice, which involves many field workers who are often relatively untrained in managing sophisticated equipment.

A computer is used to coordinate all the operations of the Goat yield mapping system.
It has a crop harvest tracking system (GeoFocus, LLC, Gainesville, FL) that records a container location (latitude, longitude, and time) whenever the operator pushes


a button for recording. It also has an LCD display for easy operation, other control parts for interfacing all the components, and a data transfer system to transfer the collected data to a computer for further analysis. Data are later retrieved from the unit and post-processed to produce yield maps. The Goat system has the button and the LCD display mounted on the dash of the goat truck so that they are easily accessed by the driver. All the remaining electronics were placed inside a box and kept in a more secluded place inside the truck. It was noted that this system sometimes produced incorrect maps because the truck driver sometimes failed to record the location of a tub in the rush of harvest or due to other factors.

To avoid this problem, an automatic triggering system (Salehi et al., 2000) was developed to record the location of the tub. The system consisted of a GeoFocus yield monitor, a DGPS receiver, two pressure switches, a position switch, and two timers. The pressure switches were used to detect a load on the main boom lift cylinder and on the dump cylinder. The position switch determined whether the tipping head was located over the truck bulk bin. When all three conditions were met, the system recognized that the truck was picking up a tub of fruit, and the data gathering circuit was activated for a given time using the timer and relay circuit to collect the DGPS data. However, the automatic triggering system did not record some tub locations, which could be attributed to problems with the delay timer, pressure switch settings, and hardware connections.

The economic value of the citrus industry in Florida makes precision farming a viable technology for enormous development. Recognizing citrus fruits on a tree is the


first major task of a yield mapping system using machine vision. Citrus fruits are distributed in a strip about one meter deep within the canopy, in a completely unstructured environment. Detecting citrus fruit on the tree involves many complex operations. Automatic visual identification of fruit is further complicated by variation in lighting conditions, from bright sunlight on the outer parts of the canopy to deep shadow within the canopy. Citrus fruits often grow in clusters, and some fruits are occluded by branches and foliage.

Fruit distribution was studied (Juste et al., 1988) for Salustiana and Washington navel oranges using a system of cylindrical coordinates, and it appears that around 80% of fruits were located between the outer boundary of the canopy and a depth of 1-1.4 m. In the case of mandarins, however, most of the fruits were within 0.75 m of the outer canopy. The distribution of fruit clusters in citrus trees was studied (Schertz and Brown, 1966) for six navel orange trees in Tulare County, California. An evaluation of fruit clusters showed that 68 percent of the fruits were borne as separate fruits, 19 percent in clusters of two, and 7 percent in clusters of three. The remaining 6 percent had four through eleven fruits per cluster.

2.5 Image Segmentation

Some of the earlier studies on fruit recognition were conducted for apples, citrus, and tomatoes. Parrish and Goskel (1977) developed the earliest prototype for an apple harvester and studied the feasibility of harvesting methods based on pictorial pattern recognition and other artificial intelligence techniques. The prototype used a standard black-and-white camera to acquire apple images and a pattern recognition method to guide the fruit-harvesting robot. Slaughter and Harrell (1989) used a color


camera to exploit the highly contrasting color of oranges, implementing a color image-processing algorithm to distinguish oranges in a typical image of a citrus grove.

Whittaker et al. (1987) used fruit shape rather than color information to detect tomatoes. This method could be used even in the presence of interference caused by bright reflections and when fruits were shaded. They used the circular Hough transform (CHT) to locate whole or partial fruits in the image. The CHT is a mathematical transform that uses an angle matrix and a range of radii to locate circles, or parts of circles, in a digital image with discrete pixels. Before applying the CHT, the image was passed through a Sobel gradient operator, which calculated the gradient magnitude and direction at each pixel. Using this method, partially occluded fruits could also be detected.

Slaughter and Harrell (1987; 1989), working on the development of a robotic fruit harvesting system, presented two approaches for detecting fruit in an image based on color information. In the first approach (Slaughter and Harrell, 1987), the hue and saturation components of each pixel were used as features to segment an image by applying traditional classification in a two-dimensional feature space. The segmentation was carried out using a maximum and minimum threshold for each feature. Since color segmentation required some form of illumination control, they used an artificial lighting system. In the second approach (Slaughter and Harrell, 1989), a classification model was developed to discriminate oranges from the natural background of an orange grove using only color information. A Bayesian classifier was used in the RGB color space, and fruits were segmented from the background by checking whether each pixel belonged to the


fruit class or not. A reference table was created for the various classes using the Bayesian classification technique.

Casasent et al. (1996) used X-ray images to detect and segment multiple touching pistachio nuts placed on conveyor trays for product inspection. New techniques were developed for identifying items irrespective of orientation by employing rotation-invariant filters to locate nuts. A watershed transformation was used to segment touching or overlapping nuts. The watershed transform is a region-based segmentation approach that partitions the image into disjoint regions that are homogeneous with respect to some property, such as gray value or texture. Advanced morphological image processing operations were used to produce gray scale images of only the nutmeat and to determine the amount of shell area filled by nutmeat.

Shatadal et al. (1995) developed a mathematical morphology-based algorithm to segment connected grain kernels in an image, to assist in a machine vision-based grain-grading experiment. The algorithm, which disconnects connected kernel regions, was applied to wheat, durum wheat, barley, oats, and rye. Its only limitation was that it failed when the connected kernels formed a relatively long isthmus between them.

The majority of these works used charge coupled device (CCD) cameras to capture images and local or shape-based analysis to detect fruit. A CCD is a silicon chip whose surface is divided into light-sensitive pixels. When a photon (light particle) hits a pixel, it registers a tiny electric charge that can be counted. With large pixel arrays and high sensitivity, CCDs can create high-resolution images under a variety of light conditions; a CCD camera incorporates such a chip to take these pictures. Systems based on


local analysis, such as intensity or color pixel classification, allowed rapid detection and were able to detect fruits at a specific maturity stage (i.e., fruits with a color different from the background). Systems based on shape analysis were independent of color, but their algorithms were more time consuming.
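As an illustration of the circular Hough transform mentioned above (Whittaker et al., 1987), the following is a minimal Python sketch for a single known radius: each edge point votes for all candidate centers lying one radius away, and the accumulator maximum gives the most likely circle center. This is a simplified sketch, not the published implementation; a real system would sweep a range of radii and take its edge points from a Sobel operator, and the grid size, radius, and edge points below are illustrative.

```python
import math

def circular_hough(edge_points, radius, width, height):
    """Vote for circle centers: each edge point votes for every center
    lying at `radius` from it, then the accumulator maximum is returned."""
    acc = [[0] * width for _ in range(height)]
    for (x, y) in edge_points:
        for t in range(360):
            a = int(round(x - radius * math.cos(math.radians(t))))
            b = int(round(y - radius * math.sin(math.radians(t))))
            if 0 <= a < width and 0 <= b < height:
                acc[b][a] += 1
    votes, a, b = max((acc[b][a], a, b) for b in range(height) for a in range(width))
    return a, b

# Exact lattice points on a circle of radius 5 centered at (10, 10):
offsets = [(5, 0), (-5, 0), (0, 5), (0, -5), (3, 4), (3, -4), (-3, 4), (-3, -4),
           (4, 3), (4, -3), (-4, 3), (-4, -3)]
pts = [(10 + dx, 10 + dy) for dx, dy in offsets]
print(circular_hough(pts, 5, 20, 20))  # -> (10, 10)
```

Because every edge point's voting ring passes through the true center, the center cell accumulates far more votes than any other cell, which is why the method tolerates partial occlusion: even a fraction of the circle's edge points still vote for the correct center.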


CHAPTER 3
MATERIALS AND METHODS

The objective of this chapter is to provide background on the hardware and basic software components used in this research. The chapter begins with a procedural outline of the experiment. A general overview of an encoder and a DGPS receiver is presented next. The chapter then describes the image acquisition system, the analysis of images in the hue, saturation, and intensity (HSI) color model, and the development of the fruit-recognition algorithm. The complete control system and software are discussed last to provide background for software development.

3.1 Overview of Experiment

The yield mapping system was tested in a commercial grove (Conserve II) located near Winter Garden, Florida. The grove consisted of 48 plots of 24 trees each, with Hamlin oranges on three rootstocks: Cleopatra mandarin (C. reticulata), Swingle citrumelo (Citrus paradisi Macf. x Poncirus trifoliata [L.] Raf.), and Carrizo citrange (Citrus sinensis x Poncirus trifoliata). The Cleopatra mandarin rootstock is known to bear fruit more slowly than the other rootstocks. The trees were 17 years old. Each rootstock variety was planted in 16 plots. In every plot, two trees were selected for yield measurement. These two trees, designated as yield trees, were of similar size and were always planted side by side in a single row. The distribution of plots and the locations of the sampling trees are shown in Figure 3-1.


Figure 3-1. Mapping of trees in the citrus grove. Two trees designated for hand harvesting are shown inside the red rectangular box for each plot.


A 4x4 truck was used for driving inside the grove. The complete setup, consisting of a desktop computer, a control box for an encoder and a camera, and a DGPS receiver, was carried on the rear of the truck (Figure 3-2). A metal frame attached to the rear of the truck carried a generator, the power source for the entire setup, and moved in tandem with the truck. The encoder was attached to the wheel of the metal frame (Figure 3-4). A camera (model: FCB-EX780S, Sony, New York, NY) and a DGPS receiver (model: AgGPS 132, Trimble, Sunnyvale, CA) were attached to a metal pole supported by the tailgate of the truck. The metal pole was 5.2 m high, and the camera was 4.9 m above the ground. The camera was placed at a 45-degree angle relative to the ground to cover the maximum section of the tree canopy.

Figure 3-2. Experimental setup, showing the DGPS receiver, camera, metal pole and support strap, flag pole, and generator.


3.2 System Hardware

3.2.1 Color Vision Hardware

A CCD camera was used to take pictures in the grove. The camera had a built-in image stabilizer, which minimized the appearance of shaky images caused by low-frequency vibrations.

Figure 3-3. Components of the imaging board, including RS-232 drivers and receivers, a UART, an opto-coupler, a low-pass filter, 4 Mbytes of video transfer memory, a 12:1 video input multiplexer (V_IN1 to V_IN12), gain and decoder stages, and the host 32-bit PCI bus interface.

The camera was powered through the imaging board. Camera features such as imaging mode, shutter speed, and brightness were adjusted through serial communication. Video signals from the camera were fed to the computer through a frame grabber (model: Meteor-II, Matrox, Quebec, Canada) that supports capture from multiple standard analog video


sources and provides real-time image transfer to the system. The various components of the imaging board are shown in Figure 3-3. The board features twelve software-selectable input channels to switch among twelve composite video signals. It accepts an external trigger and can operate in next-valid-frame/field mode. The board provides an auxiliary power supply that can be used to power a camera, and it includes 4 Mbytes of video transfer memory for temporary frame storage to minimize loss of data during the long bus-access latencies found in heavily loaded computer systems.

3.2.2 DGPS Receiver and Encoder

A Differential Global Positioning System (DGPS) receiver (model: AgGPS 132, Trimble Inc., Sunnyvale, CA) with a Coast Guard beacon antenna was used to locate the position of each image in the grove. The DGPS receiver was configured for a 5 Hz refresh rate. An incremental encoder (model: CI20, Stegmann, Dayton, OH) was attached to the small frame to measure the distance traveled in the grove (Figure 3-4). The encoder has a resolution of 4048 pulses per revolution. A 12-bit multifunction PCMCIA I/O board (model: DAQCard-AI-16E-4, National Instruments, Austin, TX) was used to acquire and count pulses from the encoder. The encoder was calibrated before the actual experiment in the citrus grove: pulses were read over known distances three times, and the average of those readings was taken as the encoder output for that distance. Two channels were read from the encoder, and the phase between the channels identified whether the wheel was moving in the forward or reverse direction.
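The direction sensing described above relies on quadrature decoding: the two encoder channels are 90 degrees out of phase, so the order in which their states change reveals the direction of rotation. The following is a minimal sketch in Python (the thesis counted pulses in hardware on the PCMCIA board); which direction counts as "forward" depends on the wiring, and the sample state sequences are illustrative.

```python
# Transition table maps (previous AB state, current AB state) to a signed step.
STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_pulses(states):
    """Accumulate a signed pulse count from a sequence of sampled AB states."""
    count = 0
    for prev, cur in zip(states, states[1:]):
        count += STEP.get((prev, cur), 0)  # no-change or illegal jumps add 0
    return count

print(count_pulses([0b00, 0b01, 0b11, 0b10, 0b00]))  # 4  (forward)
print(count_pulses([0b00, 0b10, 0b11]))              # -2 (reverse)
```

A signed count like this lets the system subtract distance when the truck backs up during alignment, rather than falsely accumulating travel.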


Figure 3-4. Encoder attached to the wheel of the metal frame, next to the generator.

3.3 Image Acquisition

To develop a citrus fruit recognition algorithm, images were taken in stationary mode using an analog camera (model: FCB-EX780S, Sony, New York, NY) with 640 x 480 pixels. A total of 354 images were taken at the end of the citrus harvesting season, over two days during the last week of December 2003 and the first week of January 2004. The images were taken under natural outdoor lighting conditions. Brightness and shutter speed were adjusted for each plot before acquiring images. During the experiment, the shutter speed was varied between 1/1000 and 1/15 s: higher shutter speeds were required in bright daylight, while lower shutter speeds were useful in late afternoon, to obtain good images with approximately constant brightness.


Images were aligned over the trees by aligning the first image with a yield tree using a flagpole: the truck was moved back and forth until the camera field of view was aligned with the flagpole. Subsequent non-overlapping images were obtained using the encoder, which was calibrated and programmed to prompt the user upon reaching the next location for a non-overlapping image. At that location the truck was stopped briefly and an image was taken. Images of most of the tree canopy were acquired by driving around the trees on both sides. Images were processed on a Windows-based system with a 750 MHz Pentium processor.

3.4 Image Analysis using the HSI Color Model

Color is one of the most important properties that humans use to discriminate between objects and to encode functionality: sky is blue, citrus fruit is orange, leaves are green. An object's color comes from the interaction of light waves with electrons in the object's matter (Nassau, 1980). The colors that human beings perceive in an object are determined by the nature of the light reflected from the object's surface. For example, a red apple reflects light at wavelengths centered around 700 nm while absorbing most of the energy at other wavelengths; an object that reflects light equally across the entire visible spectrum appears white. The purpose of a color model is to define a generally accepted standard for specifying color. For instance, the red, green, and blue (RGB) color model is used in hardware such as PC monitors, cameras, and scanners; the cyan, magenta, and yellow (CMY) color model is used in color printers; and the luminance, in-phase, and quadrature (YIQ) model is used in television broadcasts. The most commonly used color models for image processing are the RGB and HSI models. In essence, a color


model is a specification of a 3-D coordinate system and a subspace within that system where each color is represented by a single point (Gonzalez and Woods, 1992).

The implemented system uses HSI as its color space, chosen mainly because the HSI model correlates well with human color perception. Hue is the color attribute that describes a pure color (pure red, green, etc.), whereas saturation measures the degree to which a pure color is diluted by white light. However, since all images from the camera were encoded in composite video format, they were first converted to RGB tri-stimulus format on the imaging board and later converted to their HSI equivalent in software for further analysis. A composite video signal contains all of the brightness, color, and timing information for the picture; the color information must be encoded so that it can be combined with the brightness and timing information. There are three main color-encoding systems in use throughout the world: National Television System Committee (NTSC), Phase Alternation by Line (PAL), and Systeme Electronique Couleur Avec Memoire (SECAM). The camera used in this experiment produced the NTSC format.

3.5 Development of the Fruit Counting Algorithm

The steps of the fruit recognition algorithm are to identify fruits in an image and then process the results to remove noise and improve the precision of the fruit count. Segmentation, or binarization, is an image-processing step used to separate objects of interest from the background. In this research, the object of interest was citrus fruit, and the background included citrus leaves and branches. The simplest way to segment an image is with a gray level, or global, threshold. This operation requires that the object of interest and the background have different levels of brightness or completely different colors. Unfortunately, the fruit, leaf, and background portions are not easily


differentiated using this method, because the gray level or color histograms of these features are not unimodal. A bimodal gray level histogram would make the selection of an optimum threshold easier, and the selection could then be automated. The threshold for identifying citrus fruits was therefore selected in the HSI color plane. Color characteristics of the images were analyzed in the RGB (red, green, and blue) and HSI (hue, saturation, and intensity) color spaces.

To develop a system to identify and count citrus fruit in an image, various objects in a typical citrus grove scene should be collected and analyzed; the developed system should then be tested on similar images to verify and compare its performance. For these reasons, the images were divided into calibration and validation data sets. The pixels were classified into three classes: C (citrus fruit), L (leaf), and K (background). The RGB and HSI values of each pixel in the three classes were obtained using a program written in VC++ (Microsoft Corporation, Redmond, WA) with the Matrox Imaging Library (Matrox Imaging, Quebec, Canada). The pixel values were stored in separate text files for each class and processed using Microsoft Excel (Microsoft Corporation, Redmond, WA). Pixels in the three classes were chosen manually by inspecting the images in the calibration set. The pixels were plotted for various combinations of color components in the RGB and HSI color spaces. Binarization was carried out in the color plane showing a clear distinction between fruit and background, with white pixels representing fruit and black pixels all other classes.

The field of mathematical morphology contributes a wide range of operators to image processing that are particularly useful for the analysis of binary images. Common


usages include edge detection, noise removal, image enhancement, and image segmentation. To process the binary images in this research, three operations were performed: erosion, dilation, and closing. Erosion shrinks the boundary of a binary object, while dilation expands it, depending on a structuring kernel. Closing is a morphological operation consisting of a dilation followed immediately by an erosion, used to fill small background holes in images.

Due to differences in illumination between images and the presence of some dead leaves, certain pixels were falsely classified as fruit. Using the calibration images, a threshold based on the area of the selected features was applied immediately after binarization to remove these false detections. In some detected fruits, a few pixels, mostly at the center of the fruit, were classified as background due to very high illumination. The kernel sizes for filling these gaps were determined by applying kernels of various sizes, in various orders, to the calibration images; from these trials, the order of the erosion and dilation and the optimum kernel size were selected for the algorithm. A closing operation with a 5 x 5 structuring kernel was applied to fill the gaps. These image-processing steps are shown in Figure 3-5.

Citrus fruits were identified using blob analysis, in which connected fruit pixels are treated as a single fruit. Fruit features such as area were extracted for all fruits and stored in a text file for post-processing. Fruit area is defined as the number of pixels in a connected region, and a compactness value is derived from the perimeter and area of each blob. Based on the average size of a fruit, a threshold was used to identify clusters of fruits when determining the total number of fruits in an image. With these


features, the size of each fruit could be calculated by using a sensor, such as an ultrasonic sensor, to measure the distance between the camera and the tree.

Figure 3-5. Image processing steps of the fruit counting algorithm: acquire image; convert RGB to HSI; binarize using the hue-saturation plane; threshold using area; dilation (5 x 5); erosion (5 x 5); extract features of fruits; estimate fruit clusters; count the number of citrus fruits.

3.6 Image Processing Time

Processing time is a major concern in a real-time machine vision application. A 750 MHz Pentium processor was used to process the images, and the processing time of each image-processing step was measured using the computer clock, averaged over 10 executions. Although images were converted from composite video signal to the RGB model on the imaging board, substantial time was required to convert the images from RGB to the HSI color model. Each image had 640 x 480 pixels, and every pixel had to undergo many comparisons, based on its position in the hue-saturation color plane, to be classified into one of the three classes. To reduce processing time, every pixel was compared with the background category first, since background pixels were the most numerous in the hue-saturation color plane and the percentage of fruit pixels in an image was small compared to the background. This optimization considerably reduced the time needed to process an image.
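The RGB-to-HSI conversion whose cost is discussed above follows the standard formulation given by Gonzalez and Woods (1992). A minimal per-pixel sketch in Python (the thesis implementation was in VC++ with the Matrox library) looks like this; the normalization to the 0..1 range is a convention chosen here for clarity:

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert normalized RGB (0..1) to HSI using the standard formulation:
    hue in degrees, saturation and intensity in 0..1."""
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    theta = math.degrees(math.acos(num / den)) if den else 0.0
    h = theta if b <= g else 360.0 - theta                  # hue angle
    s = 1.0 - 3.0 * min(r, g, b) / (r + g + b) if (r + g + b) else 0.0
    i = (r + g + b) / 3.0                                    # average intensity
    return h, s, i

print(rgb_to_hsi(1.0, 0.5, 0.0))  # citrus orange: hue ~30 degrees, S = 1.0, I = 0.5
```

The trigonometric call per pixel is what makes this conversion expensive over a 640 x 480 image, which motivates the background-first comparison order described above.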


3.7 Experimental Procedure

A schematic diagram of the overall experimental setup is shown in Figure 3-6. The DGPS receiver and camera control interface were connected to the computer through its serial ports. The video signal from the camera was fed into the imaging board, and data from the encoder was fed into the counter input channel of the PCMCIA I/O board.

Figure 3-6. Schematic diagram of the overall citrus yield mapping system: the DGPS receiver and camera connect to serial ports #1 and #2, the camera video feeds the imaging board, the encoder feeds the PCMCIA I/O board, and a 12 V supply powers the system.

Once the camera was mounted on top of the pole, the camera field of view was measured and the width and height of the imaging scene were calculated. Based on the width of the image scene, the encoder was programmed to prompt the user when the required distance had been traveled from the current imaging location to take the next non-overlapping image. After aligning the first image with the tree, the truck was driven very slowly (2.2 m/s) and the pulses from the encoder were read continuously at a 20 ms interval to measure the distance traveled from the previous imaging location.


Figure 3-7. Algorithm for field-testing of the machine vision system: grab the first image; read the encoder until the next imaging location is reached; grab the next image; record the DGPS data; store the image and data on the disk drive; reset the encoder counter; and repeat until the user terminates.
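The distance-triggered capture loop of Figure 3-7 can be sketched as a simulation. This is Python rather than the VC++ field code, image grabbing is reduced to recording the trigger location, and PULSES_PER_METER is a hypothetical calibration constant, not a value from the thesis.

```python
# Simulated capture loop: read the encoder once per 20 ms tick, and when
# the accumulated pulse count corresponds to one image-scene width,
# record an image location and reset the counter, as in Figure 3-7.
PULSES_PER_METER = 2400   # hypothetical calibration constant
SCENE_WIDTH_M = 1.67      # image-scene width used in the thesis

def capture_locations(encoder_readings):
    """Return the cumulative distances (m) at which images are grabbed."""
    locations, travelled, count = [], 0.0, 0
    for pulses in encoder_readings:            # one reading per 20 ms tick
        count += pulses
        if count >= SCENE_WIDTH_M * PULSES_PER_METER:
            travelled += count / PULSES_PER_METER
            locations.append(travelled)        # grab image, record DGPS here
            count = 0                          # reset counter for next image
    return locations

print(capture_locations([45] * 100))  # one trigger after ~1.7 m of travel
```

Resetting the counter at each trigger, rather than comparing against a running total, mirrors the flow chart and keeps each image's spacing relative to the previous imaging position.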


After the required distance had been traveled, the next non-overlapping image was grabbed along with position information from the DGPS receiver. The encoder counter value was immediately reset to zero so that the relative distance from the new imaging position could be used as the reference for the next image. The algorithm continued until the user terminated it. The imaging sequence in the field trial is shown in Figure 3-7. Images were taken over two days in the field. The height of the camera was adjusted only once at the beginning of the day and remained in the same position throughout the day. The camera field of view was calculated on both imaging days, and the encoder was programmed to reflect the current field-of-view settings. The experimental setup and encoder are shown in Figures 3-2 and 3-4.

3.8 Prediction of Number of Fruits/Plot

In the grove where the citrus yield mapping system was tested, two trees in each plot were designated for hand-harvesting. Fruits/plot were predicted using three models based on the following three variables:

1) Number of fruits/plot estimated by the fruit counting algorithm (NP_fruits)
2) Number of citrus pixels/plot estimated by the fruit counting algorithm (NP_pixels)
3) Number of fruits/plot estimated from the citrus pixels/plot data (NP_fruits-pixels)

Images belonging to the same plot were grouped together, and the per-image estimates from the fruit counting algorithm were summed to give the fruits/plot estimates. NP_fruits used the number of fruits identified in each image by the machine vision algorithm, while NP_pixels used the number of citrus pixels in each image. NP_fruits-pixels used a relation between the actual size of a fruit and the size of a fruit in terms of pixels in an image. The relation between a pixel and its corresponding actual size in the imaging


scene was calculated: an image of 640 x 480 pixels corresponded to an imaging scene 1.67 m (5.5 ft) long and 1.26 m (4.12 ft) high. As the average size of a fruit in each plot was known, the area of a fruit in a plot was calculated in terms of pixels, and the number of fruits in a plot was then determined from the total number of citrus pixels/plot using the following estimation.

Since 640 x 480 pixel^2 = 1.67 x 1.26 m^2 = 2.1 m^2 = 2107970 mm^2,

1 mm^2 = 0.15 pixel^2

Let D be the average diameter of a fruit in a plot, A_actual the area of a fruit in the two-dimensional plane, A_image the area of a fruit in an image, and T the total number of citrus pixels in a plot. Then

A_actual = pi x D^2 / 4 and A_image = 0.15 x A_actual

Thus,

NP_fruits-pixels = T / A_image = T / (0.15 x A_actual) = T / (0.15 x pi x D^2 / 4)

Since T (from the image processing algorithm) and D (from the actual harvesting data) are known, NP_fruits-pixels can be obtained.

3.9 Yield Prediction Model

Citrus yield is calculated as the number of citrus fruits per unit area. The distances between citrus trees in the grove were 3.05 m (10 ft) in-row and 6.1 m (20 ft) between rows. For this particular experimental setup, the yield was calculated as

Y_E = NP_fruits-pixels / (3.05 m x 6.1 m x 2 trees)


Y_A = N_A / (3.05 m x 6.1 m x 2 trees)

where Y_E is the yield estimated by the machine vision system, Y_A is the actual yield by hand harvesting, and N_A is the actual number of fruits harvested per plot.

Yield per image (Y_image) was calculated from the number of fruits identified in each image. Each image covered 1.7 m of the canopy lengthwise, and the distance from the camera to the canopy was 3.0 m. Y_image was calculated as

Y_image = NP_fruits-pixels / (3.0 m x 1.7 m)

where NP_fruits-pixels is the number of fruits/plot estimated using the machine vision system.

3.10 Performance of the Fruit Counting Algorithm

The algorithm was developed using 25 calibration images and tested on the remaining 329 validation images. To evaluate the performance of the algorithm, the fruits counted by the fruit counting algorithm should ideally be compared with the actual number of fruits in the region covered by each image. Since it was very difficult to define the boundary of each image and count the corresponding fruits in the grove, the images were shown to three observers, and the average of their three counts was taken as the reference for the fruit counting algorithm. Manual counting was arranged this way because the total number of fruits perceived varied from person to person. Error_Image (%) was defined as the percentage error between the number of fruits counted by the machine vision algorithm and the average number of fruits counted manually:

Error_Image (%) = (MV - MC) / MC x 100

where


MV = number of fruits counted by the machine vision algorithm
MC = average number of fruits counted manually

3.11 Performance of the Yield Prediction Model

Images from each plot were grouped together, and the number of fruits from each plot was compared with the actual number of fruits harvested from the respective plot. Half of the plots were used as calibration data to develop a prediction model for estimating citrus yield; the model was then tested on a validation data set consisting of the remaining plots. Error_Plot (%) was defined as the percentage error between the yield estimated by the machine vision algorithm and the actual yield by hand harvesting:

Error_Plot (%) = (Y_E - Y_A) / Y_A x 100

where

Y_E = estimated yield by the machine vision system
Y_A = actual yield by hand harvesting

3.12 Yield Variability Based on Rootstock Variety

The fruits/plot data were used to check whether there was any correlation between the harvested yield and the rootstock variety (Cleopatra mandarin, Swingle citrumelo, and Carrizo citrange). A means test was conducted on the yield data for the 48 plots, grouped by the three rootstock varieties, using the SAS ANOVA procedure. The following SAS program was used for the means test of the yield data of the different rootstocks.

PROC ANOVA;
CLASS CRTS;
MODEL NF=CRTS;


means CRTS / tukey lines;
RUN;

proc cluster DATA=ROOTSTOCK method=average std pseudo noeigen outtree=tree;
ID CRTS;
var NF;
run;

Variables used in the ANOVA PROC:
CRTS: citrus rootstock variety
NF: number of fruits in a plot
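The fruit-count and yield relations of the preceding sections can be collected into a short computational sketch. This is Python rather than the thesis's tooling, the equations are as reconstructed from the text above, and the plot values (500,000 citrus pixels, 70 mm mean fruit diameter) are hypothetical examples, not measured data.

```python
import math

MM2_TO_PIXEL2 = 0.15   # 1 mm^2 corresponds to about 0.15 pixel^2

def fruits_from_pixels(total_citrus_pixels, mean_diameter_mm):
    """NP_fruits-pixels: total citrus pixels in a plot divided by the
    pixel area of one average fruit (A_image = 0.15 * pi * D^2 / 4)."""
    fruit_area_mm2 = math.pi * mean_diameter_mm ** 2 / 4.0
    return total_citrus_pixels / (MM2_TO_PIXEL2 * fruit_area_mm2)

def yield_per_m2(fruits_per_plot):
    """Yield over the area of the two yield trees: 3.05 m in-row spacing
    x 6.1 m between-row spacing x 2 trees."""
    return fruits_per_plot / (3.05 * 6.1 * 2)

def error_percent(estimated, actual):
    """Percentage error, as defined for Error_Image and Error_Plot."""
    return (estimated - actual) * 100.0 / actual

# Hypothetical plot: 500,000 citrus pixels and a 70 mm mean fruit diameter.
n = fruits_from_pixels(500_000, 70)
print(round(n))                     # 866 fruits
print(error_percent(110.0, 100.0))  # 10.0
```

Separating the pixel-to-fruit conversion from the per-area normalization mirrors the thesis's two-stage structure: NP_fruits-pixels is computed per plot first, and only then turned into a yield density for mapping.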


CHAPTER 4
RESULTS AND DISCUSSION

This chapter begins with a summary of the binarization and preprocessing of the images. It then illustrates the various steps involved in the fruit counting algorithm and compares the performance of the algorithm over 354 images with the number of fruits counted manually. The processing time for the various image-processing steps is reported, and the chapter concludes with a description of the yield prediction model and a comparison of its results with the actual harvested yield data.

4.1 Binarization

Images were divided into calibration and validation data sets (Table 4-1). Using a program written in VC++ with the Matrox Imaging Library, the RGB and HSI components of the three classes, citrus (C), leaf (L), and background (K), were collected from the calibration images by drawing a rectangle around each feature with a mouse, and each class was stored in a separate text file. Histograms of all the color components for the three classes are shown in Figures 4-1 through 4-6. There was no distinct separation between the citrus class and the other classes in any individual color component except the hue component.

Table 4-1. Number of images used in calibration and validation data sets.
Calibration: 25
Validation: 329
Total: 354


Figure 4-1. Histogram of 25 calibration images in the hue plane (* indicates 3898 pixels with a pixel value of 0 in the citrus class; + indicates 5606 pixels with a pixel value of 0 in the background class).

Figure 4-2. Histogram of 25 calibration images in the saturation plane (* indicates 3872 pixels with a pixel value of 0 in the citrus class; + indicates 5604 pixels with a pixel value of 0 in the background class; # indicates 2736 pixels with a pixel value of 254 in the citrus class).


Figure 4-3. Histogram of 25 calibration images in the luminance plane (* indicates 3356 pixels with a pixel value of 252 in the citrus class).

Figure 4-4. Histogram of 25 calibration images in the red plane.


Figure 4-5. Histogram of 25 calibration images in the green plane (* indicates 3330 pixels with a pixel value of 252 in the citrus class).

Figure 4-6. Histogram of 25 calibration images in the blue plane (* indicates 3319 pixels with a pixel value of 252 in the citrus class).

As a next step, gray level histograms were plotted for pairs of color components, and it was found that a clear line of separation existed between the fruits and the


background in the hue-saturation color space (Figure 4-7). Gray level histograms for the three classes in the hue-luminance and red-green color planes are shown in Figures 4-8 and 4-9. Although there appeared to be a distinction between fruits and background in the red-green color plane, numerous false detections occurred when different thresholds were tested on the validation images. The main reason for so many false detections was the high contrast and brightness level in some images, which tended to make leaves and background objects white, so that they were classified into the citrus class.

The threshold in the hue-saturation color plane was carefully chosen in a conservative approach after many trials had been conducted over the calibration images. A luminance component was added to the threshold to make it less dependent on the brightness level of the image during binarization. The pixel distribution of the various classes in the calibration images is shown in Table 4-2. Although only 58% of the citrus class was captured inside the threshold, the binarization scheme was found to work very well on the validation images. The main strength of this threshold was that it contained 0% of the background and 0.03% of the leaves. Since the majority of an image consisted of leaves and background, this binarization scheme performed well; the conservative threshold caused some underestimation but very little overestimation.

Table 4-2. Pixel distribution for the citrus, leaf, and background classes for 25 images in the HSI color plane.
                    Citrus class        Leaf class          Background class
Pixel category      Pixels  Percent     Pixels  Percent     Pixels  Percent
Inside threshold    15875   58.1%       68      0.03%       0       0%
Outside threshold   11438   41.9%       23347   99.7%       8165    100%


Mathematical representation of the two thresholds is as follows. For any pixel:

    if (Hue is between 4 and 43, Saturation is between 50 and 250,
        and Luminance is between 60 and 230) {
        // Calculate the position of the threshold line at that specific hue value
        S1 = (4.83)(Hue) - 53.1
        // Check whether the pixel is within the threshold
        if (Saturation > S1)
            Pixel = Citrus      // pixel was inside the threshold and marked as 255
        else
            Pixel = Background  // pixel was outside the threshold and marked as 0
    } else {
        Pixel = Background      // pixel was outside the threshold and marked as 0
    }
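The decision rule above can be written as a minimal Python sketch; hue, saturation, and luminance are assumed to be gray levels on a 0-255 scale, as in the thesis, and the function name is illustrative.

```python
def classify_pixel(hue, sat, lum):
    """Return True (citrus, marked 255) if the pixel falls inside the
    hue-saturation-luminance threshold, else False (background, marked 0)."""
    if 4 <= hue <= 43 and 50 <= sat <= 250 and 60 <= lum <= 230:
        # Threshold line in the hue-saturation plane at this hue value
        s1 = 4.83 * hue - 53.1
        return sat > s1
    return False
```

Applying this test to every pixel of the 640 x 480 HSI image produces the binary image used in the following steps.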


Figure 4-7. Pixel distribution in 25 calibration images in hue-saturation plane (citrus, leaf, and background classes, with the threshold line).

Figure 4-8. Pixel distribution in 25 calibration images in hue-luminance plane (citrus, leaf, and background classes, with the threshold line).


Figure 4-9. Pixel distribution in 25 calibration images in red-green plane.

Thresholds shown in Figures 4-7 and 4-8 were used for the binarization step. The algorithm classified a pixel as citrus fruit if it fell inside the thresholds; otherwise it was classified as background. An example of the image processing steps for a typical citrus grove image is shown in Figure 4-10; this image is used to explain the steps involved in the implementation of the fruit counting algorithm. Figure 4-10 (a) shows a sample color image from the validation data set. Fruits were extracted by binarizing the color image in HSI color space; the binarized image is shown in Figure 4-10 (b).

4.2 Preprocessing

The binarized images contained noise, mainly due to the slight overlap of the leaf class with the citrus class in the hue-saturation color plane. By applying a threshold of


100 pixels on the area of the extracted features, this noise was removed from the images (Figure 4-10 (c)). The threshold was selected based on the area of the noise patterns in the calibration images.

Figure 4-10. Image processing steps of a typical citrus grove scene. (a) Color image. (b) Binarized image. (c) After removing noise. (d) After filling gaps.

In the processed image, there were cases in which a single fruit occluded by small leaves was counted as more than one fruit. To overcome this problem, a dilation followed by an erosion with a 5x5-pixel kernel was applied to the images, resulting in the final processed image shown in Figure 4-10 (d). These images could then be used by the algorithm to count the number of citrus fruits.
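The gap-filling step (a dilation followed by an erosion, i.e., morphological closing) can be sketched in pure Python on a binary mask. This is a generic sketch, not the thesis's actual implementation; the 5x5 kernel from the thesis is the default, and a 3x3 kernel is used in the example only to keep it small.

```python
def dilate(mask, k=5):
    """Binary dilation with a k x k square kernel (zero padding at borders)."""
    r, h, w = k // 2, len(mask), len(mask[0])
    return [[1 if any(mask[ny][nx]
                      for ny in range(max(0, y - r), min(h, y + r + 1))
                      for nx in range(max(0, x - r), min(w, x + r + 1)))
             else 0
             for x in range(w)] for y in range(h)]

def erode(mask, k=5):
    """Binary erosion with a k x k square kernel (zero padding at borders)."""
    r, h, w = k // 2, len(mask), len(mask[0])
    return [[1 if all(0 <= ny < h and 0 <= nx < w and mask[ny][nx]
                      for ny in range(y - r, y + r + 1)
                      for nx in range(x - r, x + r + 1))
             else 0
             for x in range(w)] for y in range(h)]

def close_gaps(mask, k=5):
    """Morphological closing: dilation followed by erosion."""
    return erode(dilate(mask, k), k)
```

Closing bridges thin gaps (such as a leaf edge crossing a fruit) so that one fruit is not split into several blobs.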


Citrus fruits were identified using blob analysis: connected fruit pixels were treated as a single blob, and the total number of blobs gave the number of fruits in the image. Features such as perimeter, area, horizontal width, and vertical height were calculated for each blob and stored separately in a text file.

It should be noted that the algorithm produced very few overestimates and many underestimates. The main reasons for overestimation were:

- When a single fruit was hidden by many leaves and the separation between the resulting small blobs exceeded the 5x5 (25-pixel) kernel, the blobs were counted as different fruits.
- Small fruits that were not clearly visible during manual counting were nevertheless counted as fruits by the machine vision algorithm.
- In some images, many fruits were hidden in dark background.

The reasons for underestimation were:

- When the visible portion of a fruit was very small, it could be removed by the area threshold used to remove noise.
- In some cases, fruit clusters were counted as a single fruit due to connectivity.

It was found that fruit clusters were relatively large in area compared to single fruits. Hence, the fruit counting algorithm was modified to correct for this underestimation. Since the calibration images showed that only a few fruits were completely visible and all the remaining
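The blob analysis step can be illustrated with a small pure-Python 8-connected component labeler. This is a generic sketch rather than the thesis's actual implementation, which also computed perimeter, width, and height per blob.

```python
from collections import deque

def label_blobs(mask):
    """8-connected blob labeling on a binary image (list of lists of 0/1).
    Returns a list of blob areas; the list length is the blob (fruit) count."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill this blob with a breadth-first search
                area, q = 0, deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    area += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                q.append((ny, nx))
                areas.append(area)
    return areas
```

The noise-removal step described above then amounts to dropping blobs below the area threshold, e.g. `[a for a in areas if a >= 100]`.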


fruits were mostly hidden by leaves, the average size of a fruit in an image was calculated from the five largest fruits in that image. If the average area was less than 200 pixels, or if the total number of fruits in an image was less than 10, the fruit counting procedure ended, because it would be difficult to identify fruit clusters when leaves hid nearly all the fruits or when the imaged scene was far from the camera. Otherwise, the following fruit cluster estimation module was applied. A threshold was calculated from the average fruit area; if a blob's area exceeded this threshold, it was identified as a fruit cluster and counted as two fruits instead of one. Fruit clusters were counted only as two fruits, rather than many, because of the difficulty of defining an area threshold for multiple fruits; this introduces a potential source of underestimation into the fruit counting algorithm. The threshold was selected by trial and error using the calibration images.

Table 4-3. Threshold for identification of fruit clusters.

Average size of fruits (pixel^2)   Threshold (pixel^2)
0-200                              100,000
201-600                            Calculated average size
601-1,300                          800
> 1,301                            1,200

The conditions for setting thresholds are shown in Table 4-3. For example, if the average area of objects in an image was between 601 and 1,300 pixel^2 and the area of an object exceeded the threshold (800 pixel^2), that object was considered a fruit cluster and counted as two fruits instead of one. For average areas of 200 pixel^2 or less, cluster identification was effectively disabled by setting the threshold to 100,000 pixel^2. This underestimation problem could be solved once a separation algorithm such as the
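The cluster estimation rule of Table 4-3 can be sketched as follows; the function names are illustrative, and the average is taken over the five largest blobs as described above.

```python
def cluster_threshold(avg_area):
    """Area threshold for declaring a blob a two-fruit cluster (Table 4-3)."""
    if avg_area <= 200:
        return 100_000           # effectively disables cluster detection
    if avg_area <= 600:
        return avg_area          # use the calculated average size itself
    if avg_area <= 1300:
        return 800
    return 1200

def count_fruits(blob_areas):
    """Count fruits, counting any blob above the cluster threshold as two."""
    largest = sorted(blob_areas, reverse=True)[:5]
    avg_area = sum(largest) / len(largest)
    if avg_area < 200 or len(blob_areas) < 10:
        return len(blob_areas)   # cluster module skipped
    t = cluster_threshold(avg_area)
    return sum(2 if area > t else 1 for area in blob_areas)
```

With ten blobs of areas 1000, 900, 800, 700, 600, 500, 400, 300, 250, 250, the five largest average 800 pixel^2, so the 800-pixel^2 threshold applies and the two largest blobs are counted as clusters.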


Watershed method is developed. However, deciding on a suitable threshold for a watershed algorithm would be difficult, since the fruits are hidden by leaves in irregular patterns; a watershed algorithm may over-fragment normal fruits, resulting in overestimation.

The fruit counting algorithm was applied to the validation set of 329 images, and the results are tabulated in Table 4-4. The percentage error ranged from 0% to 100%; the 100% errors occurred in images containing one or two fruits for which the algorithm identified none. The mean absolute error per image was 29.33% over all validation images. The main reason for this high error was that many fruits that were small but clear to the human eye were treated as noise by the algorithm and excluded from the count. A regression analysis was conducted between the number of fruits counted manually and the number counted by the fruit counting algorithm for the 329 validation images (Figure 4-11). The R^2 value for the regression analysis was 0.79.

Table 4-4. Performance comparison for 329 validation images.
Observer Observer Image name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) Image name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) 100a-0 16 22 23 20 4 16 -21.3 117b-0 21 25 26 24 3 16 -33.3 100a-1 25 39 39 34 8 25 -27.2 117b-1 16 16 16 16 0 13 -18.8 100a-2 8 9 8 8 1 6 -28.0 117b-2 22 26 25 24 2 21 -13.7 100a-3r 5 6 6 6 1 3 -47.1 117b-3r 15 18 19 17 2 13 -25.0 100b-0 11 18 18 16 4 8 -48.9 118a-0 24 35 33 31 6 21 -31.5 100b-1 27 33 35 32 4 19 -40.0 118a-1 40 51 47 46 6 27 -41.3 100b-2 4 8 9 7 3 3 -57.1 118a-2 32 46 44 41 8 28 -31.1 100b-3 15 23 23 20 5 10 -50.8 118a-3r 18 21 23 21 3 13 -37.1 101a-0 17 20 19 19 2 9 -51.8 118b-0 40 53 41 45 7 30 -32.8 101a-1 22 27 26 25 3 18 -28.0 118b-1 32 45 40 39 7 26 -33.3 101a-2 32 36 37 35 3 18 -48.6 118b-2 27 42 40 36 8 26 -28.4 101a-3r 13 18 19 17 3 6 -64.0 118b-3r 18 16 17 17 1 14 -17.6 101b-0 20 27 19 22 4 10 -54.5 119a-0 23 25 24 24 1 15 -37.5 101b-1 44 46 40 43 3 22 -49.2 119a-1 36 40 37 38 2 29 -23.0 101b-3 30 24 26 27 3 21 -21.3 119a-2 20 19 19 19 1 10 -48.3


45 Table 4-4. Continued. Observer Observer Image name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) Image name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) 14b-1 28 31 34 31 3 19 -38.7 31a-0 19 29 26 25 5 26 5.4 14b-2 40 48 52 47 6 22 -52.9 31a-2 3 4 4 4 1 3 -18.2 14b-3 24 31 30 28 4 22 -22.4 31a-3r 7 10 9 9 2 5 -42.3 15a-0 18 24 22 21 3 16 -25.0 31b-0 14 19 17 17 3 13 -22.0 15a-1 26 25 24 25 1 17 -32.0 31b-1 18 26 24 23 4 8 -64.7 15a-2 15 23 21 20 4 14 -28.8 31b-2 38 51 48 46 7 33 -27.7 15b-0 30 34 33 32 2 28 -13.4 31b-3r 10 9 9 9 1 5 -46.4 15b-2 24 48 45 39 13 19 -51.3 33a-0 17 22 23 21 3 10 -51.6 16a-1 16 18 19 18 2 16 -9.4 33a-1 7 10 10 9 2 2 -77.8 16a-2 20 39 40 33 11 17 -48.5 33a-2 1 5 6 4 3 0 -100.0 16a-3 18 27 25 23 5 19 -18.6 33a-3r 11 10 10 10 1 4 -61.3 16b-0 16 28 29 24 7 10 -58.9 33b-0 22 29 30 27 4 10 -63.0 16b-1 27 29 29 28 1 17 -40.0 33b-1 11 14 16 14 3 6 -56.1 26a-0 18 24 22 21 3 20 -6.2 33b-2 32 35 33 33 2 18 -46.0 26a-1 10 15 14 13 3 9 -30.8 35a-0 23 28 25 25 3 21 -17.1 26a-2 14 20 20 18 3 17 -5.6 35a-1 36 36 36 36 0 35 -2.8 26a-3 16 16 16 16 0 10 -37.5 35a-2 7 9 9 8 1 4 -52.0 26b-0 3 6 6 5 2 3 -40.0 35b-0 7 8 8 8 1 7 -8.7 26b-1 8 11 11 10 2 6 -40.0 35b-1 10 16 13 13 3 8 -38.5 26b-2 9 31 34 25 14 8 -67.6 35b-2 17 27 22 22 5 13 -40.9 27a-0 8 7 7 7 1 7 -4.5 35b-3r 4 7 7 6 2 0 -100.0 27a-1 18 24 21 21 3 20 -4.8 36a-0 26 28 21 25 4 15 -40.0 27a-2 16 16 14 15 1 17 10.9 36a-1 13 12 13 13 1 9 -28.9 27b-0 10 10 10 10 0 9 -10.0 36a-2 14 13 15 14 1 14 0.0 27b-1 9 9 9 9 0 8 -11.1 36a-3r 16 20 15 17 3 14 -17.6 27b-2 21 23 20 21 2 17 -20.3 36b-0 18 19 14 17 3 16 -5.9 27b-3 10 11 11 11 1 8 -25.0 36b-1 19 21 16 19 3 17 -8.9 28a-0 16 21 22 20 3 9 -54.2 36b-2 34 29 28 30 3 25 -17.6 28a-1 20 34 34 29 8 24 -18.2 41a-0 21 29 27 26 4 25 -2.6 28a-2 15 25 22 21 5 16 -22.6 41a-1 20 22 25 22 3 21 -6.0 28a-3 16 31 29 25 8 13 -48.7 41a-2 27 26 23 25 2 23 -9.2 28b-0 32 48 38 39 8 28 -28.8 41a-3 11 13 12 12 1 11 -8.3 28b-1 18 28 25 
24 5 19 -19.7 41b-0 17 29 22 23 6 22 -2.9 28b-2 15 26 20 20 6 15 -26.2 41b-1 18 23 17 19 3 18 -6.9 29a-0 4 6 6 5 1 7 31.3 41b-2 15 26 18 20 6 21 6.8 29a-1 4 5 5 5 1 3 -35.7 42a-1 17 24 19 20 4 18 -10.0 29a-2 13 12 11 12 1 10 -16.7 42a-2 17 19 19 18 1 19 3.6 29a-3 10 9 9 9 1 14 50.0 42a-3 15 17 15 16 1 15 -4.3 29b-0 14 15 13 14 1 8 -42.9 82a-2 23 26 24 24 2 23 -5.5 29b-1 20 21 19 20 1 9 -55.0 82a-3 22 27 24 24 3 20 -17.8 29b-2 4 7 7 6 2 3 -50.0 83a-0 21 20 25 22 3 15 -31.8 30a-0 15 19 19 18 2 15 -15.1 83a-1 18 13 13 15 3 18 22.7 30a-1 19 23 22 21 2 16 -25.0 83a-2 32 30 33 32 2 25 -21.1 30a-2 42 61 52 52 10 35 -32.3 83a-3 43 41 39 41 2 31 -24.4 30b-0 28 37 33 33 5 26 -20.4 84a-1 22 20 24 22 2 22 0.0 30b-1 22 32 29 28 5 24 -13.3 84a-2 16 16 17 16 1 10 -38.8


46 Table 4-4. Continued. Image name Observer Observer Fruit counting algorithm Error(%) Image name Fruit counting algorithm Error(%) 1 2 3 Average Standard deviation Standard deviation 1 2 3 Average 42b-0 11 13 10 11 2 10 -11.8 5a-1 25 28 26 26 2 24 -8.9 42b-1 15 13 16 15 2 15 2.3 20 24 20 21 19 -10.9 42b-2 24 21 21 4 -3.2 5a-3r 2 3 3 1 3 42b-3 19 26 24 4 24 1.4 5b-0 19 18 18 1 6 -67.3 31 35 30 32 31 -3.1 5b-1 30 28 31 4 -26.6 43b-0 12 14 14 2 14 5b-2 8 23 16 8 4 -75.0 43b-1 20 24 25 3 19 -17.4 23 24 21 23 8 -64.7 43b-2 10 7 8 2 -4.0 61a-0 4 5a-2 2 17 20 3 12.5 26 18 43a-1 3 36 23 16 0.0 17 23 5b-3 2 8 8 4 4 4 0 2 -50.0 43b-3 14 21 17 17 4 -13.5 7 6 6 1 15 61a-1 5 1 -83.3 53a-0 16 21 24 20 4 19 -6.6 61a-2 8 4 4 5 2 4 -25.0 53a-2 5 5 5 1 7.1 61a-3r 8 6 7 1 3 4 5 6 -55.0 53a-3 6 7 6 6 1 6 -5.3 61b-0 11 10 10 10 1 10 -3.2 53b-0 11 9 11 10 1 8 -22.6 61b-1 9 10 10 10 1 7 -27.6 53b-1 13 18 15 15 3 13 -15.2 61b-2 10 12 10 11 1 5 -53.1 53b-2 7 7 5 6 1 5 -21.1 63a-0 9 15 13 12 3 5 -59.5 53b-3r 2 2 2 2 0 1 -50.0 63a-1 7 7 7 7 0 5 -28.6 54a-0 4 6 5 5 1 5 0.0 63a-2 5 5 5 5 0 1 -80.0 54a-1 20 26 24 23 3 22 -5.7 63a-3r 7 8 8 8 1 5 -34.8 54a-2 19 27 22 23 4 20 -11.8 63b-1 9 13 11 11 2 5 -54.5 54a-3 13 18 15 15 3 15 -2.2 63b-2 12 17 15 15 3 14 -4.5 54b-0 19 15 17 17 2 16 -5.9 63b-3 2 2 2 2 0 1 -50.0 54b-1 14 18 15 16 2 15 -4.3 64a-0 26 30 27 28 2 14 -49.4 54b-2 14 18 21 18 4 18 1.9 64a-1 23 31 27 27 4 14 -48.1 54b-3 21 25 18 21 4 22 3.1 64a-2 19 23 21 21 2 7 -66.7 56a-0 5 7 7 6 1 5 -21.1 64a-3r 9 14 14 12 3 6 -51.4 56a-1 13 13 11 12 1 10 -18.9 64b-0 16 26 22 21 5 10 -53.1 56a-2 8 11 9 9 2 7 -25.0 64b-1 30 29 31 30 1 19 -36.7 56a-3 11 14 12 12 2 18 45.9 64b-2 26 29 32 29 3 16 -44.8 56b-0 25 30 26 27 3 28 3.7 73a-0 24 25 24 24 1 19 -21.9 56b-1 9 12 10 10 2 8 -22.6 73a-2 3 3 3 3 0 2 -33.3 56b-2 14 17 15 15 2 10 -34.8 73a-3 9 7 9 8 1 6 -28.0 56b-3 5 5 5 5 0 5 0.0 73b-0 7 5 5 6 1 5 -11.8 5a-0 24 23 32 26 5 24 -8.9 73b-1 6 6 6 6 0 5 -16.7 84b-0 15 14 13 14 1 10 -28.6 73b-2 4 4 4 4 0 3 
-25.0 84b-1 19 19 17 18 1 9 -50.9 73b-3 26 25 23 25 2 19 -23.0 84b-2 20 22 21 21 1 18 -14.3 74a-0 23 22 25 23 2 17 -27.1 84b-3 24 23 24 24 1 19 -19.7 74a-1 31 28 28 29 2 23 -20.7 8a-0 10 9 7 9 2 9 3.8 74a-2 16 13 15 15 2 8 -45.5 8a-1 16 17 15 16 1 9 -43.8 74a-3 23 22 25 23 2 19 -18.6 8a-2 19 17 20 19 2 10 -46.4 74b-0 8 8 9 8 1 6 -28.0 8a-3 27 33 27 29 3 25 -13.8 74b-1 14 12 12 13 1 6 -52.6 74b-2 15 13 15 14 1 7 -51.2 8b-0 18 20 20 19 1 17 -12.1 74b-3r 5 6 6 6 1 3 -47.1 8b-2 10 12 13 12 2 8 -31.4 76a-0 28 29 31 29 2 16 -45.5 8b-3 11 11 12 11 1 8 -29.4 76a-1 27 29 30 29 2 19 -33.7 90a-0 7 6 7 7 1 4 -40.0 76a-2 32 28 31 30 2 22 -27.5 90a-1 11 12 10 11 1 10 -9.1


47 Table 4-4. Continued. Observer Observer Image Name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) Image Name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) 76b-2 12 17 15 15 3 9 -38.6 90a-3 24 21 20 22 2 14 -35.4 77a-0 14 22 20 19 4 15 -19.6 90b-0 23 17 19 20 3 19 -3.4 77a-1 17 22 21 20 3 17 -15.0 90b-1 6 5 4 5 1 6 20.0 77a-2 16 23 21 20 4 14 -30.0 90b-2 4 2 3 3 1 3 0.0 77b-0 17 25 23 22 4 15 -30.8 90b-3 1 1 2 1 1 2 50.0 77b-1 26 24 21 24 3 25 5.6 91a-0 23 26 29 26 3 22 -15.4 77b-2 16 20 20 19 2 16 -14.3 91a-1 24 28 27 26 2 16 -39.2 77b-3 8 13 11 11 3 7 -34.4 91a-2 28 36 33 32 4 21 -35.1 79a-0 3 5 5 4 1 1 -76.9 91a-3 24 21 19 21 3 15 -29.7 79a-1 1 1 1 1 0 1 0.0 91b-0 27 25 23 25 2 20 -20.0 79a-2 6 5 5 5 1 2 -62.5 91b-2 28 29 27 28 1 21 -25.0 79a-3 8 7 7 7 1 8 9.1 91b-3 23 19 18 20 3 16 -20.0 79b-0 8 14 13 12 3 5 -57.1 92a-0 27 26 30 28 2 18 -34.9 79b-1 16 16 15 16 1 14 -10.6 92a-2 11 10 9 10 1 7 -30.0 79b-2 0 0 0 0 0 0 0.0 92a-3 18 17 16 17 1 13 -23.5 79b-3 4 4 4 4 0 3 -25.0 92b-0 21 21 21 21 0 25 19.0 7a-0 10 14 16 13 3 6 -55.0 92b-1 15 12 12 13 2 7 -46.2 7a-1 12 14 14 13 1 6 -55.0 92b-2 16 14 14 15 1 15 2.3 7a-3 2 0 2 1 1 0 -100.0 92b-3 14 14 14 14 0 9 -35.7 7b-0 3 2 2 2 1 1 -57.1 98a-1 21 39 32 31 9 22 -28.3 7b-1 2 2 2 2 0 0 -100.0 98a-2 15 33 26 25 9 15 -39.2 7b-2 5 4 6 5 1 3 -40.0 98a-3 31 45 42 39 7 31 -21.2 7b-3 6 9 10 8 2 3 -64.0 98b-0 38 51 49 46 7 34 -26.1 80a-0 18 20 18 19 1 21 12.5 98b-1 26 27 27 27 1 25 -6.3 80a-1 24 25 22 24 2 25 5.6 98b-2 17 22 20 20 3 15 -23.7 80a-2 21 31 28 27 5 21 -21.3 99a-0 33 30 29 31 2 24 -21.7 80a-3 11 16 14 14 3 9 -34.1 99a-2 30 30 30 30 0 21 -30.0 80b-0 23 28 25 25 3 15 -40.8 99a-3 27 34 32 31 4 30 -3.2 80b-1 31 42 38 37 6 30 -18.9 99b-0 30 34 32 32 2 27 -15.6 82a-1 17 16 17 17 1 13 -22.0 99b-1 17 15 14 15 2 9 -41.3 99b-2 24 27 25 25 2 24 -5.3 127b-3 10 11 11 11 1 6 -43.8 14a-0 15 25 24 21 6 13 -39.1 128a-0 22 28 30 27 4 15 -43.8 14a-1 19 21 25 22 3 12 -44.6 128a-1 20 25 25 23 3 17 
-27.1 14a-2 24 27 28 26 2 17 -35.4 128a-2 22 25 28 25 3 21 -16.0 14a-3r 14 20 22 19 4 19 1.8 128a-3 15 17 17 16 1 17 4.1 14b-0 28 42 45 38 9 21 -45.2 128b-0 28 41 38 36 7 26 -27.1 76b-1 36 27 31 31 5 25 -20.2 90a-2 23 25 23 24 1 15 -36.6 30b-2 21 25 23 23 2 19 -17.4 84a-3 13 14 12 13 1 10 -23.1 128b-2 30 36 34 33 3 23 -31.0 128b-1 19 22 22 21 2 20 -4.8 102a-3 16 17 19 17 2 14 -19.2 119b-1 9 9 8 9 1 8 -7.7 102b-0 10 10 10 10 0 8 -20.0 119b-2 8 8 8 8 0 5 -37.5


Table 4-4. Continued. Observer Observer Image Name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) Image Name 1 2 3 Average Standard deviation Fruit counting algorithm Error(%) 102b-1 10 10 10 10 0 8 -20.0 119b-3 18 21 19 19 2 14 -27.6 102b-2 19 18 18 18 1 17 -7.3 126a-0 10 8 8 9 1 7 -19.2 102b-3r 5 5 5 5 0 4 -20.0 126a-1 6 8 7 7 1 2 -71.4 104a-0 21 16 16 18 3 15 -15.1 126a-2 8 8 8 8 0 6 -25.0 104a-1 6 6 7 6 1 4 -36.8 126b-0 10 11 10 10 1 8 -22.6 104a-2 12 14 13 13 1 6 -53.8 126b-1 5 4 4 4 1 3 -30.8 104a-3r 13 12 13 13 1 8 -36.8 126b-2 9 7 7 8 1 3 -60.9 104b-0 10 10 10 10 0 13 30.0 126b-3 12 12 11 12 1 9 -22.9 104b-1 17 13 11 14 3 15 9.8 127a-0 23 29 26 26 3 16 -38.5 104b-3 9 12 14 12 3 14 20.0 127a-1 19 19 21 20 1 14 -28.8 117a-0 40 45 43 43 3 32 -25.0 127a-3 17 28 29 25 7 13 -47.3 117a-1 29 30 31 30 1 22 -26.7 127b-0 16 18 19 18 2 9 -49.1 117a-2 35 39 41 38 3 23 -40.0 127b-1 26 36 34 32 5 22 -31.3 117a-3 20 28 26 25 4 21 -14.9 127b-2 10 11 11 11 1 8 -25.0 102a-0 20 17 19 19 2 13 -30.4 119a-3 15 23 23 20 5 15 -26.2 102a-2 22 21 21 21 1 14 -34.4 119b-0 9 13 13 12 2 5 -57.1 128b-3r 15 21 21 19 3 12 -36.8

Figure 4-11. Regression analysis between the number of fruits counted by human observation and the number of fruits counted by the fruit counting algorithm (R^2 = 0.79).
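The Error (%) column of Table 4-4 compares the algorithm count against the mean of the three observers. For example, for image 100a-0 the observers counted 16, 22, and 23 fruits and the algorithm found 16; for 100a-1 the counts were 25, 39, and 39 against 25. A quick check in Python:

```python
def percent_error(algorithm_count, observer_counts):
    """Signed percentage error of the algorithm against the observer mean."""
    mean = sum(observer_counts) / len(observer_counts)
    return 100.0 * (algorithm_count - mean) / mean
```

These reproduce the -21.3% and -27.2% entries of Table 4-4.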


4.3 Execution Time for the Algorithm

Table 4-5 shows the average execution time over ten runs for each image processing step. Conversion from RGB to HSI color space took the largest share of the execution time (65.9%), since the algorithm had to convert all 640 x 480 pixels of an image. Binarization was carried out in software by checking the hue, saturation, and luminance gray levels of each pixel against the threshold and classifying accordingly.

Table 4-5. Execution time for each image processing step.

Image processing step                Avg. execution time (ms)   Percent of total time (%)
Conversion from RGB to HSI           78.8                       65.9
Binarization                         28.4                       23.8
Remove noise                         6.3                        5.3
Fill gaps                            3.8                        3.2
Extract features and count fruits    2.1                        1.8
Total time                           119.5                      100.0

The execution time for binarization could be reduced considerably if it were carried out in hardware. The initialization time was also measured and was 80 ms. The average execution time including all steps was 119.5 ms. During real-time field testing, image acquisition time must be added.

4.4 Encoder Calibration

The encoder was calibrated in the grove before field testing of the algorithm. The truck was driven over predefined distances three times, and the average number of pulses generated by the encoder was used as the reference pulse count for each distance. Data for the encoder calibration are shown in Table 4-6. A regression analysis was conducted over the pulses generated by the encoder for the different trials (Figure 4-12). The R^2 value for the regression analysis was 0.99.


D = 0.00804 Np - 0.02

where D = distance and Np = number of pulses.

Table 4-6. Encoder calibration. The number of pulses is shown for each trial.

Distance (ft)   Trial #1   Trial #2   Trial #3   Average
4               5060       4865       5316       5080.3
6               7074       7420       7764       7419.3
8               9461       10068      10165      9898
10              12821      12345      12460      12542

Figure 4-12. Regression analysis for encoder calibration.

The linear equation from the regression analysis was used to measure the distance traveled from a given location by checking the number of pulses from the encoder at regular intervals. Once the pulse count reached the calculated value, it could be concluded that the truck had traveled the required distance from the previous imaging location.
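As a sanity check on the calibration, an ordinary least-squares fit of the averaged pulse counts in Table 4-6 can be computed directly; note that the slope implied by the Table 4-6 averages comes out near 0.0008 ft per pulse. This is a sketch of the fit, not the thesis's own computation.

```python
# Average pulse counts vs. distance, from Table 4-6
pulses = [5080.3, 7419.3, 9898.0, 12542.0]
dist_ft = [4.0, 6.0, 8.0, 10.0]

# Ordinary least-squares fit of distance against pulse count
n = len(pulses)
mean_x = sum(pulses) / n
mean_y = sum(dist_ft) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(pulses, dist_ft))
         / sum((x - mean_x) ** 2 for x in pulses))
intercept = mean_y - slope * mean_x

def distance_ft(num_pulses):
    """Distance traveled (ft) predicted from an encoder pulse count."""
    return slope * num_pulses + intercept
```

The fitted line recovers the calibration distances to within a small fraction of a foot.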


4.5 Prediction of Number of Fruits per Plot

In the grove where the citrus yield mapping system was tested, two trees in each plot were designated for hand harvesting. These trees were hand-harvested on February 6, 2004, and the number of fruits per plot (NA), average fruit weight per plot (AvgWt), minimum fruit diameter per plot (MinD), maximum fruit diameter per plot (MaxD), average fruit diameter per plot (AvgD), average boxes per tree per plot (AvgBT), and number of boxes per plot (NBP) were recorded (Table 4-7). This information was used to develop yield prediction models.

Table 4-7. Actual harvested yield data.

Plot   NBP   AvgBT   AvgD (mm)   MaxD (mm)   MinD (mm)   AvgWt (g)   NA (fruits/plot)
5      6.8   3.4     67.0        74.7        60.9        163.7       847.8
8      1.9   1.0     72.9        79.3        67.6        211.4       183.5
14     7.6   3.8     67.6        74.8        61.8        165.4       938.1
15     5.0   2.5     70.8        77.5        65.2        191.2       533.9
16     6.8   3.4     66.7        74.4        60.7        159.8       868.4
26     5.0   2.5     73.1        81.4        67.0        208.0       490.6
27     7.2   3.6     69.5        79.2        62.1        179.1       820.5
28     7.6   3.8     72.6        80.4        66.4        205.7       754.0
29     2.8   1.4     74.7        80.5        69.8        223.2       256.0
30     5.4   2.7     72.9        80.0        67.3        206.5       533.9
31     6.3   3.2     67.4        74.5        61.7        169.9       756.8
33     1.0   0.5     73.9        82.5        67.5        222.5       91.7
35     5.9   3.0     70.4        78.8        64.3        189.5       635.6
36     5.9   3.0     71.6        78.2        66.3        197.7       609.3
41     9.4   4.7     68.3        77.0        61.9        170.7       1124.3
42     7.6   3.8     74.1        81.2        68.3        219.5       706.7
43     6.3   3.2     67.5        75.3        61.9        167.5       767.6
53     6.3   3.2     75.3        82.2        69.4        233.2       551.5
54     8.1   4.1     73.1        81.7        66.2        209.6       789.0
56     7.6   3.8     74.3        82.5        68.2        214.7       722.5
61     1.5   0.8     76.1        82.2        71.0        238.8       128.2
63     5.4   2.7     69.0        77.3        62.3        177.5       621.0
64     6.3   3.2     70.6        78.9        64.1        187.3       686.7
73     7.2   3.6     68.0        77.4        61.5        172.7       851.2
74     8.1   4.1     74.5        81.3        68.9        227.4       727.0
76     8.1   4.1     69.0        76.1        63.2        179.2       922.7
77     5.9   3.0     71.4        77.8        66.2        196.4       613.3
79     5.4   2.7     75.7        82.4        70.2        235.1       468.9


Table 4-7. Continued.

Plot   NBP   AvgBT   AvgD (mm)   MaxD (mm)   MinD (mm)   AvgWt (g)   NA (fruits/plot)
119    6.3   3.2     66.1        73.8        60.0        157.0       819.1
126    4.6   2.3     69.4        76.9        63.5        182.6       514.3
127    7.6   3.8     71.0        79.3        64.6        193.5       801.5
128    7.2   3.6     70.5        79.4        63.6        195.1       753.2
80     9.4   4.7     66.0        76.1        58.9        154.9       1238.9
82     7.2   3.6     67.8        76.8        60.7        166.3       883.6
83     7.6   3.8     70.5        78.4        64.4        188.7       822.3
84     8.1   4.1     68.5        78.7        61.8        160.4       1030.6
90     5.9   3.0     70.4        77.2        64.9        188.8       638.0
91     9.0   4.5     70.3        77.0        64.8        191.5       959.2
92     6.3   3.2     74.8        81.7        69.1        233.8       550.0
98     6.8   3.4     71.0        77.5        65.8        190.1       730.2
99     6.3   3.2     69.7        76.8        64.0        182.8       703.5
100    4.1   2.1     77.1        82.7        72.3        245.2       341.3
101    7.6   3.8     72.1        79.3        66.3        198.4       782.1
102    6.8   3.4     68.6        76.9        62.6        172.0       807.1
104    6.8   3.4     75.2        81.9        69.9        227.4       610.3
117    9.0   4.5     68.1        76.5        62.4        170.6       1076.9
118    9.4   4.7     66.5        75.0        60.3        156.9       1222.5

There were cases in which images of an entire plot were not taken. For example, images were taken only on the west side of plots 82 and 83 because moisture sensors (tensiometers) were installed on the other side of those plots. On plot 43, due to an operator mistake, images were taken for only one tree on one side, while both trees were imaged on the other side. Hence, plots 82, 83, and 43 were removed from the data analysis. Regression analysis was carried out between NA and the number of fruits per plot predicted by the fruit counting algorithm. Plot 7 was found to be an outlier and was subsequently removed from further data analysis. Fruits per plot were predicted with three models using the following three variables: NP_fruits, NP_pixels, and NP_fruits-pixels. Fruits-per-plot data for the remaining 44 plots were sorted in ascending order based on these three variables. For each model, alternate plots were chosen and combined into two


groups; one was used as the calibration data set and the other as the validation data set (Table 4-8). Sorting before splitting ensured that the data were evenly distributed throughout the entire range.

Table 4-8. Number of plots in calibration and validation data sets used to develop prediction models.

         Calibration   Validation   Total
Plots    22            22           44

Regression analysis was conducted between NA and the variables NP_fruits, NP_pixels, and NP_fruits-pixels for the calibration data set, and the results are shown in Figures 4-13, 4-14, and 4-15. A second-degree polynomial was fit (using Excel) between NA and the number of fruits per plot counted by the fruit counting algorithm. The R^2 value for the regression analysis was 0.47:

NP_fruits = -0.0339(MV_fruits)^2 + 17.112 MV_fruits

where MV_fruits = number of fruits/plot counted by the fruit counting algorithm.

A second-degree polynomial was also fit between NA and the number of citrus pixels per plot. The R^2 value for the regression analysis was 0.32:

NP_pixels = 0.00007(MV_pixels)^2 + 0.051 MV_pixels - 79.69

where MV_pixels = number of citrus pixels/plot counted by the fruit counting algorithm.


Figure 4-13. Regression analysis between NA and NP_fruits (fruits/plot estimated by number of fruits using the fruit counting algorithm vs. actual harvested fruits/plot; R^2 = 0.47).

Figure 4-14. Regression analysis between NA and NP_pixels (fruits/plot estimated by number of citrus pixels using the fruit counting algorithm vs. actual harvested fruits/plot; R^2 = 0.32).


A second-degree polynomial was fit between the actual number of fruits per plot and the number predicted by the fruit counting algorithm from the citrus pixels per plot. The R^2 value for the regression analysis was 0.46:

NP_fruits-pixels = -0.182(MV_fruits-pixels)^2 + 39.521 MV_fruits-pixels - 369.62

where MV_fruits-pixels = number of fruits estimated using citrus pixels/plot counted by the fruit counting algorithm.

Figure 4-15. Regression analysis between NA and NP_fruits-pixels (fruits/plot estimated based on citrus pixels/plot vs. actual harvested fruits/plot; R^2 = 0.46).

4.6 Yield Prediction Model

A yield prediction model was developed using the NP_fruits-pixels variable. This variable was preferred because it used the average size of the fruit in the estimation of the number of
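The chosen model can be evaluated directly in Python; MV_fruits-pixels is the per-plot fruit estimate from citrus pixels produced by the machine vision algorithm, and the coefficients are those printed above.

```python
def np_fruits_pixels(mv):
    """Predicted fruits/plot from the pixel-based fruit estimate MV (Section 4.5)."""
    return -0.182 * mv ** 2 + 39.521 * mv - 369.62
```

For example, an estimate of MV = 100 fruits maps to roughly 1762 fruits/plot.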


fruits in a plot. Its R^2 value was also among the highest of the three approaches. The model was applied to the 22 plots in the validation data set to estimate the number of fruits per plot; the number of fruits per plot estimated from citrus pixels per plot by the machine vision algorithm was used as the model input. The yield predicted for the 22 validation plots using the NP_fruits-pixels model is tabulated in Table 4-9. The percentage error ranged from 0.1% for plot 101 to 214.8% for plot 33. The main cause of the high error rate was that a single camera could not cover the entire citrus tree: fruits inside the canopy were completely occluded by leaves in the images, so the fruit counting algorithm could not identify them. Yield estimation therefore depends on the imaged portion of a particular tree. If a large share of the fruits on a tree was not captured in the image, the model predicted much less yield than was actually harvested; conversely, if a low-yielding tree had its fruits concentrated in the imaged region, the model predicted more yield than was actually harvested. Since fruits were distributed throughout the tree canopy in irregular patterns, yield estimation from a portion of a tree was not very successful. The smallest number of images needed to estimate yield in a grove depends on the amount of variability present in the grove: if the variability is high, imaging many trees is the best approach to predict yield accurately, whereas if yield is relatively uniform, imaging a subset of trees is sufficient to predict the yield of the grove accurately.


Table 4-9. Performance comparison of the yield prediction model for 22 plots.

Plot   Actual yield (fruits/m^2)   Predicted yield (fruits/m^2)   Error (%)
33     4.9                         15.5                           214.8
79     25.2                        18.9                           -25.2
90     34.3                        22.6                           -34.1
63     33.4                        26.7                           -20.0
36     32.7                        28.1                           -14.3
27     44.1                        33.3                           -24.5
28     40.5                        34.6                           -14.7
26     26.4                        36.7                           39.3
99     37.8                        37.4                           -1.1
53     29.6                        37.9                           27.9
127    43.1                        38.7                           -10.2
41     60.4                        39.6                           -34.5
30     28.7                        40.9                           42.5
101    42.0                        42.1                           0.1
56     38.8                        42.6                           9.7
98     39.2                        43.8                           11.6
35     34.2                        43.9                           28.5
119    44.0                        44.3                           0.6
80     66.6                        45.7                           -31.4
91     51.6                        47.5                           -7.9
128    40.5                        47.5                           17.4
117    57.9                        47.4                           -18.1

Before the experiment, it was expected that mounting the camera 5.2 m high and aiming it at 45 degrees with respect to the ground would cover the majority of the tree canopy. During field testing, however, the image resolution with this setup was found to be inadequate. Hence, to obtain clear, high-quality images, the camera lens was zoomed in by a factor of two, covering only a small percentage of the tree canopy. If multiple cameras were used to cover the majority of the tree canopy, the model could predict yield with improved accuracy. A regression analysis was conducted between the yield estimated by the yield prediction model and the actual yield for the 22 plots (Figure 4-16). The R^2 value for the regression analysis was 0.46, the RMSE was 45.1 fruits/m^2, and the CV was 70.42%.


Figure 4-16. Regression analysis between the yield prediction model (YE) and the actual harvested yield (YA) (R^2 = 0.46).

ArcView software from the Environmental Systems Research Institute (ESRI) of Redlands, CA, was used to create the yield maps. A Digital Orthographic Quarter-Quad (DOQQ) 1-meter resolution photograph was overlaid beneath the yield maps (Figures 4-17, 4-18, and 4-19). For qualitative analysis, the yield data were arbitrarily classified into three classes based on the yield distribution (Table 4-10). The actual and predicted yields for the 22 validation plots, classified into the three classes, are shown in Figure 4-17. Of the 22 validation plots, seven were assigned to a different yield category by the prediction model than by the actual yield.

Table 4-10. Yield categories for 22 plots.

Yield (fruits/m^2)   Yield category
0-20                 Low
20.1-40              Medium
40.1-70              High
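The classification of Table 4-10 can be written as a small helper for map labeling; the class bounds are as printed, and the function name is illustrative.

```python
def yield_category(fruits_per_m2):
    """Map a yield value (fruits/m^2) to the Low/Medium/High classes of Table 4-10."""
    if fruits_per_m2 <= 20:
        return "Low"
    if fruits_per_m2 <= 40:
        return "Medium"
    return "High"
```

For example, plot 33's predicted 15.5 fruits/m^2 falls in the Low class while its seven-fold actual counterpart for plot 41 (60.4 fruits/m^2) is High, illustrating the category mismatches counted above.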


Based on the current processing time of 119 ms for an image covering 1.67 m of ground width, the maximum traveling speed was determined to be 14.03 m/s (31.38 miles/hour). This would be the upper limit if the current system were operated in a moving mode in which images were continuously acquired as the truck was driven between the tree rows of the citrus grove. The actual yield (YA) for the 22 calibration plots and the predicted yield (YE) for the 22 validation plots are shown in Figure 4-18. Yield calculated from the images is shown in Figure 4-19.
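The upper-bound travel speed follows directly from the image footprint and the per-image processing time:

```python
image_width_m = 1.67       # ground width covered by one image
processing_time_s = 0.119  # total processing time per image (Table 4-5)

max_speed_ms = image_width_m / processing_time_s   # meters per second
max_speed_mph = max_speed_ms * 3600 / 1609.344     # converted to miles per hour
```

This reproduces the 14.03 m/s (about 31.4 mph) limit quoted above.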


Figure 4-17. Performance of yield prediction model (fruits/m^2).


Figure 4-18. Yield mapping for citrus fruits (fruits/m^2).


Figure 4-19. Yield based on number of citrus fruits in an image (fruits/m^2).


4.7 Yield Variability Based on Rootstock Variety

Fruits per plot grouped by rootstock variety were analyzed with the SAS ANOVA procedure. The results showed that yields from Cleopatra mandarin and Swingle citrumelo were grouped together and were significantly higher than those from the Carrizo citrange variety. The mean fruits/plot for the three rootstock varieties are shown in Table 4-11 and Figure 4-20.

Table 4-11. Fruits/plot for 48 plots grouped using a means test.

Tukey grouping   Mean number of fruits/plot   Number of plots   Citrus rootstock
A                1608.5                       16                Cleopatra mandarin
A                1649.8                       16                Swingle citrumelo
B                1036.5                       16                Carrizo citrange

Figure 4-20. Yield variability based on rootstock variety (mean number of fruits/plot per rootstock variety).


CHAPTER 5
SUMMARY AND CONCLUSIONS

This work presented a machine vision algorithm to identify and count the number of citrus fruits in an image and, finally, to estimate the yield of citrus fruits on a tree. Within this chapter, a summary of the work and the conclusions drawn from it are presented, and improvements and modifications to the existing experiment are suggested.

5.1 Summary and Conclusions

The fruit counting algorithm developed in this research verified the feasibility of developing a real-time machine vision system to estimate citrus yield on-the-go. The algorithm consisted of image acquisition, binarization of the color image in the hue-saturation color plane, preprocessing to remove noise and to fill gaps, and, finally, counting the number of fruits. Pixel values for three classes (citrus, leaf, and background) were manually taken from 25 calibration images and used for binarizing the color image in the hue-saturation color plane to separate the citrus fruits from the background. It was found that the fruit, leaf, and background portions of the citrus grove scene were clearly classified using thresholds in the hue-saturation color plane. A threshold on fruit area was applied over the binarized image to remove noise, and a combination of dilation and erosion was used to enhance the image for proper identification of the citrus fruits. Blob analysis was used to identify the citrus fruits, and the total number of blobs gave the number of fruits in an image. Clusters of fruits were partially identified using the average area of a single fruit; a detected cluster was counted as two fruits instead of one. The total time for processing an image was 119.7 ms, excluding image


acquisition time. The algorithm was tested on 329 validation images, and the R2 value between the number of fruits counted by the machine vision algorithm and the average number of fruits counted by human observers was 0.79. The variation in the number of fruits correctly classified was partially due to clusters of citrus fruits, uneven lighting, and occlusion.

Images belonging to the same plot were grouped together, and the data from 22 plots were used to predict fruits/plot with the following three variables: 1) number of fruits estimated using the fruit counting algorithm (NPfruits), 2) number of citrus pixels/plot estimated using the fruit counting algorithm (NPpixels), and 3) number of fruits/plot estimated using the citrus pixels/plot data (NPfruits-pixels). The yield prediction model was developed using the NPfruits-pixels variable. The model was applied over 22 plots, and the R2 value between the yield predicted by the model and the actual harvested yield was 0.46. The main cause of the low R2 was that a single camera could not cover the entire citrus tree. Further, fruits inside the canopy would have been completely occluded by leaves in the images, so the fruit counting algorithm was not able to identify these occluded fruits. The results indicate that the yield prediction model could be enhanced by using multiple cameras to cover the majority of the tree canopy.

5.2 Future Work

Highly non-uniform illumination in an image presented a problem for the color vision based segmentation approach. One improvement to the present system would be to improve the imaging of natural outdoor scenes with wide variation in illumination. Automatic brightness control before imaging could be implemented by using a phototransistor to measure the intensity of the imaging scene and sending a control signal to the


camera to change its shutter speed/brightness level. A study should be conducted to determine whether nighttime imaging with the machine vision system using artificial lighting improves the image acquisition module.

Two other areas for future work deal with the problems of clustered fruits and of fruits that are partially occluded from view. Statistical estimates should be developed to account for occluded fruits and fruit clusters in order to classify all the fruits in an image correctly. Another improvement would be to use multiple cameras to capture the entire canopy of the citrus trees. This would increase the correlation between the number of fruits on a tree and the number of fruits estimated by the yield prediction model. An ultrasonic/laser sensor could be used to measure the distance between the camera and the imaging scene in order to estimate the size of the fruits.
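Such a range measurement would enable fruit sizing through the standard pinhole-camera relation: real size is approximately pixel extent times range divided by focal length (in pixel units). A hedged sketch, with the function name and the example numbers being illustrative assumptions rather than anything from the thesis:

```python
def fruit_diameter_m(pixel_diameter: float, distance_m: float,
                     focal_length_px: float) -> float:
    """Pinhole-camera estimate: real size ~= pixel extent * range / focal
    length, with the lens focal length expressed in pixel units."""
    return pixel_diameter * distance_m / focal_length_px

# e.g. a fruit spanning 40 px, ranged at 2.0 m, with an 800 px focal length
d = fruit_diameter_m(40, 2.0, 800)   # 0.1 m
```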


LIST OF REFERENCES

Brown, G. K. 2002. Mechanical harvesting systems for the Florida citrus juice industry. ASAE Paper No. 021108. St. Joseph, MI.: ASAE.

Campbell, R. H., S. L. Rawlins, and S. Han. 1994. Monitoring methods for potato yield mapping. ASAE Paper No. 94-1584. St. Joseph, MI.: ASAE.

Casasent, D., A. Talukder, W. Cox, H. Chang, and D. Weber. 1996. Detection and segmentation of multiple touching product inspection items. Proceedings of the Society of Photo-Optical Instrumentation Engineers (SPIE) 2907: 205-215.

Coppock, G. E., and J. R. Donhaiser. 1981. Conical scan air shaker for removing citrus fruit. Transactions of the ASAE 24(6): 1456-1458.

Florida Agricultural Statistics Service. 2001. Citrus Summary 2000-01. Florida Department of Agriculture and Consumer Services, Tallahassee, FL.

Florida Agricultural Statistics Service. 2002. Commercial Citrus Inventory 2002. Florida Department of Agriculture and Consumer Services, Tallahassee, FL.

Gonzalez, R. C., and R. E. Woods. 1992. Digital Image Processing. Reading, MA: Addison-Wesley Publishing Company.

Harrell, R. C., P. D. Adsit, T. A. Pool, and R. Hoffman. 1990. The Florida robotic grove-lab. Transactions of the ASAE 33: 391-399.

Hodges, A., E. Philippakos, D. Mulkey, T. Spreen, and R. Muraro. 2001. Economic impact of Florida's citrus industry, 1999-2000. Economic Information Report 01-2.

Juste, F., C. Gracia, E. Molto, R. Ibanez, and S. Castillo. 1988. Fruit bearing zones and physical properties of citrus for mechanical harvesting. Proceedings of the International Society of Citriculture 4: 1801-1809.

Lee, W. S., T. F. Burks, and J. K. Schueller. 2002. Silage yield monitoring system. ASAE Paper No. 021165. St. Joseph, MI.: ASAE.

Nassau, K. 1980. The causes of color. Scientific American 243(4): 124-153.

Parrish, E., and A. K. Goksel. 1977. Pictorial pattern recognition applied to fruit harvesting. Transactions of the ASAE 20: 822-827.


Pelletier, G., and S. K. Upadhyaya. 1999. Development of a tomato load/yield monitor. Computers and Electronics in Agriculture 23(2): 103-118.

Pollack, S. L., B. Lin, and J. Allshouse. 2003. Characteristics of U.S. orange consumption. United States Department of Agriculture, FTS 305-01.

Roades, J. P., A. D. Beck, and S. W. Searcy. 2000. Cotton yield mapping: Texas experience in 1999. In Proceedings of the Beltwide Cotton Conference, 404-407. San Antonio, TX. Jan 4-8, 2000. Memphis, TN: National Cotton Council of America.

Salehi, F., J. D. Whitney, W. M. Miller, T. A. Wheaton, and G. Drouillard. 2000. An automatic triggering system for a citrus yield monitor. ASAE Paper No. 001130. St. Joseph, MI.: ASAE.

Schertz, C. E., and G. K. Brown. 1966. Determining fruit-bearing zones in citrus. Transactions of the ASAE 9: 366-368.

Schertz, C. E., and G. K. Brown. 1968. Basic considerations in mechanizing citrus harvest. Transactions of the ASAE 11(2): 343-348.

Schueller, J. K., and Y. H. Bae. 1987. Spatially attributed automatic combine data acquisition. Computers and Electronics in Agriculture 2: 119-127.

Schueller, J. K., J. D. Whitney, T. A. Wheaton, W. M. Miller, and A. E. Turner. 1999. Low-cost automatic yield mapping in hand-harvested citrus. Computers and Electronics in Agriculture 23(2): 145-154.

Searcy, S. W., J. K. Schueller, Y. H. Bae, S. C. Borgelt, and B. A. Stout. 1989. Mapping of spatially-variable yield during grain combining. Transactions of the ASAE 32(3): 826-829.

Shatadal, P., D. S. Jayas, and N. R. Bulley. 1995. Digital image analysis for software separation and classification of touching grains: I. Disconnect algorithm. Transactions of the ASAE 38(2): 635-643.

Slaughter, D. C., and R. Harrell. 1987. Color vision in robotic fruit harvesting. Transactions of the ASAE 30(4): 1144-1148.

Slaughter, D. C., and R. Harrell. 1989. Discriminating fruit for robotic harvest using color in natural outdoor scenes. Transactions of the ASAE 32(2): 757-763.

Whitney, J. D., and H. R. Sumner. 1977. Mechanical removal of fruit from citrus trees. Proceedings of the International Society of Citriculture 2: 407-412.

Whitney, J. D., T. A. Wheaton, W. M. Miller, and M. Salyani. 1998. Site-specific yield mapping for Florida citrus. Proceedings of the Florida State Horticultural Society 111: 148.


Whitney, J. D., Q. Ling, T. A. Wheaton, and W. M. Miller. 1999. A DGPS yield monitoring system for Florida citrus. Applied Engineering in Agriculture 17(2): 115.

Whittaker, A. D., G. E. Miles, O. R. Mitchell, and L. D. Gaultney. 1987. Fruit location in a partially occluded image. Transactions of the ASAE 30(3): 591-597.

Wilkerson, J. B., J. S. Kirby, W. E. Hart, and A. R. Womac. 1994. Real-time cotton flow sensor. ASAE Paper No. 94-1054. St. Joseph, MI.: ASAE.


BIOGRAPHICAL SKETCH

The author was born in 1980 in Chennai, India. He graduated with a Bachelor of Science degree in electronics and communication engineering in May 2001 from the Government College of Technology, Coimbatore, India. He then obtained a Master of Science degree in electrical and computer engineering in December 2003 from the University of Florida.


Permanent Link: http://ufdc.ufl.edu/UFE0006901/00001

Material Information

Title: Citrus Yield Mapping System Using Machine Vision
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0006901:00001















TABLE OF CONTENTS

ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION

2 LITERATURE REVIEW

   2.1 Citrus Production in Florida
   2.2 Citrus Harvesting
   2.3 Precision Agriculture and Yield Mapping
   2.4 Citrus Yield Mapping
   2.5 Image Segmentation

3 MATERIALS AND METHODS

   3.1 Overview of Experiment
   3.2 System Hardware
      3.2.1 Color Vision Hardware
      3.2.2 DGPS Receiver and Encoder
   3.3 Image Acquisition
   3.4 Image Analysis using HSI Color Model
   3.5 Development of the Fruit Counting Algorithm
   3.6 Image Processing Time
   3.7 Experimental Procedure
   3.7 Prediction of number of fruits/plot
   3.8 Yield Prediction Model
   3.9 Performance of the Fruit Counting Algorithm
   3.10 Performance of the Yield Prediction Model
   3.11 Yield Variability based on Rootstock Variety

4 RESULTS AND DISCUSSION

   4.1 Binarization
   4.2 Preprocessing
   4.3 Execution time for the Algorithm
   4.4 Encoder Calibration
   4.5 Prediction of Number of Fruits/Plot
   4.6 Yield Prediction Model
   4.7 Yield Variability based on Rootstock Variety

5 SUMMARY AND CONCLUSIONS

   5.1 Summary and Conclusions
   5.2 Future Work

LIST OF REFERENCES

BIOGRAPHICAL SKETCH


















LIST OF TABLES

Table

4-1. Number of images used in calibration and validation data sets.

4-2. Pixel distribution for citrus, leaf and background classes for 25 images in HSI color plane.

4-3. Threshold for identification of fruit clusters.

4-4. Performance comparison for 329 validation images.

4-5. Execution time for each image processing step.

4-6. Encoder calibration. The number of pulses are shown for different trials.

4-7. Actual harvested yield data.

4-8. Number of plots in calibration and validation data sets to develop prediction models.

4-9. Performance comparison of the yield prediction model for 22 plots.

4-10. Yield category for 22 plots.

4-11. Fruits/plot for 48 plots grouped using means test.

















LIST OF FIGURES

Figure

3-1. Mapping of trees in the citrus grove. Two trees designated for hand harvesting are shown inside the red rectangular box for each plot.

3-2. Experimental setup.

3-3. Components of the imaging board.

3-4. Encoder attached to the wheel of the metal frame.

3-5. Image processing steps of the fruit counting algorithm.

3-6. Schematic diagram of the overall citrus yield mapping system.

3-7. Algorithm for field-testing of the machine vision system.

4-1. Histogram of 25 calibration images in hue plane (* indicates 3898 pixels of a pixel value of 0 in citrus class, + indicates 5606 pixels of a pixel value of 0 in background class).

4-2. Histogram of 25 calibration images in saturation plane (* indicates 3872 pixels of a pixel value of 0 in citrus class, + indicates 5604 pixels of a pixel value of 0 in background class, # indicates 2736 pixels of a pixel value of 254 in citrus class).

4-3. Histogram of 25 calibration images in luminance plane (* indicates 3356 pixels of a pixel value of 252 in citrus class).

4-4. Histogram of 25 calibration images in red plane.

4-5. Histogram of 25 calibration images in green plane (* indicates 3330 pixels of a pixel value of 252 in citrus class).

4-6. Histogram of 25 calibration images in blue plane (* indicates 3319 pixels of a pixel value of 252 in citrus class).

4-7. Pixel distribution in 25 calibration images in hue-saturation plane.

4-8. Pixel distribution in 25 calibration images in hue-luminance plane.

4-9. Pixel distribution in 25 calibration images in red-green plane.

4-10. Image processing steps of a typical citrus grove scene.

4-11. Regression analysis between the number of fruits counted by human observation and the number of fruits counted by the fruit counting algorithm.

4-12. Regression analysis for encoder calibration.

4-13. Regression analysis between NA and NPfruits.

4-14. Regression analysis between NA and NPpixels.

4-15. Regression analysis between NA and NPfruits-pixels.

4-16. Regression analysis between yield prediction model and the actual harvested yield.

4-17. Performance of yield prediction model (fruits/m2).

4-18. Yield mapping for citrus fruits (fruits/m2).

4-19. Yield based on number of citrus fruits in an image (fruits/m2).

4-20. Yield variability based on rootstock variety.
















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

CITRUS YIELD MAPPING SYSTEM USING MACHINE VISION

By

Palaniappan Annamalai

August 2004

Chair: Won Suk "Daniel" Lee
Major Department: Agricultural and Biological Engineering

A machine vision system utilizing color vision was investigated as a means to

identify citrus fruits and to estimate yield information of the citrus grove in real-time. The

yield mapping system was calibrated and tested in a commercial citrus grove. Results

were compared for the yield estimated through the system and that carried out by hand

harvesting.

This study focused on three major issues:

1. Development of a hardware system consisting of a color CCD camera, an imaging
board, an encoder and a DGPS receiver;
2. Development of an algorithm to take non-overlapping images of the citrus grove
and an image-processing algorithm to identify and count the number of citrus fruits
from an image;
3. Development of a yield estimation model that will predict the number of fruits per
tree based on the images of the tree.

Images were acquired for 98 citrus trees in a grove located near Orlando, Florida.

The trees were distributed over 48 plots evenly. Images were taken in stationary mode

using a machine vision system consisting of a color analog camera, a DGPS receiver, and

an encoder. Non-overlapping images of the citrus trees were taken by determining the









field of view of the camera and using an encoder to measure the traveled distance to

locate the next position for acquiring an image. The encoder was calibrated in the grove

before the field-testing of the system.

Images of the citrus trees were analyzed and a histogram and pixel distribution of

various classes (citrus fruit, leaf, and background) were developed in RGB and HSI color

space. The threshold of segmentation of the images to recognize citrus fruits was

estimated from the pixel distribution of the HSI color plane. A computer vision algorithm

to enhance and extract information from the images was developed. Preprocessing steps

for removing noise and properly identifying the number of citrus fruits were carried out

using a threshold and a combination of erosion and dilation. The total time for processing

an image was 119.5 ms, excluding image acquisition time. The image processing

algorithm was tested on 329 validation images and the R2 value between the number of

fruits counted by the fruit counting algorithm and the average number of fruits counted

manually was 0.79.

Images belonging to the same plot were grouped together and the number of fruits

estimated by the fruit counting algorithm was summed up to give the number of

fruits/plot estimates. Leaving out outliers and incomplete data, the remaining 44 plots

were divided into calibration and validation data sets and a model was developed for

citrus yield using the calibration data set. The R2 value between the number of fruits/plot

counted by the yield prediction model and the number of fruits/plot identified by hand

harvesting for the validation data set was 0.46. Although this study was conducted for

citrus fruits, the concept could be easily applied with little modification to estimate the yield

of most fruits that differ in color from the foliage.















CHAPTER 1
INTRODUCTION

Competitive farmers strive to increase crop yields while minimizing costs. With the

advent of mechanization of agriculture and a trend towards larger equipment, farmers

were able to cultivate very large areas, but many continued to treat their larger fields as a

single management unit, thus ignoring variability found within a specific field. Precision

farming, sometimes called site-specific farming, is an emerging technology that allows

farmers to reduce costs through efficient and effective application of crop inputs for

within-field variability in characteristics like soil fertility and weed populations.

One of the major agricultural products in Florida is citrus. Florida's citrus industry

produced about 12.4 million metric tons of citrus in the 2000-2001 season, accounting

for 76 percent of all citrus produced in the U. S. (Florida Agricultural Statistics Service,

2001). Citrus fruits, including oranges, grapefruit, tangelos, tangerines, limes, and other

specialty fruits, are the state's largest agricultural commodities.

Currently citrus groves are managed as blocks and the variability found within this

unit is not generally considered for grove management. Citrus trees with sufficient water

and nutrients grow stronger, better tolerate pests and stresses, yield more consistently,

and produce better quality fruit than trees that receive excessive/deficient irrigation or

fertilization. To a citrus grower who deals with thousands of trees in many blocks, site-

specific management or precision agriculture provides the ability to apply technology and

to manage inputs as closely as required within a given area. This management of field

variability could improve fruit yield, quality, and income and limit negative impacts on









sensitive environments. Among precision technologies, yield mapping is the first step to

develop a site-specific crop management. Yield with other associated field characteristics

would help growers manage their groves better, evaluate the entire citrus grove

graphically and thus prepare them to make efficient decisions.

Currently, a commercial citrus yield mapping system, named the Goat (GeoFocus,

LLC, Gainesville, FL), is the only available citrus yield mapping system. This system is

attached to a "goat" truck used primarily in citrus harvesting operations. In this system,

the goat truck operator is required to push a button to record the location of every tub,

a step that may be forgotten and often becomes a major source of error. Yield information is

available only after the fruits are harvested and this system gathers yield data from

multiple trees rather than from each individual tree.

The overall goal of this research is to develop a real-time yield mapping system

using machine vision and to provide yield of a grove on-the-go when mounted on a truck

and driven in-between the rows. The system will identify citrus fruits from images using

color information in real-time. The system will estimate citrus yield for a single tree

while citrus yields are currently determined based on whole block or grove. More

specifically, the objectives in this research are to:

Develop a hardware system consisting of a color CCD camera, an imaging
board, an encoder and a DGPS receiver,
Develop an algorithm to take non-overlapping images of the citrus grove
and an image-processing algorithm to identify and count the number of
citrus fruits from an image, and
Develop a yield estimation model that will predict the number of fruits per
tree based on the images of the tree.
A main advantage of the proposed system is that it would provide single-tree yield

and could estimate citrus yield before the actual harvesting schedule. Estimated yield

information could then be used for deciding various citrus management practices such as








the amount of irrigation, application of herbicide to the plants and finally for scheduling

grove equipment and pickers.















CHAPTER 2
LITERATURE REVIEW

This chapter begins with a brief introduction about citrus production in Florida,

precision agriculture in general and the importance of yield mapping for site-specific crop

management. A review of common citrus harvesting procedures was included to evaluate

current harvesting trends in citrus industry. Current citrus yield mapping systems were

analyzed to identify the desired features for the design of a better yield mapping system.

Finally, research involving image segmentation and various image processing techniques

were reviewed to form a basis for a color vision system.

2.1 Citrus Production in Florida

Citrus trees are small evergreen trees that are grown in tropical and subtropical

climates. As this perennial crop grows in regions with warmer climates (typically

between 12.78 and 37.78 °C), citrus trees are usually cultivated in regions situated at latitudes

between 40° north and 40° south. Citrus cultivation requires periodic fertilization (3-4

times per year) and irrigation (every 7 to 28 days depending on season and soil type), as

well as pruning (depending on tree planting density). Citrus trees usually start producing

fruits in 3-5 years from planting although economic yields start from the fifth year and

the trees may take 8-10 years to achieve full productivity. Fruits could be harvested 5-6

months after flowering, depending on the variety and the environment. Citrus trees are

usually planted in long parallel rows in a grove and usually they are managed as a block

instead of individual trees. A "block" is usually considered to be either a unit of land area

separated by ditches or roads from adjacent planted area or a group of trees of the same









variety and original rootstock. There is no specific size for a block. They could range

from 5 acres to 500 acres.

The orange is a favorite fruit among Americans. It has consistently ranked as the

third most consumed fresh fruit behind bananas and apples and as a juice, it ranks number

one (Pollack et al., 2003). According to the USDA National Agricultural Statistics

Service, in 2002 citrus was grown on over 322,658 ha in Florida. Of that total area, 81.4

percent was committed for orange production, followed by grapefruit production (13.2

percent) and lastly, specialty fruit (e.g. tangerines, tangelos, lemons, limes, etc.) made up

the final 5.4 percent (Florida Agricultural Statistics Service, 2002). Many researchers

have published data about citrus producing regions in Florida. Hodges et al. (2001) state

that:

Citrus has been produced commercially in Florida since the mid-1800s. It is
produced across the southern two-thirds of the Florida peninsula, where there is a
low probability of damaging winter freeze events, from Putnam County in the north
to Miami-Dade County in the south. The four major citrus producing regions are
the East Coast, Lower Interior, Upper Interior, and West Coast Districts. In 1957
citrus production was centered in the Upper Interior District, with 40 percent of
total citrus production, followed by the Lower Interior (30%), West Coast (17%),
and the East Coast (12%). By 1999, the geographical distribution had shifted
towards the Lower Interior District (61%), followed by the East Coast (24%), West
Coast (8%), and the Upper Interior (6%). The southward migration of citrus
production was a response to a series of freezes in the north central region of the
state in the 1980s. (Hodges et al., 2001, pg. 2)

Citrus fruits are not climacteric and do not ripen further once they are detached from

the tree. Hence care should be taken to harvest the fruits at the right stage of maturity.

Citrus quality is measured using the following standard: Brix, sugar/acid ratio and

percent juice. Once harvested, the fruit has to be graded, sorted, and washed before being

packed for the fresh market or fruit processing plant.









Forecasting citrus production for Florida is an annual task that helps in planning

operations, marketing, and policy making, which are especially important to a crop

harvested over several months and sold year round. The U.S. Department of Agriculture

first made forecasts of Florida citrus production in 1918, based on surveyed opinions of

crop observers and statisticians. The current system is based on objective data including

an early-season, limb-count survey to establish actual fruit set, supplemented with

monthly in-season measurements of fruit size and observations of fruit droppage. The

forecast is based on estimates and projections from actual counts and measurements,

avoiding observations based on opinion. The essential features used in the forecast are (1)

number of bearing trees, (2) number of fruit per tree, (3) fruit size, and (4) fruit loss from

droppage.
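The four forecast features combine multiplicatively into a production estimate. A toy illustration of that arithmetic, with entirely hypothetical numbers (none of these values come from the thesis or the USDA forecast):

```python
# Hypothetical inputs combining the four survey features into a
# per-block production estimate (illustration only).
bearing_trees = 10_000   # (1) number of bearing trees
fruit_per_tree = 800     # (2) fruit set per tree (limb-count survey)
kg_per_fruit = 0.20      # (3) mean fruit size, expressed as weight in kg
drop_fraction = 0.10     # (4) expected pre-harvest droppage

forecast_tonnes = (bearing_trees * fruit_per_tree * kg_per_fruit
                   * (1 - drop_fraction) / 1000.0)   # 1440.0 metric tons
```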

2.2 Citrus Harvesting

Citrus was harvested in 322,657 ha in Florida during 2002 (Florida Agricultural

Statistics Service, 2002). About 95% of harvested fruit was processed into juice and the

remainder was used in the fresh market. Citrus harvesting is a labor-intensive process

involving a large number of workers, depending upon the size of the grove. Although

extensive research has been conducted to automate citrus harvesting operations using

robots and other mechanical harvesting techniques (Whitney et al., 1998; Whitney et al.,

1999), approximately 99% of citrus crops in Florida are harvested manually.

Custom manual harvesting is carried out with crews of 30 to 50 persons working

with a ladder and picking bags. Fruits are gathered from trees and placed into containers

(bins or tubs) that usually have a capacity of approximately 0.7 m³. The containers are moved from the grove and finally loaded into trucks. Due to a decrease in the supply of









workers and an increase in labor cost, alternative methods are being assessed by growers to compete in the international citrus market.

Mechanization of citrus fruit harvesting is expected to improve the efficiency of harvesting schedules and to minimize dependence on labor in the future. Whitney and Sumner (1977) developed shakers for removing citrus fruits from the

tree. Coppock and Donhaiser (1981) developed a conical scan air shaker for removing

citrus fruits. The shaker along with application of an abscission chemical removed fruits

from trees at a rate of 170 trees/hr with an average removal efficiency of 97 percent.

Brown (2002) described various aspects of eight mechanical harvesting systems for

citrus. The main disadvantage of these methods is the damage caused to the fruit by bruising as it falls from the tree. Sometimes these bruises are too severe even for fruit intended for processing.

Schertz and Brown (1968) reviewed the basic principles of citrus harvesting

systems utilizing robots to pick fruits from trees. Harrell et al. (1990) developed a mobile

robotic grove lab to study the use of robotic technology for picking oranges under natural

grove conditions. The picking robot consisted of a simple three-degree-of-freedom manipulator actuated with servo-hydraulic drives. Fruit detection and positional

information was accomplished with a color CCD video camera and an ultrasonic sensor.

2.3 Precision Agriculture and Yield Mapping

Reducing input costs, minimizing work time, increasing yield, and improving crop quality to boost profit margins are the basic goals of any agricultural firm competing in domestic and global markets. With high levels of mechanization in crop production, crops are managed in units such as blocks rather than as individual plants.

Precision agriculture is a management philosophy that responds to spatial variability









found on agricultural landscapes. Steps in precision agriculture include determining yield variability in a field, determining its cause, deciding on possible solutions based on economic justification, implementing new techniques, and repeating the procedure in a cyclic manner. Precision agriculture techniques could be used to

improve economic and environmental sustainability in crop production.

Global positioning system (GPS), geographical information system (GIS), remote

sensing (RS), variable rate technology (VRT), yield mapping, and advances in sensor and

information technology have enabled the farmer to visualize the entire field in a way that could help manage agricultural operations efficiently and improve overall

productivity. With precision agriculture technologies, the farmer could effectively

manage the crop throughout its life cycle, starting from preparing soil, sowing seeds,

applying fertilizers/pesticides and finally estimating yield during harvesting based on

each individual plant, thus reducing the waste of resources due to in-field variability.

Among precision agriculture technologies, yield mapping is the first step to

implement site-specific crop management on a specific field. A yield mapping system

measures and records the amount of crop harvested at any point in the field along with

the position of the harvesting system. This collected data could be used to produce a yield

map using mapping software. Yield maps are useful resources for identifying variability within a field. Variability in an agricultural field arises from man-made or natural sources. Natural variability may be due to seasonal changes in weather patterns or rainfall over several years. Examples of man-made variability include improper distribution of irrigation and drainage facilities in a field and excessive or deficient application of farm inputs.









Numerous yield monitoring and yield mapping systems have been widely

researched and commercialized for various crops over the last one and a half decades.

Yield mapping during grain harvesting (Schueller and Bae, 1987; Searcy et al., 1989) has

been extensively studied and adopted. Examples of yield mapping for other crops include

cotton (Wilkerson et al., 1994; Roades et al., 2000), potatoes (Campbell et al., 1994),

tomatoes (Pelletier and Upadhyaya, 1999), and silage (Lee et al., 2002). Being able to evaluate the entire farm graphically, in a single encapsulated picture, with respect to yield and other associated field characteristics would greatly help farmers know their fields more intimately and thus make important decisions more efficiently.

2.4 Citrus Yield Mapping

The preliminary on-tree value of all citrus for the 2000-01 season in Florida was

$760 million. In spite of the widespread economic importance of the citrus industry,

currently the Goat system is the only commercial yield mapping system for citrus. Citrus

yield monitoring systems have been under development for several years. The first yield

monitor for citrus was developed by Whitney et al. (1998 and 1999) and Schueller et al.

(1999). In the Goat yield mapping system, yield is measured by mapping the location of a

tub as it is picked up by a truck. Citrus harvest is a busy operation, and therefore one of the prime goals of this system was to measure and map yield without interfering with any of the current harvesting procedures. One advantage of this system is that no change is needed in a harvesting practice involving many field workers who are often relatively untrained in managing sophisticated equipment.

A computer is used to coordinate all the operations of the Goat yield mapping

system. It has a crop harvest tracking system (GeoFocus, LLC, Gainesville, FL) that

records a container location (latitude, longitude, and time) whenever the operator pushes










a button for recording. It also has an LCD display for easy operation, other control components for interfacing all the parts, and a data transfer system to transfer the collected data to a computer for further analysis. Data were later retrieved from the unit and post-processed to produce yield maps.

The Goat system had the button and the LCD display mounted on the dash of the goat truck so that they were easily accessed by the driver. All the remaining electronics were placed inside a box kept in a more secluded place inside the truck. It was noted that this system sometimes produced incorrect maps because the truck driver occasionally failed to record the location of a tub due to the rush of harvest or other factors.

To avoid the previously encountered problem, an automatic triggering system

(Salehi et al., 2000) was developed to record the location of the tub. The system consisted

of a Geo-Focus yield monitor, a DGPS receiver, two pressure switches, a position

switch, and two timers. The pressure switches were used to detect a load on the main

boom lift cylinder and on the dump cylinder. The position switch determined whether the

tipping head was located over the truck bulk bin. When all the three conditions were met,

the system identified that the truck was picking a tub for collecting the fruit and the data

gathering circuit was activated for a given time using the timer and relay circuit for

collecting the DGPS data. However, the automatic triggering system didn't record some

tub locations, which could be attributed to the problems related with the delay timer,

pressure switch settings, and hardware connections.

The economic value of the citrus industry in Florida makes precision farming a viable technology with enormous potential for development. Recognizing citrus fruits on a tree is the









first major task of a yield mapping system based on machine vision. Citrus fruits

are distributed in a strip about one meter deep within the canopy, in a completely

unstructured environment. Detecting the citrus fruit on the tree involves many complex

operations. Automatic visual identification of fruit is further complicated by variation in

lighting conditions from bright sunlight on the outer parts of the canopy to deep shadow

within the canopy. Citrus fruits often grow in clusters and also some of the fruits are

occluded by branches and foliage.

Fruit distribution was studied (Juste et al., 1988) for 'Salustiana' and 'Washington' navel oranges using a system of cylindrical coordinates, and around 80% of fruits were located between the outer boundary and a distance of 1 m to 1.4 m from the outer canopy. In the case of mandarins, however, most of the fruits were within 0.75 m of the outer canopy. Distribution of fruit clusters in citrus trees was studied (Schertz and Brown, 1966) for six navel orange trees in Tulare County, California. An evaluation of fruit clusters showed that 68 percent of the fruits were borne as separate fruits, 19 percent in clusters of two, and 7 percent in clusters of three. The remaining 6 percent had four through eleven fruits per cluster.

2.5 Image Segmentation

Some of the earlier studies regarding fruit recognition were conducted for apple,

citrus, and tomatoes. Parrish and Goksel (1977) developed the earliest prototype for an apple harvester and studied the feasibility of harvesting methods based on pictorial pattern recognition and other artificial intelligence techniques. The prototype used a standard black-and-white camera to acquire apple images and a pattern recognition method to guide the fruit-harvesting robot. Slaughter and Harrell (1989) used a color









camera to exploit the highly contrasting colors of oranges, implementing a color image-processing algorithm to distinguish oranges from the background in a typical image of a citrus grove.

Whittaker et al. (1987) used fruit shape rather than color information to detect tomatoes. This method could be used even in the presence of interference caused by bright reflections and when fruits were shaded. They used the circular Hough transform (CHT) to locate whole or partial fruit in the image. The CHT is a mathematical transform that uses an angle matrix and a range of radii to locate circles or parts of circles in a digital image of discrete pixels. Before applying the circular Hough transform, the image was passed through a Sobel gradient operator, which calculated the gradient magnitude and direction at each pixel. Using this method, partially occluded fruits could also be detected.
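The gradient-directed voting described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the Sobel operator estimates gradient direction at each strong edge pixel, and each such pixel votes for candidate circle centres one radius away along that direction (in both senses, since the gradient may point into or out of the fruit). Image size, radii, and the magnitude threshold are arbitrary choices.

```python
import numpy as np

def sobel_gradient(img):
    """Sobel gradient magnitude and direction at each pixel."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros_like(img, float)
    gy = np.zeros_like(img, float)
    for i in range(3):
        for j in range(3):
            sub = pad[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * sub
            gy += ky[i, j] * sub
    return np.hypot(gx, gy), np.arctan2(gy, gx)

def hough_circle_centers(img, radii, mag_thresh=1.0):
    """Accumulate votes for circle centres along the gradient direction
    of each edge pixel, one accumulator per candidate radius."""
    mag, ang = sobel_gradient(img)
    acc = {r: np.zeros(img.shape, int) for r in radii}
    ys, xs = np.nonzero(mag > mag_thresh)
    for y, x in zip(ys, xs):
        for r in radii:
            for s in (1, -1):  # gradient may point toward or away from the centre
                cy = int(round(y + s * r * np.sin(ang[y, x])))
                cx = int(round(x + s * r * np.cos(ang[y, x])))
                if 0 <= cy < img.shape[0] and 0 <= cx < img.shape[1]:
                    acc[r][cy, cx] += 1
    return acc
```

A peak in the accumulator for radius r marks a likely circle centre, which is why partially occluded fruit (a visible arc rather than a full circle) can still be detected.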

Slaughter and Harrell (1987; 1989) were involved in the development of a robotic

fruit harvesting system and presented two approaches for detecting the fruit in an image

based on color information. In the first approach (Slaughter and Harrell, 1987), the hue

and saturation components of each pixel were used as features to segment an image by applying traditional classification in a two-dimensional feature space. The segmentation was carried out using maximum and minimum thresholds for each feature. Since color

segmentation required some form of illumination control, they used an artificial lighting

system. In the second approach (Slaughter and Harrell, 1989), a classification model was

developed for discriminating oranges from the natural background of an orange grove

using only color information. A Bayesian classifier was used in the RGB color space and

fruits were segmented out from the background by checking whether they belonged to the









fruit class or not. A reference table was created for various classes with the Bayesian

classification technique.
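The idea of a color-based Bayesian pixel classifier can be sketched as follows. This is a generic Gaussian (naive Bayes) formulation, not Slaughter and Harrell's actual model; in their system the class decisions were precomputed into a reference (lookup) table indexed by RGB value, whereas the sketch below evaluates the posterior directly. All sample values are illustrative.

```python
import numpy as np

def train_gaussian_bayes(samples):
    """samples: dict class_name -> (N, 3) list of RGB training pixels.
    Returns per-class (mean, variance, prior)."""
    total = sum(len(v) for v in samples.values())
    model = {}
    for name, pix in samples.items():
        pix = np.asarray(pix, float)
        model[name] = (pix.mean(0), pix.var(0) + 1e-6, len(pix) / total)
    return model

def classify(model, rgb):
    """Assign each RGB pixel in an (N, 3) array to the class with the
    highest posterior, assuming independent Gaussian channels."""
    rgb = np.asarray(rgb, float)
    best, best_ll = None, None
    for name, (mu, var, prior) in model.items():
        # log-likelihood of each pixel under this class, plus log prior
        ll = -0.5 * (((rgb - mu) ** 2) / var
                     + np.log(2 * np.pi * var)).sum(1) + np.log(prior)
        if best_ll is None:
            best, best_ll = np.full(len(rgb), name, object), ll
        else:
            m = ll > best_ll
            best[m] = name
            best_ll = np.maximum(best_ll, ll)
    return best
```

In a real-time system the posterior comparison would be done once per possible RGB value and stored in a lookup table, so classification reduces to a single table read per pixel.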

Casasent et al. (1996) used X-ray images to detect and segment multiple touching pistachio nuts placed on conveyor trays for product inspection. New techniques were

developed for identifying items irrespective of orientation by employing rotation-

invariant filters to locate nuts. Watershed transformation was utilized for segmenting

touching or overlapping nuts. The watershed transform is a region-based segmentation approach that partitions the image into disjoint regions, such that the regions are homogeneous with respect to some property, such as gray value or texture. Advanced

morphological image processing operations were used to produce gray scale images of

only the nutmeat and to determine the amount of shell area filled by nutmeat.

Shatadal et al. (1995) developed an algorithm to segment connected grain kernels

in an image to assist in a machine vision-based grain-grading experiment. A

mathematical morphology-based algorithm was developed for disconnecting the

connected kernel regions. The algorithm was used on wheat, durum wheat, barley, oats,

and rye. The only limitation of the algorithm was that it failed when the connected

kernels formed a relatively long isthmus between them.

The majority of these works used charge-coupled device (CCD) cameras to capture images and local or shape-based analysis to detect fruit. A CCD is a silicon chip

whose surface is divided into light-sensitive pixels. When a photon (light particle) hits a

pixel, it registers a tiny electric charge that can be counted. With large pixel arrays and

high sensitivity, CCDs can create high-resolution images under a variety of light

conditions. A CCD camera incorporates a CCD to take such pictures. Systems based on









local analysis such as intensity or color pixel classification allowed for rapid detection

and were able to detect fruits at a specific maturity stage (i.e., fruits with a color different

from the background). Systems based on shape analysis were independent of color, but

their algorithms were more time consuming.















CHAPTER 3
MATERIALS AND METHODS

The objective of this chapter is to provide background about the hardware and basic software components used in this research. The chapter begins with a procedural outline of the experiment. A general overview of an encoder and a DGPS receiver is presented next. The chapter then describes the image acquisition setup, the analysis of images in the hue, saturation, and intensity (HSI) color model, and the development of the fruit-recognition algorithm. The complete control system and software are discussed last to provide background for software development.

3.1 Overview of Experiment

The yield mapping system was tested in a commercial grove (Conserve II) located near Winter Garden, Florida. The grove consisted of 48 plots with 24 trees in each plot, planted with 'Hamlin' oranges on three rootstocks: 'Cleopatra' mandarin (Citrus reticulata), Swingle citrumelo (Citrus paradisi Macf. x Poncirus trifoliata [L.] Raf.), and Carrizo citrange (Citrus sinensis x Poncirus trifoliata). The Cleopatra mandarin rootstock is known to bear fruit more slowly than the other rootstocks. The trees were 17 years old. Each rootstock was planted in 16 plots. From every plot, two trees were selected for yield measurement. These two trees were designated "yield trees" and were of the same size; they were always side by side and planted in a single row. The distribution of plots and the locations of sampling trees are shown in Figure 3-1.






Figure 3-1. Mapping of trees in the citrus grove. Two trees designated for hand

harvesting are shown inside the red rectangular box for each plot.



A 4x4 truck was used for driving inside the grove. The complete setup, consisting of a desktop computer, a control box for an encoder and a camera, and a DGPS receiver, was kept on the rear of the truck (Figure 3-2). A metal frame attached to the rear of the truck carried a generator, the power source for the entire setup, and this metal frame moved in tandem with the truck. The encoder was attached to the wheel of the metal frame (Figure 3-4). A camera (model: FCB-EX780S, Sony, New York, NY) and a DGPS receiver (model: AgGPS 132, Trimble, Sunnyvale, CA) were attached to a metal pole supported on the tailgate of the truck. The metal pole was 5.2 m high and the camera was 4.9 m above the ground. The camera was placed at a 45-degree angle relative to the ground to cover the maximum section of the tree canopy.


Figure 3-2. Experimental setup.










3.2 System Hardware

3.2.1 Color Vision Hardware

The CCD camera was used to take pictures in the grove. The camera had a built-in image stabilizer utility, which minimized the appearance of shaky images caused by low-frequency vibrations.




Figure 3-3. Components of the imaging board.

The camera was powered by an imaging board. Camera features such as imaging mode,

shutter speed, and brightness were adjusted through serial communication. Video signals

from the camera were fed to the computer through a frame grabber (model: Meteor-II,

Matrox, Quebec, Canada) that supports capture from multiple standard analog video










sources and provides real-time image transfer to the system. Various components of the imaging board are shown in Figure 3-3. The imaging board features twelve software-selectable input channels to switch between twelve composite video signals. It accepts an external trigger and can operate in next-valid-frame/field mode. The imaging board

provides an auxiliary power supply unit that could be used to provide power for a camera.

The imaging board features 4 Mbytes of video transfer memory for temporary frame

storage to minimize loss of data during long bus-access latencies found in heavily loaded

computer systems.

3.2.2 DGPS Receiver and Encoder

A Differential Global Positioning System (DGPS) receiver (model: AgGPS 132,

Trimble Inc., Sunnyvale, CA) with a Coast Guard Beacon antenna was used to locate the

position of an image in the grove. The DGPS receiver was configured for a 5 Hz

refreshing rate. An incremental encoder (model: CI20, Stegmann, Dayton, OH) was

attached to the small frame to measure the amount of distance traveled in the grove

(Figure 3-4). The encoder has a resolution of 4048 pulses per revolution. A 12-bit

multifunction PCMCIA I/O board (model: DAQCard-AI-16E-4, National Instruments,

Austin, TX) was used to acquire and count pulses from the encoder.

The encoder was calibrated before the actual experiment in the citrus grove. Pulses from the encoder were read for known distances three times, and the average of those values was taken as the encoder output for that particular distance. Two channels were read from the encoder, and the phase between the channels helped to identify whether the wheel was moving in the forward or the reverse direction.
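The two ideas in this paragraph, decoding travel direction from the phase of the two channels and converting a pulse count to distance with a calibration constant, can be sketched as follows. The quadrature decoding logic is the standard one; the wheel circumference is an assumed value, since the thesis reports only the pulses-per-revolution figure.

```python
# Quadrature state sequence for forward motion: (A, B) cycles 00 -> 01 -> 11 -> 10.
_FORWARD = {(0, 0): (0, 1), (0, 1): (1, 1), (1, 1): (1, 0), (1, 0): (0, 0)}

def decode_quadrature(states):
    """Return the net pulse count from a sequence of (A, B) channel samples:
    +1 per forward transition, -1 per reverse transition."""
    count = 0
    for prev, cur in zip(states, states[1:]):
        if cur == prev:
            continue
        if _FORWARD[prev] == cur:
            count += 1
        elif _FORWARD[cur] == prev:
            count -= 1
        # double-step (illegal) transitions are ignored in this sketch
    return count

def pulses_to_distance(pulses, pulses_per_rev=4048, wheel_circumference_m=1.0):
    """Convert a net pulse count to distance traveled in meters."""
    return pulses / pulses_per_rev * wheel_circumference_m
```

In practice the calibration step described above amounts to measuring the effective meters-per-pulse constant directly, which folds the wheel circumference and any slip into a single number.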































3.3 Image Acquisition

For developing the citrus fruit recognition algorithm, images were taken in stationary mode using the analog camera (model: FCB-EX780S, Sony, New York, NY) in the grove over two days during the last week of December 2003 and the first week of January 2004. The images were taken in natural outdoor lighting conditions. Brightness

and shutter speed were adjusted for each plot before acquiring images. During the

experiment, shutter speed was varied between 1/1000 and 1/15 sec. Higher shutter speeds were required during bright daylight conditions and lower shutter speeds were useful during late afternoon to obtain good images with approximately uniform brightness.










Images were aligned over the trees by aligning the first image with a yield tree

using a flagpole. The truck was moved back and forth so that the camera field of view

was aligned with the flagpole. Subsequent non-overlapping images were obtained using

the encoder. The encoder was calibrated and programmed to prompt the user on reaching the next location for taking a non-overlapping image. At this location, the truck was stopped for a brief period and an image was taken. Images of most of the tree canopy were acquired by driving around the trees on both sides. Images were processed on a Windows-based system with a 750 MHz Pentium processor.

3.4 Image Analysis using HSI Color Model

Color is one of the most important properties that humans use to discriminate between objects and to encode functionality. For example, the sky is blue, citrus fruit is orange, and a leaf is green. An object's color comes from the interaction of light waves with electrons in the object's matter (Nassau, 1980). The colors that human beings identify in an object are based on the nature of the light reflected from the object's surface. For example, a red apple reflects light at wavelengths centered around 700 nm while absorbing most of the energy at other wavelengths. An object that reflects light equally across the entire visible spectrum appears white.

The purpose of a color model is to specify colors in some standard, generally accepted way. For instance, the red, green, and blue (RGB) color

model is used in hardware applications like PC monitors, cameras and scanners; the cyan,

magenta and yellow (CMY) color model is used in color printers; and the luminance, in-

phase and quadrature (YIQ) model is used in television broadcasts. The most commonly

used color models for image processing are RGB and HSI models. In essence, a color









model is a specification of a 3-D coordinate system and a subspace within the system

where each color is represented by a single point (Gonzalez and Woods, 1992).

The implemented system uses HSI as its color space. The main reason behind the selection of the HSI model is that it correlates well with human color perception. Hue is the color attribute that describes a pure color (pure red or green, etc.), whereas saturation gives a measure of the degree to which a pure color is diluted by white light. However, since all images from the camera were encoded in composite video format, they were first converted to RGB tri-stimulus format in the imaging board; each image was later converted to its HSI equivalent in software for further analysis. A composite video signal contains all of the brightness, color, and timing information for the picture. In order for the color information to be combined with the brightness and timing information, it must be encoded. There are three main color-encoding systems in use throughout the world: National Television System Committee (NTSC), Phase Alternation by Line (PAL), and Système Électronique Couleur Avec Mémoire (SECAM). The camera utilized in this experiment used the NTSC format.
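The RGB-to-HSI conversion performed in software can be sketched with the standard geometric formulas (e.g., Gonzalez and Woods, 1992); the exact implementation used in this work is not shown here.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert 8-bit RGB to HSI: H in degrees [0, 360), S and I in [0, 1]."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0                       # intensity: mean of the channels
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i   # saturation: dilution by white
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # hue: angle around the color triangle; clamp guards rounding error
    h = 0.0 if den == 0 else math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:                                   # hue lives in [0, 180] unless B > G
        h = 360.0 - h
    return h, s, i
```

Pure red maps to hue 0, pure blue to hue 240, and any gray pixel has saturation 0, which is why the hue-saturation plane separates strongly colored fruit pixels from gray or dark background.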

3.5 Development of the Fruit Counting Algorithm

The steps in the fruit recognition algorithm are to identify fruits from an image and

process the results to remove noise and to improve precision in counting the number of

fruit. Segmentation, or binarization, is an image-processing step used to separate objects of interest from the background. In this research, the object of interest was a citrus fruit and the background included citrus leaves and branches. The simplest way to segment an image is by a gray-level threshold, or global threshold. This operation requires that the object of interest and the background have different levels of brightness or completely different colors. Unfortunately, the fruit portion, the leaf portion, and the background are not easily









differentiated using this method because the gray-level or color histograms of these features are not unimodal. A bimodal gray-level histogram would make the selection of an optimum threshold easier, and this selection could also be automated. The threshold for identifying citrus fruits was selected using the HSI color plane.

Color characteristics of the images were analyzed in the RGB (red, green, and blue) and HSI (hue, saturation, and intensity) color spaces. To develop a system to identify and count citrus fruit in an image, various objects in a typical citrus grove scene should be collected and analyzed. Once developed, the system should be tested on similar images to verify and compare its performance. For these reasons, the images were divided into calibration and validation data sets. The pixels were classified into three classes: C (citrus fruit), L (leaf), and K (background). The RGB and HSI values of each pixel were obtained for the three classes using a program written in VC++ (Microsoft Corporation, Redmond, WA) with the Matrox Imaging Library (Matrox Imaging, Quebec, Canada). The pixel values were stored in separate text files for the different classes and processed using Microsoft Excel (Microsoft Corporation, Redmond, WA). Pixels in the three classes C, L, and K were chosen manually by inspection of the images in the calibration image set.

Pixels were plotted for various combinations of color components in the RGB and

HSI color space. Binarization was carried out in the color plane containing a clear

distinction between the fruit and background, resulting in white pixels representing the

fruit and black pixels for all other classes.
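The binarization step can be sketched as a simple box test in the hue-saturation plane; the bounds below are illustrative placeholders, not the thresholds actually derived from the calibration images.

```python
import numpy as np

# Hypothetical hue/saturation bounds for the citrus class; orange hues sit at
# low hue angles and fruit pixels are strongly saturated. The real thresholds
# were selected from class scatter plots over the calibration image set.
HUE_MAX_DEG = 60.0
SAT_MIN = 0.3

def binarize_hs(hue_deg, sat):
    """Binarize an image given per-pixel hue (degrees) and saturation arrays:
    fruit pixels -> True (white), all other classes -> False (black)."""
    return (hue_deg <= HUE_MAX_DEG) & (sat >= SAT_MIN)
```

The output is exactly the white-fruit/black-background binary image that the subsequent morphological and blob-analysis steps operate on.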

The field of mathematical morphology contributes a wide range of operators to image processing that are particularly useful for the analysis of binary images. Common










usages include edge detection, noise removal, image enhancement, and image segmentation. To process the binary images in this research, the following operations were performed: erosion, dilation, and closing. Erosion shrinks the boundary of a binary region, while dilation expands it, each depending on a structuring kernel. Closing is a morphological operation that consists of a dilation followed immediately by an erosion, and it fills in small background holes in images.
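The three operations can be sketched directly in NumPy; this is a generic implementation with square kernels and background-valued padding, not the Matrox Imaging Library routines used in the research.

```python
import numpy as np

def dilate(img, k):
    """Binary dilation with a k x k square structuring kernel:
    a pixel is set if any pixel in its k x k neighborhood is set."""
    p = k // 2
    padded = np.pad(img.astype(bool), p, constant_values=False)
    out = np.zeros_like(img, bool)
    for dy in range(k):
        for dx in range(k):
            out |= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def erode(img, k):
    """Binary erosion with a k x k square kernel (border treated as background):
    a pixel survives only if its whole k x k neighborhood is set."""
    p = k // 2
    padded = np.pad(img.astype(bool), p, constant_values=False)
    out = np.ones_like(img, bool)
    for dy in range(k):
        for dx in range(k):
            out &= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def close(img, k=5):
    """Closing = dilation followed by erosion; fills small background holes."""
    return erode(dilate(img, k), k)
```

With the 5 x 5 kernel used in this work, closing fills the small background holes that appear at bright fruit centres without noticeably changing blob outlines.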

Due to the dissimilarity in illumination between the images and the presence of some dead leaves, certain pixels were falsely classified as fruit. Using the set of calibration images, a threshold based on the area of the selected features was applied immediately after binarization to remove false detections. In some of the detected fruit, a few pixels, mostly at the center of the fruit, were classified as background due to very high illumination. The kernel sizes for filling these gaps were determined by applying kernels of various sizes and orders to the calibration images. From this trial, the order of the erosion and dilation operations and the optimum kernel size were selected for the algorithm. Using the set of calibration images, a closing operation with a 5 x 5 structuring kernel was applied to fill these gaps. These image-processing steps are shown in Figure 3-5.

Citrus fruits were identified using blob analysis; in this method, connected fruit pixels were treated as a single fruit. Fruit features such as area were extracted for all fruits and stored in a text file for post-processing. Fruit area is defined as the number of pixels in a connected region, and the compactness value is derived from the perimeter and area of each blob. Based on the average size of a fruit, a threshold was used to identify and account for clusters of fruit while determining the total number of fruit in an image. With these









features, the size of the fruit could be calculated using a sensor, such as an ultrasonic sensor, to measure the distance between the camera and the tree.
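Blob labeling, area extraction, and the cluster-size heuristic can be sketched as follows; the minimum-area and average-fruit-area parameters are illustrative placeholders, not the values calibrated in this work.

```python
import numpy as np

def label_blobs(binary):
    """4-connected component labeling via iterative flood fill."""
    labels = np.zeros(binary.shape, int)
    n = 0
    for y, x in zip(*np.nonzero(binary)):
        if labels[y, x]:
            continue
        n += 1
        stack = [(y, x)]
        labels[y, x] = n
        while stack:
            cy, cx = stack.pop()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = n
                    stack.append((ny, nx))
    return labels, n

def count_fruit(binary, min_area=4, avg_fruit_area=9):
    """Drop blobs below min_area as noise; count each remaining blob as
    round(area / avg_fruit_area) fruit (at least one), so a large blob
    is treated as a cluster of several touching fruit."""
    labels, n = label_blobs(binary)
    total = 0
    for lab in range(1, n + 1):
        area = int((labels == lab).sum())
        if area < min_area:
            continue
        total += max(1, round(area / avg_fruit_area))
    return total
```

The area threshold implements the false-detection filter described above, and dividing a blob's area by the average single-fruit area implements the cluster adjustment.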


Acquire image → Convert RGB to HSI → Binarize using hue-saturation plane → Threshold using area → Dilation (5 x 5) → Erosion (5 x 5) → Extract features of fruits → Estimation of fruit clusters → Count number of citrus fruits

Figure 3-5. Image processing steps of the fruit counting algorithm.


3.6 Image Processing Time

Processing time is a major concern in a real-time machine vision application. A 750 MHz Pentium processor was used to process the images, and the processing time of each image-processing step was measured using the computer clock. Each step was timed over 10 executions and the results were averaged. Although images were converted from

composite video signal to RGB model in the imaging board, substantial time was

required for converting the images from RGB to HSI color model. Each image had 640 x

480 pixels and every pixel had to undergo many comparisons based on its relative

position in the hue-saturation color plane to be assigned to one of the three classes. To reduce the processing time, every pixel was compared with the background category first, since background pixels were the most numerous in the hue-saturation color plane and the percentage of fruit pixels in an image was small compared to the background. This optimization considerably reduced the time needed to process an image.
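The ordering optimization can be sketched as follows: test each pixel against the most common class first so that the majority of pixels short-circuit after a single membership test. The class regions in the hue-saturation plane are illustrative placeholders, not the calibrated boundaries.

```python
def in_background(h, s):
    """Hypothetical background region: dark/unsaturated or far-from-orange hues."""
    return s < 0.25 or h > 150.0

def in_fruit(h, s):
    """Hypothetical citrus region: low (orange) hue, strong saturation."""
    return h <= 60.0 and s >= 0.3

def classify_pixel(h, s):
    # Most likely class is tested first; most pixels return after one check.
    if in_background(h, s):
        return "background"
    if in_fruit(h, s):
        return "fruit"
    return "leaf"
```

Since background pixels dominate a grove image, putting that test first cuts the average number of comparisons per pixel, which is exactly the effect reported above.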









3.7 Experimental Procedure

A schematic diagram of the overall experimental setup is shown in Figure 3-6. The DGPS receiver and the camera control interface were connected to the computer through the serial

ports available in the computer. A video signal from the camera was fed into the imaging

board and data from the encoder was fed into the counter input channel of the PCMCIA

I/O board.



Figure 3-6. Schematic diagram of the overall citrus yield mapping system.


Once the camera was mounted on top of the pole, the camera field of view was measured and the width and height of the imaged scene were calculated. Based on the width of the image scene, the encoder was programmed to prompt the user when the required distance had been traveled from the current imaging location to take the next non-overlapping image. After aligning the first image with the tree, the truck was driven very slowly (2.2 m/s) and the pulses from the encoder were read continuously at a 20 ms time interval to measure the distance traveled from the previous imaging location.
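The spacing computation can be sketched as follows: the scene width at the canopy follows from the camera's horizontal field of view and its distance to the canopy, and the encoder prompt threshold is that width expressed in pulses. The field-of-view angle and wheel circumference below are assumed values used only for illustration; the thesis reports the pulses-per-revolution figure but not these constants.

```python
import math

def scene_width_m(horizontal_fov_deg, distance_to_canopy_m):
    """Approximate width of the imaged scene at the canopy plane."""
    return 2.0 * distance_to_canopy_m * math.tan(math.radians(horizontal_fov_deg / 2.0))

def pulses_per_image(scene_w_m, wheel_circumference_m=1.0, pulses_per_rev=4048):
    """Encoder pulses corresponding to one scene width of travel,
    i.e. the spacing between consecutive non-overlapping images."""
    return int(round(scene_w_m / wheel_circumference_m * pulses_per_rev))
```

For example, a hypothetical 60-degree field of view at 2 m from the canopy gives a scene about 2.3 m wide, and the encoder would prompt the user after the corresponding pulse count.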

Figure 3-7. Algorithm for field-testing of the machine vision system.









After the required distance had been traveled, a subsequent non-overlapping image was grabbed along with position information from the DGPS receiver. Immediately the encoder counter value was reset to zero so that the relative distance from the new imaging position could be used as the reference for the next image. The algorithm continued until the user terminated it. The imaging sequence in the field trial is shown in Figure 3-7. Images were taken over two days in the field. The height of the camera was adjusted only once at the beginning of each day and remained at the same position throughout the day. The camera field of view was calculated on both imaging days and the encoder was programmed to reflect the current field of view settings. The experimental setup and encoder are shown in Figures 3-2 and 3-4.

3.7 Prediction of Number of Fruits/Plot

In the grove where the citrus yield mapping system was tested, two trees in each plot were designated for hand-harvesting. Fruits/plot were predicted based on three models using the following three variables:

1) Number of fruits/plot estimated using the fruit counting algorithm (N_fruits)

2) Number of citrus pixels/plot estimated using the fruit counting algorithm (N_pixels)

3) Number of fruits/plot estimated from the citrus pixels/plot data (N_fruits-pixels)

Images belonging to the same plot were grouped together, and the per-image estimates were summed to give the fruits/plot estimates for the three variables: N_fruits, N_pixels, and N_fruits-pixels. N_fruits used the number of fruits identified in an image by the machine vision algorithm, while N_pixels used the number of citrus pixels in an image. N_fruits-pixels used a relation between the actual size of the fruit and the size of the fruit in terms of pixels in an image. The relation between a pixel size and its corresponding actual size in the imaging









scene was calculated. An image of 640 x 480 pixels corresponded to an imaging scene 1.67 m (5.5 ft) long and 1.26 m (4.12 ft) high. As the average size of a fruit from each plot was known, the area of a fruit in a plot was calculated in terms of pixels. Then, the number of fruits in a plot was determined from the total number of citrus pixels/plot using the following estimation. Since

640 x 480 pixel2 = 1.67 x 1.26 m2 = 2.1 m2 = 2107970 mm2

thus, 1 mm2 = 0.15 pixel2.

Let D be the average diameter of a fruit in a plot, A_actual be the area of a fruit in the two-dimensional plane, A_image be the area of a fruit in an image, and T be the total number of citrus pixels in a plot. Then

A_actual = (π x D2) / 4, and

A_image = 0.15 x A_actual

Thus, N_fruits-pixels = T / A_image = T / (0.15 x A_actual) = T / (0.15 x π x D2 / 4)

Since T (from the image processing algorithm) and D (from the actual harvesting data) are known, N_fruits-pixels can be obtained.
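The pixel-to-fruit conversion above can be sketched directly, using the document's 0.15 pixel2/mm2 scale factor:

```python
import math

# Sketch of the pixel-count model above: with T citrus pixels in a plot
# and an average fruit diameter D (mm), one fruit occupies
# 0.15 * pi * D**2 / 4 pixels, so N_fruits-pixels = T / that area.

PIXELS_PER_MM2 = 0.15  # scale factor derived from the imaging geometry

def fruits_from_pixels(total_citrus_pixels, avg_diameter_mm):
    """Estimate the number of fruits in a plot from its citrus pixel total."""
    fruit_area_pixels = PIXELS_PER_MM2 * math.pi * avg_diameter_mm ** 2 / 4
    return total_citrus_pixels / fruit_area_pixels
```

For example, with an average fruit diameter of 70 mm, one fruit covers about 577 pixels, so a plot with roughly 577,000 citrus pixels maps to about 1,000 fruits.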

3.8 Yield Prediction Model

Citrus yield is calculated as the number of citrus fruits per unit area. The distances between citrus trees in the grove were 3.05 m (10 ft) in-row and 6.1 m (20 ft) between rows. For this particular experimental setup, yield is calculated as

YE = N_fruits-pixels / (3.05 x 6.1 (m2) x 2 (trees))

YA = NA / (3.05 x 6.1 (m2) x 2 (trees))

where YE = Estimated yield by the machine vision system,

YA = Actual yield by hand harvesting.

Yield (Y_image) was calculated based on the number of fruits identified from the image. Each image covered 1.7 m of the canopy lengthwise, and the distance from the camera to the canopy was 3.0 m. Y_image was calculated as

Y_image = N_fruits-pixels / (3.0 x 1.7 (m2))

where N_fruits-pixels = number of fruits/plot estimated using the machine vision system.
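The plot- and image-level yield computations above can be sketched as follows. The function names are hypothetical; the areas follow the grove's 3.05 m x 6.1 m spacing with two hand-harvested trees per plot.

```python
# Sketch of the yield formulas above (fruits per square metre).
# Function names are illustrative, not from the thesis.

PLOT_AREA_M2 = 3.05 * 6.1 * 2   # two trees per plot

def estimated_yield(n_fruits_pixels):
    """YE: yield estimated by the machine vision system."""
    return n_fruits_pixels / PLOT_AREA_M2

def actual_yield(n_harvested):
    """YA: yield from hand-harvested fruit counts."""
    return n_harvested / PLOT_AREA_M2

def image_yield(n_fruits_image, scene_area_m2=3.0 * 1.7):
    """Y_image: yield computed from a single image's fruit count."""
    return n_fruits_image / scene_area_m2
```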

3.9 Performance of the Fruit Counting Algorithm

An algorithm was developed using 25 calibration images and tested on the remaining 329 validation images. In order to evaluate the performance of the algorithm, the fruits counted by the fruit counting algorithm should have been compared with the actual number of fruits in the region covered by each image. Since it was very difficult to define the boundary of each image and count the number of fruits in the grove, the images were shown to three observers and the average of their three readings was taken as the reference for the fruit counting algorithm. This arrangement was made for manual counting because there were variations in the total number of fruits perceived by human observers. Error_image (%) was defined as the percentage error between the number of fruits counted by the machine vision algorithm and the average number of fruits counted manually:

Error_image (%) = (MV_N - MC_N) / MC_N x 100

where

MV_N = number of fruits counted by the machine vision algorithm

MC_N = average number of fruits counted manually

3.10 Performance of the Yield Prediction Model

Images from each plot were grouped together and the number of fruits from each plot was compared with the actual number of fruits harvested from the respective plot. Half of the total plots were used as calibration data to develop a prediction model to estimate citrus yield. The model was then tested on a validation data set consisting of the remaining plots. Error_plot (%) was defined as the percentage error between the yield estimated by the machine vision algorithm and the actual yield by hand harvesting:

Error_plot (%) = (YE - YA) / YA x 100

where

YE = estimated yield by the machine vision system

YA = actual yield by hand harvesting
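Both measures are ordinary relative errors expressed as percentages; a minimal sketch:

```python
# Sketch of the two percentage-error measures defined above.

def error_image(mv_count, manual_avg):
    """Percent error of the machine vision count vs. average manual count."""
    return (mv_count - manual_avg) / manual_avg * 100.0

def error_plot(estimated_yield, actual_yield):
    """Percent error of estimated plot yield vs. hand-harvested yield."""
    return (estimated_yield - actual_yield) / actual_yield * 100.0
```

A negative value indicates underestimation by the machine vision system, which is the dominant case in the results that follow.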

3.11 Yield Variability based on Rootstock Variety

Fruits/plot data were used to check whether there was any correlation between the harvested yield and the rootstock variety ('Cleopatra' mandarin, 'Swingle' citrumelo and 'Carrizo' citrange). A means test was conducted on the yield data for the 48 plots, grouped by the three rootstock varieties, using the SAS ANOVA procedure.

The following is the SAS Program used for the means test of the yield data of different

rootstocks.

PROC ANOVA;

CLASS CRTS;

MODEL NF=CRTS;









means CRTS / tukey lines;

RUN;

proc cluster DATA=ROOTSTOCK method=average std pseudo noeigen outtree=tree;

ID CRTS;

var NF;

run;

Variables used in the ANOVA PROC:

CRTS: Citrus Rootstock variety

NF: Number of fruits in a plot
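The SAS means test above rests on a one-way ANOVA of NF across the three rootstock groups. For illustration, the F statistic that PROC ANOVA computes can be sketched in pure Python (the data in the test are illustrative, not the thesis yield data):

```python
# Sketch of the one-way ANOVA F statistic underlying the SAS means test.
# groups: a list of lists, one list of observations per rootstock variety.

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group size times squared mean offset
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: residuals about each group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F value (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that mean yield differs among the rootstock varieties; the Tukey option in the SAS code then identifies which pairs differ.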















CHAPTER 4
RESULTS AND DISCUSSION

This chapter begins with a summary of the binarization and preprocessing of images. It then illustrates the various steps involved in the fruit counting algorithm and compares the performance of the algorithm over 354 images with respect to the number of fruits counted manually. Processing times for the various image-processing steps are reported, and the chapter concludes with a description of the yield prediction model and a comparison of its results with the actual harvested yield data.

4.1 Binarization

Images were divided into calibration and validation data sets (Table 4-1). Using a program written with the Matrox library in VC++, RGB and HSI components were collected for features by drawing a rectangle with the mouse. The program collected the RGB and HSI components of the three classes in an image, citrus (C), leaf (L) and background (K), from the calibration images, and each class was stored in a separate text file. Histograms of the three classes in all the color components are shown in Figures 4-1 through 4-6. There was no distinct separation between the citrus class and the other classes in any individual color component except the hue component.

Table 4-1. Number of images used in calibration and validation data sets.
Calibration Validation Total
25 329 354












Figure 4-1. Histogram of 25 calibration images in hue plane (* indicates 3898 pixels of a pixel value of 0 in citrus class, + indicates 5606 pixels of a pixel value of 0 in background class).


Figure 4-2. Histogram of 25 calibration images in saturation plane.












































Figure 4-3. Histogram of 25 calibration images in luminance plane (* indicates 3356 pixels of a pixel value of 252 in citrus class).


Figure 4-4. Histogram of 25 calibration images in red plane.












Figure 4-5. Histogram of 25 calibration images in green plane (* indicates 3330 pixels of a pixel value of 252 in citrus class).


Figure 4-6. Histogram of 25 calibration images in blue plane (* indicates 3319 pixels of a pixel value of 252 in citrus class).

As a next step, gray level histograms were plotted for pairs of color components, and it was found that there existed a clear line of separation between the fruits and the









background in the hue-saturation color space (Figure 4-7). Gray level histograms for the three classes in the hue-luminance and red-green color planes are shown in Figures 4-8 and 4-9. Although there seemed to be a distinction between the fruits and the background in the red-green color plane, there were numerous false detections when different thresholds were tested on the validation images. The main reason for so many false detections was the high contrast and brightness level in the images, which tended to make leaves and background objects white, so that they were classified into the citrus class.

The threshold in the hue-saturation color plane was chosen conservatively after many trials had been conducted over the calibration images. The luminance component was added to the threshold to make binarization less dependent on the brightness level of the image.

The pixel distribution for the various classes in the calibration images is shown in Table 4-2. Although only 58% of the citrus class was captured inside the threshold, the binarization scheme was found to work very well with the validation images, mainly because the threshold contained 0% of the background and only 0.03% of the leaves. Since the majority of an image consisted of leaves and background, this binarization scheme performed well; the conservative threshold produced some underestimation but very little overestimation.

Table 4-2. Pixel distribution for citrus, leaf and background classes for 25 images in HSI color plane.
                    Citrus class          Leaf class            Background class
Pixel category      Number   Percentage   Number   Percentage   Number   Percentage
Inside threshold    15875    58.1%        68       0.03%        0        0%
Outside threshold   11438    41.9%        23347    99.97%       8165     100%










Mathematical representation of the two thresholds is as follows. For any pixel:

if (Hue is between 4 - 43, Saturation is between 50 - 250, and Luminance is between 60 - 230)
{
    // Calculate the position of the threshold line at that specific hue value
    S1 = 4.83 * Hue - 53.1
    // Check whether the pixel is within the threshold
    if (Saturation > S1)
        Pixel = Citrus       // pixel was inside the threshold and marked as 255
    else
        Pixel = Background   // pixel was outside the threshold and marked as 0
}
else
    Pixel = Background       // pixel was outside the threshold and marked as 0
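The threshold logic translates directly into a per-pixel function; a minimal Python sketch:

```python
# Direct sketch of the binarization threshold: a pixel is citrus only if
# (hue, saturation, luminance) fall inside the coarse box AND the
# saturation lies above the line S1 = 4.83 * hue - 53.1.

def binarize_pixel(hue, sat, lum):
    """Return 255 for a citrus pixel, 0 for background."""
    if 4 <= hue <= 43 and 50 <= sat <= 250 and 60 <= lum <= 230:
        s1 = 4.83 * hue - 53.1          # threshold line at this hue
        if sat > s1:
            return 255                   # inside threshold: citrus
    return 0                             # outside threshold: background
```

Applying this function to every pixel of a 640 x 480 HSI image produces the binary images discussed below.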












Figure 4-7. Pixel distribution in 25 calibration images in hue-saturation plane, with the threshold line separating the citrus class from the leaf and background classes.


Figure 4-8. Pixel distribution in 25 calibration images in hue-luminance plane.












Figure 4-9. Pixel distribution in 25 calibration images in red-green plane.


The thresholds shown in Figures 4-7 and 4-8 were used for the binarization step. The algorithm classified a pixel as citrus fruit if it fell inside the thresholds; otherwise it was classified as background. An example of the image processing steps for a typical citrus grove image is shown in Figure 4-10. This image is used as an example to explain the various steps involved in the implementation of the fruit counting algorithm. Figure 4-10 (a) shows a sample color image from the validation data set. Fruits were extracted by applying binarization to the sample color image in the HSI color plane. The binarized image is shown in Figure 4-10 (b).

4.2 Preprocessing

The binarized images contained noise, mainly due to the slight overlap of the leaf class with the citrus class in the hue-saturation color plane. By applying a threshold of 100 pixels to the area of the extracted features, this noise was removed from the images (Figure 4-10 (c)). The threshold was selected based on the area of the noise pattern in the calibration images.


Figure 4-10. Image processing steps of a typical citrus grove scene: (a) color image, (b) binarized image, (c) after removing noise, (d) after filling gaps.


In the processed image above, there were cases in which a single fruit occluded by small leaves was counted as more than one fruit. To overcome this problem, a set of dilation and erosion operations with a kernel size of 5x5 pixels was applied to the images, resulting in the final processed image shown in Figure 4-10 (d). These images could then be used to count the number of citrus fruits.










Citrus fruits were identified using blob analysis. In this method, connected fruit pixels are treated as a single blob, and the total number of blobs gives the number of fruits in the image. Features such as perimeter, area, horizontal width, and vertical height were calculated for each blob and stored separately in a text file. It should be noted that there were very few overestimations and many underestimations by the algorithm.
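The blob analysis step can be sketched with a standard connected-component pass. This is a simplified stand-in for the Matrox blob analysis used in the thesis, assuming 4-connectivity and a binary image stored as nested lists of 0/1 values.

```python
# Sketch of blob analysis: 4-connected components of fruit pixels in a
# binary image. Each component is one blob; its pixel count is its area.

def count_blobs(img, min_area=1):
    """Count connected components of 1-pixels with area >= min_area."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not seen[r][c]:
                # Flood-fill one component, measuring its area
                stack, area = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if area >= min_area:
                    blobs += 1
    return blobs
```

The `min_area` parameter plays the role of the 100-pixel noise threshold described earlier: components smaller than it are ignored rather than counted as fruits.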

The main reasons for overestimation were:

1) When a single fruit was hidden by many leaves and the separation between the small blobs was more than 25 (5x5) pixels, the blobs were counted as different fruits.

2) Small fruits that were not clearly visible in manual counting were nevertheless counted as fruits by the machine vision algorithm.

3) In some images, there were many fruits hidden in the dark background.

The reasons for underestimation were:

1) When the visible portion of a fruit was very small, it would have been removed by the fruit counting algorithm, since an area threshold was applied to remove noise.

2) In some cases, fruit clusters were counted as a single fruit by the machine vision algorithm due to connectivity.

It was found that the areas of fruit clusters were relatively large compared to single fruits. Hence modifications were made in the fruit counting algorithm to correct for the underestimation problem. Since the calibration images showed that only a few fruits were completely visible and the remaining fruits were mostly hidden by leaves, the average size of a fruit in an image was calculated from the five largest fruits in that image. If the average area was less than 200 pixels, or if the total number of fruits in an image was less than 10, the fruit counting procedure was ended, because it would be difficult to identify fruit clusters when leaves hid nearly all the fruits or when the imaging scene was at a large distance from the camera. Otherwise, the following fruit cluster estimation module was applied.

A threshold was calculated based on the average fruit area, and if a blob's area was more than the threshold, it was identified as a fruit cluster and counted as two fruits instead of one. Fruit clusters were counted only as two rather than many fruits because of the difficulty of defining an area threshold for multiple fruits. This introduces a potential error in the fruit counting algorithm by underestimating the number of fruits in an image. The threshold was selected by trial and error using the calibration images.

Table 4-3. Threshold for identification of fruit clusters.
Average size of fruits (pixel2)   Threshold (pixel2)
0 - 200                           100,000
201 - 600                         Calculated average size
601 - 1,300                       800
> 1,300                           1,200


The conditions for setting thresholds are shown in Table 4-3. For example, if the average area of objects in an image is between 601 and 1,300 pixel2, and the area of an object in that image is more than the threshold (800 pixel2), then that object is considered a fruit cluster and counted as two fruits instead of one. For images with an average area of 200 pixel2 or less, the identification of fruit clusters was effectively disabled by setting the threshold to 100,000 pixel2. This underestimation problem could be solved once a separation algorithm such as the











Watershed method is developed. However, deciding on a suitable threshold for a watershed algorithm would be difficult, since the fruits are hidden by leaves in an irregular pattern, and a watershed algorithm may over-fragment normal fruits, resulting in overestimation.
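The cluster-correction rule of Table 4-3 can be sketched as below. The function names are illustrative; the early-exit conditions (average area below 200 pixel2, or fewer than 10 fruits) are represented only by the disabling threshold.

```python
# Sketch of the Table 4-3 cluster correction: pick the cluster threshold
# from the average blob area, then count any blob larger than the
# threshold as two fruits instead of one.

def cluster_threshold(avg_area):
    """Cluster threshold (pixel^2) for a given average blob area."""
    if avg_area <= 200:
        return 100_000        # effectively disables cluster detection
    if avg_area <= 600:
        return avg_area       # threshold equals the calculated average
    if avg_area <= 1300:
        return 800
    return 1200

def count_fruits(blob_areas):
    """Fruit count with the cluster correction applied."""
    if not blob_areas:
        return 0
    avg_area = sum(blob_areas) / len(blob_areas)
    threshold = cluster_threshold(avg_area)
    return sum(2 if area > threshold else 1 for area in blob_areas)
```

For instance, with blob areas [700, 700, 900, 900] the average is 800 pixel2, the threshold is 800, and the two 900-pixel blobs each count as two fruits, giving six fruits in total.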


The fruit counting algorithm was applied to the validation set of 329 images, and the results are tabulated in Table 4-4. The percentage error ranged from 0% to 100%; the highest errors occurred in images where there were only one or two fruits and the algorithm identified none. The mean absolute Error_image was 29.33% over all the validation images. The main reason for this high error was that many fruits that were small but clear to the human eye were treated as noise by the algorithm and excluded from the count. A regression analysis was conducted between the number of fruits counted manually and the number of fruits counted by the fruit counting algorithm for the 329 validation images (Figure 4-11). The R2 value for the regression analysis was 0.79.

Table 4-4. Performance comparison for 329 validation images. Columns, repeated twice per row: image name; observer counts 1, 2 and 3; average; standard deviation; fruit counting algorithm count; Error (%).
100a-0 16 22 23 20 4 16 -21.3 117b-0 21 25 261 24 3 16 -33.3
100a-1 25 39 39 34 8 25 -27.2 117b-1 16 16 161 16 0 13 -18.8
100a-2 8 9 8 8 1 6 -28.0 117b-2 22 26 25 24 2 21 -13.7
100a-3r 5 6 6 6 1 3 -47.1 117b-3r 15 18 191 17 2 13 -25.0
100b-0 11 18 18 16 4 8 -48.9 118a-0 24 35 33 31 6 21 -31.5
100b-1 27 33 35 32 4 19 -40.0 118a-1 40 51 471 46 6 27 -41.3
100b-2 4 8 9 7 3 3 -57.1 118a-2 32 46 441 41 8 28 -31.1
100b-3 15 23 23 20 5 10 -50.8 118a-3r 18 21 231 21 3 13 -37.1
101a-0 17 20 19 19 2 9 -51.8 118b-0 40 53 41 45 7 30 -32.8
101a-1 22 27 26 25 3 18 -28.0 118b-1 32 45 401 39 7 26 -33.3
101a-2 32 36 371 35 3 18 -48.6 118b-2 27 42 401 36 8 26 -28.4
101a-3r 113 18 19 17 3 6 -64.0 118b-3r 18 16 171 17 1 14 -17.6
101b-0 20 27 19 22 4 10 -54.5 119a-0 23 25 241 24 1 15 -37.5
101b-1 44 46 40 43 3 22 -49.2 119a-1 36 40 371 38 2 29 -23.0
101b-3 30 24 26 27 3 21 -21.3 119a-2 20 19 191 19 1 10 -48.3













Table 4-4. Continued.

14b-1 28 31 34 31 3 19 -38.7 31a-0 19 29 26 25 5 26 5.4
14b-2 40 48 52 47 6 22 -52.9 31a-2 3 4 4 4 1 3 -18.2
14b-3 24 31 30 28 4 22 -22.4 31a-3r 7 10 9 9 2 5 -42.3
15a-0 18 24 22 21 3 16 -25.0 31b-0 14 19 17 17 3 13 -22.0
15a-1 26 25 24 25 1 17 -32.0 31b-1 18 26 24 23 4 8 -64.7
15a-2 15 23 21 20 4 14 -28.8 31b-2 38 51 48 46 7 33 -27.7
15b-0 30 34 33 32 2 28 -13.4 31b-3r 10 9 9 9 1 5 -46.4
15b-2 24 48 45 39 13 19 -51.3 33a-0 17 22 23 21 3 10 -51.6
16a-1 16 18 19 18 2 16 -9.4 33a-1 7 10 10 9 2 2 -77.8
16a-2 20 39 40 33 11 17 -48.5 33a-2 1 5 6 4 3 0 -100.0
16a-3 18 27 25 23 5 19 -18.6 33a-3r 11 10 10 10 1 4 -61.3
16b-0 16 28 29 24 7 10 -58.9 33b-0 22 29 30 27 4 10 -63.0
16b-1 27 29 29 28 1 17 -40.0 33b-1 11 14 16 14 3 6 -56.1
26a-0 18 24 22 21 3 20 -6.2 33b-2 32 35 33 33 2 18 -46.0
26a-1 10 15 14 13 3 9 -30.8 35a-0 23 28 25 25 3 21 -17.1
26a-2 14 20 20 18 3 17 -5.6 35a-1 36 36 36 36 0 35 -2.8
26a-3 16 16 16 16 0 10 -37.5 35a-2 7 9 9 8 1 4 -52.0
26b-0 3 6 6 5 2 3 -40.0 35b-0 7 8 8 8 1 7 -8.7
26b-1 8 11 111 10 2 6 -40.0 35b-1 10 16 13 13 3 8 -38.5
26b-2 9 31 34 25 14 8 -67.6 35b-2 17 27 22 22 5 13 -40.9
27a-0 8 7 7 7 1 7 -4.5 35b-3r 4 7 7 6 2 0 -100.0
27a-1 18 24 21 21 3 20 -4.8 36a-0 26 28 21 25 4 15 -40.0
27a-2 16 16 14 15 1 17 10.9 36a-1 13 12 13 13 1 9 -28.9
27b-0 10 10 10 10 0 9 -10.0 36a-2 14 13 15 14 1 14 0.0
27b-1 9 9 9 9 0 8 -11.1 36a-3r 16 20 15 17 3 14 -17.6
27b-2 21 23 20 21 2 17 -20.3 36b-0 18 19 14 17 3 16 -5.9
27b-3 10 11 111 11 1 8 -25.0 36b-1 19 21 16 19 3 17 -8.9
28a-0 16 21 22 20 3 9 -54.2 36b-2 34 29 28 30 3 25 -17.6
28a-1 20 34 34 29 8 24 -18.2 41a-0 21 29 27 26 4 25 -2.6
28a-2 15 25 22 21 5 16 -22.6 41a-1 20 22 25 22 3 21 -6.0
28a-3 16 31 29 25 8 13 -48.7 41a-2 27 26 23 25 2 23 -9.2
28b-0 32 48 38 39 8 28 -28.8 41a-3 11 13 12 12 1 11 -8.3
28b-1 18 28 25 24 5 19 -19.7 41b-0 17 29 22 23 6 22 -2.9
28b-2 15 26 20 20 6 15 -26.2 41b-1 18 23 17 19 3 18 -6.9
29a-0 4 6 6 5 1 7 31.3 41b-2 15 26 18 20 6 21 6.8
29a-1 4 5 5 5 1 3 -35.7 42a-1 17 24 19 20 4 18 -10.0
29a-2 13 12 11 12 1 10 -16.7 42a-2 17 19 19 18 1 19 3.6
29a-3 10 9 9 9 1 14 50.0 42a-3 15 17 15 16 1 15 -4.3
29b-0 14 15 13 14 1 8 -42.9 82a-2 23 26 24 24 2 23 -5.5
29b-1 20 21 19 20 1 9 -55.0 82a-3 22 27 24 24 3 20 -17.8
29b-2 4 7 7 6 2 3 -50.0 83a-0 21 20 25 22 3 15 -31.8
30a-0 15 19 19 18 2 15 -15.1 83a-1 18 13 13 15 3 18 22.7
30a-1 19 23 22 21 2 16 -25.0 83a-2 32 30 33 32 2 25 -21.1
30a-2 42 61 52 52 10 35 -32.3 83a-3 43 41 39 41 2 31 -24.4
30b-0 28 37 33 33 5 26 -20.4 84a-1 22 20 24 22 2 22 0.0
30b-1 22 32 29 28 5 24 -13.3 84a-2 16 16 17 16 1 10 -38.8













Table 4-4. Continued.
42b-0 11 13 10 11 2 10 -11.8 5a-1 25 28 26 26 2 24 -8.9
42b-1 15 13 16 15 2 15 2.3 5a-2 20 24 20 21 2 19 -10.9
42b-2 17 24 21 21 4 20 -3.2 5a-3r 2 3 3 3 1 3 12.5
42b-3 19 26 26 24 4 24 1.4 5b-0 19 18 18 18 1 6 -67.3
43a-1 31 35 30 32 3 31 -3.1 5b-1 36 30 28 31 4 23 -26.6
43b-0 12 16 14 14 2 14 0.0 5b-2 8 23 17 16 8 4 -75.0
43b-1 20 24 25 23 3 19 -17.4 5b-3 23 24 21 23 2 8 -64.7
43b-2 8 10 7 8 2 8 -4.0 61a-0 4 4 4 4 0 2 -50.0
43b-3 14 21 17 17 4 15 -13.5 61a-1 7 6 5 6 1 1 -83.3
53a-0 16 21 24 20 4 19 -6.6 61a-2 8 4 4 5 2 4 -25.0
53a-2 4 5 5 5 1 5 7.1 61a-3r 8 6 6 7 1 3 -55.0
53a-3 6 7 6 6 1 6 -5.3 61b-0 11 10 10 10 1 10 -3.2
53b-0 11 9 11 10 1 8 -22.6 61b-1 9 10 10 10 1 7 -27.6
53b-1 13 18 15 15 3 13 -15.2 61b-2 110 12 10 11 1 5 -53.1
53b-2 7 7 5 6 1 5 -21.1 63a-0 9 15 13 12 3 5 -59.5
53b-3r 2 2 2 2 0 1 -50.0 63a-1 7 7 7 7 0 5 -28.6
54a-0 4 6 5 5 1 5 0.0 63a-2 5 5 5 5 0 1 -80.0
54a-1 20 26 24 23 3 22 -5.7 63a-3r 7 8 8 8 1 5 -34.8
54a-2 19 27 22 23 4 20 -11.8 63b-1 9 13 11 11 2 5 -54.5
54a-3 13 18 15 15 3 15 -2.2 63b-2 12 17 15 15 3 14 -4.5
54b-0 19 15 17 17 2 16 -5.9 63b-3 2 2 2 2 0 1 -50.0
54b-1 14 18 15 16 2 15 -4.3 64a-0 126 301 27 28 2 14 -49.4
54b-2 14 18 21 18 4 18 1.9 64a-1 23 31 27 27 4 14 -48.1
54b-3 21 25 18 21 4 22 3.1 64a-2 19 23 21 21 2 7 -66.7
56a-0 5 7 7 6 1 5 -21.1 64a-3r 9 14 14 12 3 6 -51.4
56a-1 13 13 11 12 1 10 -18.9 64b-0 16 261 22 21 5 10 -53.1
56a-2 8 11 9 9 2 7 -25 0 64b-1 30 29 31 30 1 19 -36.7
56a-3 11 14 12 12 2 18 45.9 64b-2 126 291 32 29 3 16 -44.8
56b-0 25 30 26 27 3 28 3.7 73a-0 24 25 24 24 1 19 -21.9
56b-1 9 12 10 10 2 8 -22.6 73a-2 3 3 3 3 0 2 -33.3
56b-2 14 17 15 15 2 10 -34.8 73a-3 9 7 9 8 1 6 -28.0
56b-3 5 5 5 5 0 5 0.0 73b-0 7 5 5 6 1 5 -11.8
5a-0 24 23 32 26 5 24 -8.9 73b-1 6 6 6 6 0 5 -16.7
84b-0 15 14 13 14 1 10 -28.6 73b-2 4 4 4 4 0 3 -25.0
84b-1 19 19 17 18 1 9 -50.9 73b-3 26 25 23 25 2 19 -23.0
84b-2 20 22 21 21 1 18 -14.3 74a-0 23 22 25 23 2 17 -27.1
84b-3 24 23 24 24 1 19 -19.7 74a-1 31 28 28 29 2 23 -20.7
8a-0 10 9 7 9 2 9 3.8 74a-2 16 13 15 15 2 8 -45.5
8a-1 16 17 15 16 1 9 -43.8 74a-3 23 22 25 23 2 19 -18.6
8a-2 19 17 20 19 2 10 -46.4 74b-0 8 8 9 8 1 6 -28.0
8a-3 27 33 27 29 3 25 -13.8 74b-1 14 12 12 13 1 6 -52.6
74b-2 15 13 15 14 1 7 -51.2 8b-0 18 20 20 19 1 17 -12.1
74b-3r 5 6 6 6 1 3 -47.1 8b-2 10 12 13 12 2 8 -31.4
76a-0 28 29 31 29 2 16 -45.5 8b-3 11 111 12 11 1 8 -29.4
76a-1 27 29 30 29 2 19 -33.7 90a-0 7 6 7 7 1 4 -40.0
76a-2 32 28 31 30 2 22 -27.5 90a-1 11 12 10 11 1 10 -9.1













Table 4-4. Continued.

76b-2 12 17 15 15 3 9 -38.6 90a-3 24 21 20 22 2 14 -35.4
77a-0 142 20 19 4 15 -19.6 90b-0 23 17 19 20 3 19 -3.4
77a-1 172 21 20 3 17 -15.0 90b-1 6 5 4 5 1 6 20.0
77a-2 16 2 21 20 4 14 -30.0 90b-2 4 2 3 3 1 3 0.0
77b-0 17 25 23 22 4 15 -30.8 90b-3 1 1 2 1 1 2 50.0
7%b-1 26 24 21 24 3 25 5.6 91a-0 23 26 29 26 3 22 -15.4
7%b-2 16 20 20 19 2 16 -14.3 91a-1 24 28 27 26 2 16 -39.2
7%b-3 8 13 11 11 3 7 -34.4 91a-2 28 36 33 32 4 21 -35.1
79a-0 3 5 5 4 1 1 -76.9 91a-3 24 21 19 21 3 15 -29.7
79a-1 1 1 1 1 0 1 0.0 91b-0 27 25 23 25 2 20 -20.0
79a-2 6 5 5 5 1 2 -62. 5 91b-2 28 29 27 28 1 21 -25.0
79a-3 8 7 7 7 1 8 9.1 91b-3 23 19 18 20 3 16 -20.0
79b-0 8 14 13 12 3 5 -57.1 92a-0 27 26 30 28 2 18 -34.9
79b-1 16 16 15 16 1 14 -10.6 92a-2 11 10 9 10 1 7 -30.0
79b-2 0 0 0 0 0 0 0.0 92a-3 18 17 16 17 1 13 -23.5
79b-3 4 4 4 4 0 3 -25.0 92b-0 21 21 21 21 0 25 19.0
7a-0 10 14 16 13 3 6 -55.0 92b-1 15 12 12 13 2 7 -46.2
7a-1 12 14 14 13 1 6 -55.0 92b-2 16 14 14 15 1 15 2.3
7a-3 2 0 2 1 1 0 -100.0 92b-3 14 14 14 14 0 9 -35.7
7b-0 3 2 2 2 1 1 -57.1 98a-1 21 39 32 31 9 22 -28.3
7b-1 2 2 2 2 0 0 -100 0 98a-2 15 33 26 25 9 15 -39.2
7b-2 5 4 6 5 1 3 -40.0 98a-3 31 45 42 39 7 31 -21.2
7b-3 6 9 10 8 2 3 -64.0 98b-0 38 51 49 46 7 34 -26.1
80a-0 18 20 18 19 1 21 12.5 98b-1 26 27 27 27 1 25 -6.3
80a-1 24 2522 24 2 25 5.6 98b-2 17 22 20 20 3 15 -23.7
80a-2 21 31 28 27 5 21 -21.3 99a-0 33 30 29 31 2 24 -21.7
80a-3 11 16 14 14 3 9 -34. 1 99a-2 30 30 30 30 0 21 -30.0
80b-0 23 2825 25 3 15 -40.8 99a-3 27 34 32 31 4 30 -3.2
80b-1 31 42 38 37 6 30 -18.9 99b-0 30 34 32 32 2 27 -15.6
82a-1 17 16 17 17 1 13 -22.0 99b-1 17 15 14 15 2 9 -41.3
12%b-
99b-2 242 25 25 2 24 -5.3 10 11 11 11 1 6 -43.8
32a
14a-0 15 25 24 21 6 13 -39.1 18-22 28 30 27 4 15 -43.8
02a
14a-1 19 21 25 22 3 12 -44.6 18-20 25 25 23 3 17 -27.1
12a
14a-2 242 28 26 2 17 -35.4 18-22 25 28 25 3 21 -16.0
128
14a-3r 14 20 22 19 4 19 1.8 18-15 17 17 16 1 17 4.1
32b
14b-0 28 42 45 38 9 21 -45.2 18-28 41 38 36 7 26 -27.1

76b-1 36 2731 31 5 25 -20.2 90a-2 23 25 23 24 1 15 -36.6
30b-2 21 25 23 23 2 19 -17.4 84a-3 13 14 12 13 1 10 -23.1

128b-2 30 361 34 33 3 23 -31.0 18-19 22 22 21 2 20 -4.8
1b
102a-3 1 6 17 19 17 2 14 -19.2 19-9 9 8 9 1 8 -7.7
1b
102b-0 110 10 10 10 0 8 -20.0 19-8 8 8 8 0 5 -37.5













Table 4-4. Continued.
102b-1 10 10 10 10 0 8 -20.0 119b-3 18 21 19 19 2 14 -27.6
102b-2 19 18 18 18 1 17 -7.3 126a-0 10 8 8 9 1 7 -19.2
102b-3r 5 5 5 5 0 4 -20.0 126a-1 6 8 7 7 1 2 -71.4
104a-0 21 16 16 18 3 15 -15.1 126a-2 8 8 8 8 0 6 -25.0
104a-1 6 6 7 6 1 4 -36.8 126b-0 10 11 10 10 1 8 -22.6
104a-2 12 14 13 13 1 6 -53.8 126b-1 5 4 4 4 1 3 -30.8
104a-3r 13 12 13 13 1 8 -36.8 126b-2 9 7 7 8 1 3 -60.9
104b-0 10 10 10 10 0 13 30.0 126b-3 12 12 11 12 1 9 -22.9
104b-1 17 13 11 14 3 15 9.8 127a-0 23 29 26 26 3 16 -38.5
104b-3 9 12 14 12 3 14 20.0 127a-1 19 19 21 20 1 14 -28.8
117a-0 40 45 43 43 3 32 -25.0 127a-3 17 28 29 25 7 13 -47.3
117a-1 29 30 31 30 1 22 -26.7 127b-0 16 18 19 18 2 9 -49.1
117a-2 35 39 41 38 3 23 -40 0 127b-1 26 36 34 32 5 22 -31.3
117a-3 20 28 26 25 4 21 -14.9 127b-2 10 11 11 11 1 8 -25.0
102a-0 20 17 19 19 2 13 -30.4 119a-3 15 23 23 20 5 15 -26.2
102a-2 22 21 21 21 1 14 -34.4 119b-0 9 13 13 12 2 5 -57.1
128b-3r 15 21 21 19 3 12 -36.8


Figure 4-11. Regression analysis between the number of fruits counted by human observation and the number of fruits counted by the fruit counting algorithm.










4.3 Execution Time for the Algorithm

Table 4-5 shows the average execution time over ten executions for each image-processing step. Conversion from RGB to HSI color space took a major portion of the execution time (65.9%), since the algorithm needed to process 640 x 480 pixels per image. Binarization was carried out in software by checking the hue, saturation, and luminance gray levels of each pixel against the threshold and classifying the pixel accordingly.

Table 4-5. Execution time for each image processing step.
Image processing step               Avg. execution time (ms)   Percent of total time (%)
Conversion from RGB to HSI          78.8                       65.9
Binarization                        28.4                       23.8
Remove noise                        6.3                        5.3
Fill gaps                           3.8                        3.2
Extract features and count fruits   2.1                        1.8
Total time                          119.5                      100.0



The execution time for binarization could be reduced considerably if it were carried out in hardware. The initialization time was also measured and was 80 ms. The average execution time including all steps was 119.5 ms. During real-time field-testing, image acquisition time needs to be added.

4.4 Encoder Calibration

The encoder was calibrated in the grove before the field-testing of the algorithm. The truck was driven over predefined distances three times, and the average number of pulses generated by the encoder was used as the reference number of pulses for each distance. Data for the encoder calibration are shown in Table 4-6. A regression analysis was conducted over the pulses generated by the encoder for the different trials (Figure 4-12). The R2 value for the regression analysis was 0.99.







D = 0.000804 Np - 0.02

where D = distance (ft) and Np = number of pulses.

Table 4-6. Encoder calibration. The number of pulses is shown for each trial.
Distance (ft)   Trial #1   Trial #2   Trial #3   Average
4               5060       4865       5316       5080.3
6               7074       7420       7764       7419.3
8               9461       10068      10165      9898.0
10              12821      12345      12460      12542.0


Figure 4-12. Regression analysis for encoder calibration (points: pulses produced by the encoder; line: pulses estimated by regression; R2 = 0.99).



The linear equation from the regression analysis was used to measure the distance traveled from a specific location by checking the number of pulses from the encoder at regular intervals. If the pulses from the encoder had reached the calculated value, it could be concluded that the truck had traveled the required distance from the previous imaging location.
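The calibration line can be reproduced from the Table 4-6 averages with ordinary least squares; a minimal sketch:

```python
# Sketch of the encoder calibration regression: fit distance (ft) vs.
# mean pulse count from Table 4-6 by ordinary least squares.

def fit_line(xs, ys):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return slope, ybar - slope * xbar

pulses = [5080.3, 7419.3, 9898.0, 12542.0]   # Table 4-6 averages
distances = [4.0, 6.0, 8.0, 10.0]             # ft
slope, intercept = fit_line(pulses, distances)
```

The fitted slope is about 0.0008 ft per pulse with a near-zero intercept, matching the regression equation reported with Figure 4-12.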










4.5 Prediction of Number of Fruits/Plot

In the grove where the citrus yield mapping system was tested, two trees in each plot were designated for hand-harvesting. These trees were hand-harvested on Feb. 6, 2004, and the number of fruits per plot (NA), average weight of fruit in a plot (AvgWt), minimum diameter of fruit in a plot (MinD), maximum diameter of fruit in a plot (MaxD), average diameter of fruit in a plot (AvgD), average boxes per tree per plot (AvgBT) and number of boxes per plot (NBP) were recorded (Table 4-7). This information was used to develop yield prediction models.

Table 4-7. Actual harvested yield data.
Plot NBP AvgBT AvgD (mm) MaxD (mm) MinD (mm) AvgWt (g) NA (fruits/plot)
5 6.8 3.4 67.0 74.7 60.9 163.7 847.8
8 1.9 1.0 72.9 79.3 67.6 211.4 183.5
14 7.6 3.8 67.6 74.8 61.8 165.4 938.1
15 5.0 2.5 70.8 77.5 65.2 191.2 533.9
16 6.8 3.4 66.7 74.4 60.7 159.8 868.4
26 5.0 2.5 73.1 81.4 67.0 208.0 490.6
27 7.2 3.6 69.5 79.2 62.1 179.1 820.5
28 7.6 3.8 72.6 80.4 66.4 205.7 754.0
29 2.8 1.4 74.7 80.5 69.8 223.2 256.0
30 5.4 2.7 72.9 80.0 67.3 206.5 533.9
31 6.3 3.2 67.4 74.5 61.7 169.9 756.8
33 1.0 0.5 73.9 82.5 67.5 222.5 91.7
35 5.9 3.0 70.4 78.8 64.3 189.5 635.6
36 5.9 3.0 71.6 78.2 66.3 197.7 609.3
41 9.4 4.7 68.3 77.0 61.9 170.7 1124.3
42 7.6 3.8 74.1 81.2 68.3 219.5 706.7
43 6.3 3.2 67.5 75.3 61.9 167.5 767.6
53 6.3 3.2 75.3 82.2 69.4 233.2 551.5
54 8.1 4.1 73.1 81.7 66.2 209.6 789.0
56 7.6 3.8 74.3 82.5 68.2 214.7 722.5
61 1.5 0.8 76.1 82.2 71.0 238.8 128.2
63 5.4 2.7 69.0 77.3 62.3 177.5 621.0
64 6.3 3.2 70.6 78.9 64.1 187.3 686.7
73 7.2 3.6 68.0 77.4 61.5 172.7 851.2
74 8.1 4.1 74.5 81.3 68.9 227.4 727.0
76 8.1 4.1 69.0 76.1 63.2 179.2 922.7
77 5.9 3.0 71.4 77.8 66.2 196.4 613.3
79 5.4 2.7 75.7 82.4 70.2 235.1 468.9










Table 4-7. Continued.
Plot NBP AvgBT AvgD (mm) MaxD (mm) MinD (mm) AvgWt (g) NA (fruits/plot)
119 6.3 3.2 66.1 73.8 60.0 157.0 819.1
126 4.6 2.3 69.4 76.9 63.5 182.6 514.3
127 7.6 3.8 71.0 79.3 64.6 193.5 801.5
128 7.2 3.6 70.5 79.4 63.6 195.1 753.2
80 9.4 4.7 66.0 76.1 58.9 154.9 1238.9
82 7.2 3.6 67.8 76.8 60.7 166.3 883.6
83 7.6 3.8 70.5 78.4 64.4 188.7 822.3
84 8.1 4.1 68.5 78.7 61.8 160.4 1030.6
90 5.9 3.0 70.4 77.2 64.9 188.8 638.0
91 9.0 4.5 70.3 77.0 64.8 191.5 959.2
92 6.3 3.2 74.8 81.7 69.1 233.8 550.0
98 6.8 3.4 71.0 77.5 65.8 190.1 730.2
99 6.3 3.2 69.7 76.8 64.0 182.8 703.5
100 4.1 2.1 77.1 82.7 72.3 245.2 341.3
101 7.6 3.8 72.1 79.3 66.3 198.4 782.1
102 6.8 3.4 68.6 76.9 62.6 172.0 807.1
104 6.8 3.4 75.2 81.9 69.9 227.4 610.3
117 9.0 4.5 68.1 76.5 62.4 170.6 1076.9
118 9.4 4.7 66.5 75.0 60.3 156.9 1222.5



There were cases in which images of an entire plot were not taken. For example, images were taken only on the west side of plots 82 and 83, since moisture sensors (tensiometers) were installed on the other side of the plot. Also, on plot 43, due to the operator's mistake, images were taken for only one tree on one side while images were taken for both trees on the other side. Hence, these three plots (82, 83, and 43) were removed from the data analysis. Regression analysis was carried out between NA and the number of fruits/plot predicted by the fruit counting algorithm. It was found that plot 7 was an outlier, and it was subsequently removed from further data analysis. Fruits/plot were predicted with three models using the following three variables: NPfruits, NPpixels, and NPfruits-pixels.

Fruits/plot data for the remaining 44 plots were sorted in ascending order based on these three variables. For each model, alternate plots were chosen and combined into two groups; one was used as a calibration data set and the other as a validation data set, Table 4-8.

Table 4-8. Number of plots in calibration and validation data sets to develop prediction
models.
Calibration Validation Total Plots
22 22 44
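The alternating calibration/validation split described above can be sketched as follows. The data and the dictionary keys are illustrative, not the thesis plot records.

```python
# Sketch of the calibration/validation split described above: plots are
# sorted by a model variable, then alternated between the two groups so
# that each set spans the full range of the variable.
def alternate_split(plots, key):
    """Sort `plots` (dicts) by `key`, then alternate them into two groups."""
    ordered = sorted(plots, key=lambda p: p[key])
    calibration = ordered[0::2]   # every other plot, starting with the first
    validation = ordered[1::2]    # the remaining plots
    return calibration, validation
```

Because the plots are sorted before being alternated, both groups cover low-, mid-, and high-yield plots rather than one group capturing only one end of the range.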


Fruits/plot data for the 44 plots were sorted based on these three variables. Then, alternate plots were chosen and combined into two groups, one used as the calibration data set and the other as the validation data set, so that the data were evenly distributed throughout the entire range. Regression analysis was conducted between NA and the variables NPfruits, NPpixels, and NPfruits-pixels for the calibration data set, and the results are shown in Figures 4-13, 4-14, and 4-15. A second-degree polynomial equation was estimated using Excel to fit the data between NA and the number of fruits/plot counted by the algorithm (NPfruits) for the calibration data set. The R2 value for the regression analysis was 0.47.




NA = -0.0339(NPfruits)^2 + 17.112 NPfruits

where NPfruits = number of fruits/plot counted by the fruit counting algorithm.

A second-degree polynomial equation was estimated using Excel to fit the data between NA and the number of citrus pixels/plot (NPpixels) for the calibration data set. The R2 value for the regression analysis was 0.32.



NA = 0.00007(NPpixels)^2 + 0.051 NPpixels - 79.69

where NPpixels = number of citrus pixels/plot counted by the fruit counting algorithm.










[Figure: NA (fruits/plot, 0-2500) vs. fruits/plot estimated by the fruit counting algorithm (0-180); R2 = 0.47.]

Figure 4-13. Regression analysis between NA and NPfruits.



[Figure: NA (fruits/plot, 0-3000) vs. number of citrus pixels counted by the fruit counting algorithm (0-80,000); R2 = 0.32.]

Figure 4-14. Regression analysis between NA and NPpixels.










A second-degree polynomial equation was estimated using Excel to fit the data between NA and the number of fruits/plot estimated from citrus pixels (NPfruits-pixels) for the calibration data set. The R2 value for the regression analysis was 0.46.




NA = -0.182(NPfruits-pixels)^2 + 39.52 NPfruits-pixels - 369.62

where NPfruits-pixels = number of fruits/plot estimated using the citrus pixels/plot counted by the fruit counting algorithm.
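The quadratic fits above were produced in Excel; an equivalent fit can be sketched with NumPy. The data below are illustrative, not the thesis measurements.

```python
import numpy as np

# Sketch of a second-degree polynomial fit with its R^2 value,
# mirroring the Excel trendline fits described above.
def fit_quadratic(x, y):
    """Fit y = a*x^2 + b*x + c by least squares; return (coeffs, R^2)."""
    coeffs = np.polyfit(x, y, deg=2)          # [a, b, c]
    y_hat = np.polyval(coeffs, x)             # fitted values
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot
```

Calling `fit_quadratic` with a plot's algorithm counts as `x` and the hand-harvested counts as `y` would reproduce the kind of coefficients and R2 values quoted in the text.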


[Figure: NA (fruits/plot, 0-3000) vs. fruits/plot estimated from citrus pixels/plot (0-140); R2 = 0.46.]

Figure 4-15. Regression analysis between NA and NPfruits-pixels.


4.6 Yield Prediction Model

A yield prediction model was developed using the NPfruits-pixels variable. This variable was preferred because it used the average size of the fruit in the estimation of the number of fruits in a plot, and its R2 value was also among the highest of the three approaches. The model was applied to the 22 plots in the validation data set to estimate the number of fruits/plot. The number of fruits/plot estimated from citrus pixels/plot by the machine vision algorithm was used in the yield prediction model.

Yield predicted for the 22 plots in the validation data set using the NPfruits-pixels model is tabulated in Table 4-9. The percentage error was as low as 0.1% for plot 101 and as high as 214.8% for plot 33. The main cause of the high error was that, with a single camera, it was not possible to cover the entire citrus tree. Fruits inside the canopy would have been completely occluded by leaves in the images; hence, the fruit counting algorithm was not able to identify these occluded fruits. The yield estimation model depends on the imaged scene of a particular tree. If a large share of the fruits on a particular tree was not captured in the image, the model predicted much less yield than the actual harvested yield. On the other hand, if a tree with low yield had its fruits concentrated in a dense region that was captured by the camera, then the model predicted more yield than the actual harvested yield. Since fruits were spread throughout the tree canopy in irregular patterns, yield estimation based on a portion of a tree was not very successful. The smallest number of images needed to estimate yield in a grove depends on the amount of variability present in the grove. If the variability is very high, then acquiring images of many trees is the best approach to predict the yield accurately. On the other hand, if the yield is relatively uniform, then acquiring images of only some trees would be enough to predict the yield of the grove accurately.










Table 4-9. Performance comparison of the yield prediction model for 22 plots.
Plot Actual yield (fruits/m2) Predicted yield (fruits/m2) Error (%)
33 4.9 15.5 214.8
79 25.2 18.9 -25.2
90 34.3 22.6 -34.1
63 33.4 26.7 -20.0
36 32.7 28.1 -14.3
27 44.1 33.3 -24.5
28 40.5 34.6 -14.7
26 26.4 36.7 39.3
99 37.8 37.4 -1.1
53 29.6 37.9 27.9
127 43.1 38.7 -10.2
41 60.4 39.6 -34.5
30 28.7 40.9 42.5
101 42.0 42.1 0.1
56 38.8 42.6 9.7
98 39.2 43.8 11.6
35 34.2 43.9 28.5
119 44.0 44.3 0.6
80 66.6 45.7 -31.4
91 51.6 47.5 -7.9
128 40.5 47.5 17.4
117 57.9 47.4 -18.1



Before the experiment, it was considered that keeping the camera 5.2 m high and aimed at 45 degrees with respect to the ground would cover the majority of the tree canopy. However, during the field-testing, it was found that the resolution of the image was not good with this setup. Hence, in order to take clear, high-quality images, the camera lens was zoomed in by a factor of two, thus covering a small percentage of the tree canopy. If multiple cameras were used to cover the majority of the tree canopy, then the model could be used to predict yield with improved accuracy. A regression analysis was conducted between the yield estimated by the yield prediction model and the actual yield for the 22 plots, Figure 4-16. The R2 value for the regression analysis was 0.46, the RMSE was 45.1 fruits/m2, and the CV was 70.42%.
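The validation metrics quoted above can be sketched as follows. The R2 here is computed directly from residuals against the predictions (the thesis reports a regression R2, which can differ slightly), and the arrays are illustrative, not the thesis data.

```python
import numpy as np

# Sketch of the validation metrics discussed above: R^2 (from residuals),
# RMSE, and CV (RMSE as a percentage of the mean actual yield).
def validation_metrics(actual, predicted):
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    resid = actual - predicted
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((actual - actual.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    cv = 100.0 * rmse / float(actual.mean())   # coefficient of variation, %
    return r2, rmse, cv
```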











[Figure: actual harvested yield (0-70 fruits/m2) vs. yield predicted by the model (YE); R2 = 0.46.]

Figure 4-16. Regression analysis between yield prediction model and the actual harvested yield.


ArcView software from the Environmental Systems Research Institute (ESRI) of Redlands, CA, was used to create yield maps. A Digital Orthographic Quarter-Quad (DOQQ) 1-meter resolution photograph was overlaid beneath the yield maps (Figures 4-17, 4-18, and 4-19). For qualitative analysis, the yield data were arbitrarily classified into three classes based on the yield distribution, Table 4-10. Actual and predicted yield for the 22 validation plots, classified into the three classes, are shown in Figure 4-17. Out of the 22 validation plots, there were seven misclassifications in yield category between the actual and predicted yield.

Table 4-10. Yield category for 22 plots.
Yield (fruits/m2) Yield category
0-20 Low
20.1 -40 Medium
40.1-70 High
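The three-class binning of Table 4-10 can be sketched as:

```python
# Sketch of the yield classification in Table 4-10.
def yield_category(fruits_per_m2: float) -> str:
    """Classify a plot's yield (fruits/m^2) into the three classes above."""
    if fruits_per_m2 <= 20.0:
        return "Low"
    if fruits_per_m2 <= 40.0:
        return "Medium"
    return "High"
```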









Traveling speed, based on the current processing time of 119 ms for an image 1.67 m wide, was determined to be 14.03 m/s (31.38 miles/hour). This would be the upper limit if the current system were implemented in a moving mode, in which images would be continuously acquired as the truck is driven between the tree rows in the citrus grove.
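The arithmetic behind that upper-limit speed is simply the ground width of one image divided by the per-image processing time:

```python
# Worked check of the upper-limit travel speed quoted above.
image_width_m = 1.67   # ground width covered by one image
processing_s = 0.119   # per-image processing time (119 ms)

speed_mps = image_width_m / processing_s    # ~14.03 m/s
speed_mph = speed_mps * 3600.0 / 1609.344   # ~31.38 miles/hour
```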

Actual yield (YA) for the 22 calibration plots and the predicted yield (YE) for the

22 validation plots are shown in Figure 4-18. Yield calculated based on images is shown

in Figure 4-19.









































[Map: actual vs. predicted yield for the 22 validation plots, each classified as Low, Medium, or High; scale bar 100 m.]

Figure 4-17. Performance of yield prediction model (fruits/m2).








































[Map: actual yield in five classes (6.9-18.3, 18.3-33, 33-43.4, 43.4-55.4, and 55.4-65.7 fruits/m2) and predicted yield in five classes (15.5-18.9, 18.9-28.1, 28.1-39.6, 39.6-44.3, and 44.3-47.5 fruits/m2); scale bar 100 m.]

Figure 4-18. Yield mapping for citrus fruits (fruits/m2).




































[Map: yield based on the number of citrus fruits in an image, in five classes (0-1.3, 1.3-2.6, 2.6-3.9, 3.9-5.2, and 5.2-7.7 fruits/m2); scale bar 100 m, north arrow.]

Figure 4-19. Yield based on number of citrus fruits in an image (fruits/m2).











4.7 Yield Variability based on Rootstock Variety

Fruits/plot data, grouped by rootstock variety, were subjected to an analysis of variance (SAS PROC ANOVA). The results showed that yields from the Cleopatra mandarin and Swingle citrumelo rootstocks were grouped together and were significantly higher than those from the Carrizo citrange variety. The mean fruits/plot for the three different rootstock varieties is shown in Figure 4-20.

Table 4-11. Fruits/plot for 48 plots grouped using a means test.
Tukey grouping Mean number of fruits/plot Number of plots Citrus rootstock
A 1608.5 16 Cleopatra mandarin
A 1649.8 16 Swingle citrumelo
B 1036.5 16 Carrizo citrange
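The study ran this comparison in SAS (PROC ANOVA with Tukey grouping); the one-way ANOVA F statistic underlying such a test can be sketched in plain NumPy. The data below are illustrative, not the thesis plot counts.

```python
import numpy as np

# Sketch of a one-way ANOVA F statistic, as used to compare fruits/plot
# across the three rootstocks (the study used SAS PROC ANOVA).
def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA across the given groups."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_obs = np.concatenate(groups)
    grand_mean = all_obs.mean()
    k = len(groups)                      # number of groups
    n = len(all_obs)                     # total observations
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one rootstock mean differs; the Tukey procedure then decides which means group together, as in Table 4-11.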


[Bar chart: mean fruits/plot by rootstock variety, approximately 1608 for Cleopatra mandarin, 1650 for Swingle citrumelo, and 1037 for Carrizo citrange.]

Figure 4-20. Yield variability based on rootstock variety.















CHAPTER 5
SUMMARY AND CONCLUSIONS

This work presented a machine vision algorithm to identify and count the number of citrus fruits in an image and, finally, to estimate the yield of citrus fruits on a tree. Within this chapter, a summary of the work and the conclusions drawn from it are presented, and improvements and modifications to the existing experiment are suggested.

5.1 Summary and Conclusions

The fruit counting algorithm developed in this research verified the feasibility of developing a real-time machine vision system to estimate citrus yield on-the-go. The algorithm consisted of image acquisition, binarization of the color image in the hue-saturation color plane, pre-processing to remove noise and to fill gaps, and, finally, counting the number of fruits. Pixel values for three classes (citrus, leaf, and background) were manually sampled from 25 calibration images and used for binarizing the color image in the hue-saturation color plane to separate the citrus fruits from the background. It was found that the fruit, leaf, and background portions of the citrus grove scene were clearly classified using thresholds in the hue-saturation color plane.
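The hue-saturation binarization can be sketched as follows. The threshold window and saturation floor below are illustrative assumptions, not the values derived from the 25 calibration images.

```python
import numpy as np

# Sketch of the hue-saturation binarization described above. The hue window
# and saturation floor are illustrative placeholders; the thesis derived its
# thresholds from manually sampled citrus, leaf, and background pixels.
def binarize_hs(hue, sat, hue_range=(10, 40), sat_min=100):
    """Return a 0/1 mask of candidate citrus pixels from hue/saturation planes."""
    hue = np.asarray(hue)
    sat = np.asarray(sat)
    in_hue = (hue >= hue_range[0]) & (hue <= hue_range[1])
    return (in_hue & (sat >= sat_min)).astype(np.uint8)
```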

A threshold on fruit area was applied to the binarized image to remove noise, and a combination of dilation and erosion was used to enhance the image for proper identification of the citrus fruits. Blob analysis was used to identify the citrus fruits, and the total number of blobs gave the number of citrus fruits in an image. A cluster of fruits was partially identified using the average area of a fruit and was counted as two fruits instead of one in the algorithm. The total time for processing an image was 119.7 ms, excluding image acquisition time. The algorithm was tested on 329 validation images, and the R2 value between the number of fruits counted by the machine vision algorithm and the average number of fruits counted by human observers was 0.79. The variation in the number of fruits correctly classified was partially due to clusters of citrus fruits, uneven lighting, and occlusion.
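The area threshold and blob counting summarized above can be illustrated with a minimal pure-Python sketch. The study used a machine vision library's blob analysis; the flood-fill, the area parameters, and the two-fruit cluster rule below are illustrative stand-ins for that pipeline.

```python
# Sketch of the noise-area threshold, blob counting, and cluster heuristic
# described above. `min_area` and `avg_fruit_area` are illustrative values.
def count_fruit_blobs(mask, min_area=5, avg_fruit_area=20):
    """Count fruits in a binary mask: small blobs are discarded as noise,
    and blobs larger than ~2x the average fruit area count as two fruits."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    fruits = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # flood-fill one 4-connected blob and measure its area
                stack, area = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if area >= min_area:                      # noise threshold
                    fruits += 2 if area > 2 * avg_fruit_area else 1
    return fruits
```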

Images belonging to the same plot were grouped together, and the data from 22 plots were used to predict fruits/plot with the following three variables:

1) number of fruits/plot counted by the fruit counting algorithm (NPfruits),

2) number of citrus pixels/plot counted by the fruit counting algorithm (NPpixels), and

3) number of fruits/plot estimated using citrus pixels/plot data (NPfruits-pixels).

The yield prediction model was developed using the NPfruits-pixels variable. The model was applied over 22 plots, and the R2 value between the yield predicted by the model and the actual harvested yield was 0.46. The main cause of the low R2 was that, with a single camera, it was not possible to cover the entire citrus tree. Further, fruits inside the canopy would have been completely occluded by leaves in the images; hence, the fruit counting algorithm was not able to identify these occluded fruits. The results indicate that the yield prediction model could be enhanced by using multiple cameras to cover the majority of the tree canopy.

5.2 Future Work

Highly non-uniform illumination in an image presented a problem for the color vision based segmentation approach. One improvement to the present system would be to improve the imaging of natural outdoor scenes with wide variation in illumination. Automatic brightness control before imaging could be implemented by using a phototransistor to measure the intensity of the imaging scene and sending a control signal to the camera to change its shutter speed and brightness level. A study should be conducted to determine whether nighttime imaging with the machine vision system under artificial lighting improves the image acquisition module.

Two other areas for future work deal with the problems of clustered fruits and of fruits that are partially occluded from view. Statistical estimates should be developed to account for occluded fruits and fruit clusters in order to classify all the fruits in an image correctly. Another improvement would be to use multiple cameras to capture the entire citrus tree; this would increase the correlation between the number of fruits on a tree and the number of fruits estimated by the yield prediction model. An ultrasonic/laser sensor could be used to measure the distance between the camera and the imaging scene in order to estimate the size of the fruits.
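The size estimation suggested above follows from the pinhole-camera relation: a fruit's physical diameter is its image diameter scaled by the camera-to-scene distance over the focal length. The focal length and pixel pitch below are illustrative assumptions, not parameters of the thesis camera.

```python
# Sketch of fruit-size estimation from a range measurement, using the
# pinhole relation: real_size = (pixels * pixel_pitch) * distance / focal_length.
# The default focal length and pixel pitch are illustrative assumptions.
def fruit_diameter_mm(diameter_px: float, distance_mm: float,
                      focal_length_mm: float = 8.0,
                      pixel_pitch_mm: float = 0.0074) -> float:
    """Estimate a fruit's diameter (mm) from its image diameter in pixels."""
    size_on_sensor_mm = diameter_px * pixel_pitch_mm
    return size_on_sensor_mm * distance_mm / focal_length_mm
```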
































BIOGRAPHICAL SKETCH

The author was born in 1980 in Chennai, India. He graduated with a Bachelor of Science degree in electronics and communication engineering in May 2001 from the Government College of Technology, Coimbatore, India. He then obtained a Master of Science degree in electrical and computer engineering in December 2003 from the University of Florida.