Estimation of Count and Mass of Citrus Fruit Drop Using Machine Vision


Material Information

Title:
Estimation of Count and Mass of Citrus Fruit Drop Using Machine Vision
Physical Description:
1 online resource (66 p.)
Language:
english
Creator:
Choi, Daeun
Publisher:
University of Florida
Place of Publication:
Gainesville, Fla.
Publication Date:

Thesis/Dissertation Information

Degree:
Master's ( M.S.)
Degree Grantor:
University of Florida
Degree Disciplines:
Agricultural and Biological Engineering
Committee Chair:
LEE,WON SUK
Committee Co-Chair:
EHSANI,REZA JOHN
Committee Members:
ROKA,FRITZ MICHAEL

Subjects

Subjects / Keywords:
cmnp -- hlb -- illumination-normalization -- image-processing -- outdoor-imaging -- varying-illumination
Agricultural and Biological Engineering -- Dissertations, Academic -- UF
Genre:
Agricultural and Biological Engineering thesis, M.S.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract:
A machine vision system for estimating the count and mass of citrus fruit drop was developed. The objectives were to design rugged hardware for outdoor imaging, to develop an image processing algorithm to estimate the count and mass of dropped citrus fruit, and to create a geo-referenced fruit drop map. Image acquisition hardware was developed specifically for outdoor imaging conditions in a commercial citrus grove. The image processing algorithm included normalization of illumination, citrus detection using a logistic regression classifier, and least-squares circle fitting for estimating the size and mass of the fruit. Performance of the machine vision algorithm was analyzed using two criteria. The first was accuracy, defined as the ability to detect citrus without missing any fruit; accuracy varied across experiment trials, from a high of 89 percent to a low of 59 percent. The second was the rate of false positives, i.e., background objects incorrectly detected as citrus; this also varied between trials, from a high of 30 percent to a low of 2.6 percent. The experiments showed that different areas of the citrus groves had different counts and masses of dropped fruit, because each area was subject to different site-specific factors such as nutrient levels, soil pH, diseases such as citrus greening, and canopy size. Among these factors, spraying CMNP before harvesting in particular caused significant drop, whereas CMNP sprayed in past seasons had no specific effect on citrus drop.
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility:
by Daeun Choi.
Thesis:
Thesis (M.S.)--University of Florida, 2013.
Local:
Adviser: LEE,WON SUK.
Local:
Co-adviser: EHSANI,REZA JOHN.

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Classification:
lcc - LD1780 2013
System ID:
UFE0046014:00001




Full Text

PAGE 1

1 ESTIMATION OF COUNT AND MASS OF CITRUS FRUIT DROP USING MACHINE VISION By DAEUN CHOI A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE UNIVERSITY OF FLORIDA 2013

PAGE 2

2 © 2013 Daeun Choi

PAGE 3

3 To my family and love, for all their support

PAGE 4

4 ACKNOWLEDGMENTS I would like to express my sincere gratitude to all of the people who have helped me through this amazing experience during my studies; it would not have been possible without the support of the people around me. First and foremost, I would like to thank my advisor, Dr. Won Suk Lee, for giving me the chance to pursue my graduate studies and for his guidance and support throughout my research and thesis. I would like to thank my supervisory committee members, Dr. Reza Ehsani and Dr. Fritz Roka, for their advice. I also would like to express my appreciation for my amazing lab mates and the ideas they shared with me; I truly enjoyed our conversations and discussions. I am thankful to Mr. Zingaro as well for helping me finish my hardware design. Lastly, I would like to express my deepest appreciation to my parents and fiancé for always loving me and believing in me. I cannot imagine how hard it would have been without them by my side.

PAGE 5

5 TABLE OF CONTENTS

Page

ACKNOWLEDGMENTS 4
LIST OF TABLES 7
LIST OF FIGURES 8
ABSTRACT 9

CHAPTER

1 INTRODUCTION 12
    Citrus Greening Disease (HLB) 13
    Precision Agriculture 14
    Mechanical Harvesting 14
    Machine Vision 15
    Thesis Objective 16
    Related Literature 16
        Citrus Detection and Counting Algorithm 16
        Various Fruit Detection Algorithms 17
        Citrus Mass Estimation Algorithm 19
        Comparison of the Research in the Thesis and Related Articles 19

2 MATERIALS AND METHODS 21
    Image Acquisition 21
        Hardware 21
        Image Acquisition Software 25
        Field Experiment 26
    Machine Vision Algorithm 28
        Image Used in Algorithm 29
        RGB Color Space and Intensity 29
        Normalization of Illumination 34
        … 36
        Classification 41
        Citrus Classification Using Entropy Texture Analysis 44
        Counting and Mass Estimation 45

3 RESULT AND DISCUSSION 48

4 CONCLUSION 61

LIST OF REFERENCES 64

PAGE 6

6 BIOGRAPHICAL SKETCH 66

PAGE 7

7 LIST OF TABLES

Table  Page

1-1 Florida 2012/2013 estimated production drop (unit: thousand boxes, USDA NASS, 2013) 13
2-1 RGB values of citrus, leaf, dead leaf and soil in normalized image 36
3-1 Measurement of diameter in number of pixels for three different sizes of white balls 48
3-2 2D look-up table (bilinear) for estimating actual size of detected fruit 49
3-3 Measured diameter (cm), actual mass (g) and estimated value 49
3-4 Performance of algorithm for accuracy (skirted canopy, Lykes grove) 53
3-5 Performance analysis for false positives (skirted canopy, Lykes grove) 54
3-6 Performance analysis for ability to avoid missing fruit and estimated mass (unskirted canopy, Duda grove) 54
3-7 Performance analysis for ability to avoid false positive error (unskirted canopy, Duda grove) 55

PAGE 8

8 LIST OF FIGURES

Figure  Page

1-1 Citrus production in Florida, California and Texas states (USDA NASS, 2013) 12
1-2 Canopy shakers and catch frame 15
2-1 Camera used in the experiment (NI 1772C Smart Camera, National Instruments Corporation) 22
2-2 Input and output of cameras 23
2-3 Encoder attached on a small tire on the back side of truck 24
2-4 Final hardware setup 25
2-5 Front panel of image acquisition software. This software includes image display, input status and GPS coordinate indicator 26
2-6 View of skirted canopy in commercial citrus grove for mechanical harvesting and dropped fruit due to the disease and CMNP 27
2-7 Tub for hand harvesting 27
2-8 Goat truck 28
2-9 Flow chart of machine vision algorithm 28
2-10 Image quality comparison 29
2-11 Histogram of R, G and B components: citrus, leaf, dead leaf, twig, tree and soil 31
2-12 Color variation in R, G, and B components according to different average intensity values 33
2-13 Example image of varying illumination 35
2-14 Normalization of illumination 36
2-15 Histogram … 39
2-16 Histogram … 40
2-17 Simple logistic regression classifier model 42
2-18 Example of classification result 43

PAGE 9

9
2-19 Visualization of entropy value. Brighter pixel represents the greater value 45
2-20 Morphological operations after entropy filter. Filling holes and removing small objects 45
2-21 Calibration images 47
3-1 Regression of the size and the mass of actual citrus measurements 51
3-2 Final result example 52
3-3 Full view of fruit drop map in Lykes grove 57
3-4 Zoomed view of fruit drop map in Lykes grove 58
3-5 Full view of fruit drop map in Duda grove 59
3-6 Zoomed view of fruit drop map in Duda grove 60

PAGE 10

10 Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science ESTIMATION OF COUNT AND MASS OF CITRUS FRUIT DROP USING MACHINE VISION By Daeun Choi December 2013 Major: Agricultural and Biological Engineering A machine vision system for estimating the count and mass of citrus fruit drop was developed. The objectives were to design rugged hardware for outdoor imaging, to develop an image processing algorithm to estimate the count and mass of dropped citrus fruit, and to create a geo-referenced fruit drop map. Image acquisition hardware was developed specifically for outdoor imaging conditions in a commercial citrus grove. The image processing algorithm included normalization of illumination, citrus detection using a logistic regression classifier, and least-squares circle fitting for estimating the size and mass of the fruit. Performance of the machine vision algorithm was analyzed using two criteria. The first was accuracy, defined as the ability to detect citrus without missing any fruit; accuracy varied across experiment trials, from a high of 89 percent to a low of 59 percent. The second was the rate of false positives, i.e., background objects incorrectly detected as citrus; this also varied between trials, from a high of 30 percent to a low of 2.6 percent.

PAGE 11

11 The experiments showed that different areas of the citrus groves had different counts and masses of dropped fruit, because each area was subject to different site-specific factors such as nutrient levels, soil pH, diseases such as citrus greening, and canopy size. Among these factors, spraying CMNP before harvesting in particular caused significant drop, whereas CMNP sprayed in past seasons had no specific effect on citrus drop.
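The "least-squares circle fitting" step named in the abstract can be illustrated with the classic algebraic (Kåsa) formulation: the circle equation is rewritten as a linear system and solved by ordinary least squares. This is a generic sketch of the technique, not the thesis's actual implementation; the function name and test data are illustrative.

```python
import numpy as np

def fit_circle(xs, ys):
    """Algebraic (Kasa) least-squares circle fit.

    Rewrites x^2 + y^2 = 2*a*x + 2*b*y + c as a linear system in
    (a, b, c); the center is (a, b) and the radius sqrt(c + a^2 + b^2).
    """
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs**2 + ys**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# Points sampled from a circle of radius 3 centered at (10, 5)
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
xs = 10 + 3 * np.cos(t)
ys = 5 + 3 * np.sin(t)
cx, cy, r = fit_circle(xs, ys)
print(round(cx, 2), round(cy, 2), round(r, 2))  # -> 10.0 5.0 3.0
```

Because the fit is linear, it is fast and needs no initial guess, which makes it a common choice for sizing roughly circular fruit regions in binary masks.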

PAGE 12

12 CHAPTER 1 INTRODUCTION Citrus fruit is one of the most valuable crops in agricultural and international fruit markets. The consumption of citrus has exhibited worldwide growth since the late 19th century. Citrus fruit can be produced only in particular areas of the world for geographical and climatic reasons. The most significant share of citrus production comes from a few countries, such as the United States, Brazil, and countries near the Mediterranean. The United States was the second largest producer in 2012 with a total of 8.2 million tons, which amounts to 15% of total world citrus production. In the United States, the state of Florida is the largest citrus producer. According to the United States Department of Agriculture (USDA) National Agricultural Statistics Service (NASS), 146 million boxes of citrus were produced in 2011-2012 in Florida (Figure 1-1), which accounted for 71 percent of the entire production in the United States. Figure 1-1. Citrus production in Florida, California and Texas states (USDA NASS, 2013).

PAGE 13

13 Citrus Greening Disease (HLB) Huanglongbing (HLB), an exotic citrus disease also called citrus greening, has spread extremely rapidly. The disease often causes small, bitter-tasting, improperly colored fruit. Additionally, HLB is considered one of the main causes of early citrus fruit drop, which consequently decreases production. Table 1-1 illustrates the decrease in estimated production for the 2012/2013 season. In October 2012, the USDA expected 74 million boxes of non-Valencia oranges and 80 million boxes of Valencia oranges to be produced in Florida. However, mostly because of early citrus fruit drop from a bad combination of HLB and unfavorable weather, estimated production dropped to 67 million boxes of non-Valencia and 71 million boxes of Valencia oranges in April 2013.

Table 1-1. Florida 2012/2013 estimated production drop (unit: thousand boxes, USDA NASS, 2013).

                October   April
Non-Valencia    74000     67000
Valencia        80000     71000
All Oranges     154000    138000

To facilitate management of HLB for growers in citrus groves and to anticipate changes in future production, evaluating how much early fruit drop is occurring, and where, is an important task. In other words, estimating the amount of yield decrease resulting from early citrus fruit drop at specific locations would help citrus growers manage the disease efficiently. Once an accurate estimate of the fruit drop is obtained, efficient management of HLB can be achieved by creating a geo-referenced fruit drop map, which is one of the major concepts in precision agriculture.

PAGE 14

14 Precision Agriculture Precision agriculture, also called site-specific crop management, is the management of inputs for crop production in a site-specific way (John Deere, 2010). Under site-specific management, each zone is treated differently according to its characteristics, which include soil nutrients, yield amount, soil pH, diseases, moisture content, etc. The first step in implementing precision agriculture is creating an in-field spatial variability map, which can be created using a Geographic Information System (GIS). Based on the characteristics of each zone, a Variable Rate Technology (VRT) system, incorporated with a GPS receiver, distributes inputs according to the in-field spatial variability of different factors so that the crop can be grown in optimal conditions. Mechanical Harvesting About 93 percent of citrus is harvested manually. Manual harvesting involves significant labor and time. In an effort to reduce harvesting costs, mechanical citrus harvesting has been implemented as an alternative. One type of mechanical harvester, the continuous canopy shake and catch harvester, is shown in Figure 1-2. A pair of canopy shakers works at the same time, shaking the two sides of the canopy simultaneously, and the harvested fruit are caught in a catch frame. To evaluate the performance of the mechanical harvester, the count and mass of the citrus fruit it harvests should be measured. However, the canopy shakers could not catch all the harvested fruit, since there was a gap between the two catch frames that allowed fruit to fall to the ground. To obtain an accurate estimate of the efficiency of the mechanical harvester, the count and mass of the dropped fruit on the ground also need to be counted.

PAGE 15

15 Figure 1-2. Canopy shakers and catch frame. Additionally, to make mechanical harvesting more efficient, a chemical compound, 5-chloro-3-methyl-4-nitro-1H-pyrazole (CMNP), was developed to remove fruit with less force. However, CMNP has a downside: it causes a significant amount of early fruit drop from the trees before harvesting. In order to identify the effect of spraying CMNP, estimating the amount of early citrus fruit drop is also important. Machine Vision Due to the huge production areas, manual estimation of citrus fruit drop is a time-consuming and labor-intensive task. Automation of fruit drop count and mass estimation can be accomplished using machine vision. Machine vision is a tool consisting of imaging devices and processors for automatic visual inspection in applications such as surveillance, vehicle guidance, and process control. Machine vision can also provide accurate information for precision agriculture in many applications, such as fruit sorting and yield estimation (Yang et al., 2012) and disease detection (Li et al., 2013; Pourreza et al., 2013).

PAGE 16

16 Thesis Objective The overall goal of this research was to develop an automated system for identifying the count and mass of citrus fruit drop using machine vision. The specific objectives were as follows: 1. To build a rugged hardware system for image acquisition in a citrus grove, 2. To develop a machine vision algorithm that will successfully estimate the number of citrus fruit along with their mass, and 3. To create a spatial variability map for citrus fruit drop. These objectives will benefit citrus growers in several ways, for example: 1. To identify the efficiency of citrus mechanical harvesters, and 2. To analyze the impact of CMNP or HLB on early fruit drop. Related Literature Citrus Detection and Counting Algorithm Annamalai and Lee (2003) developed a citrus yield mapping system using machine vision. The objective was to develop a machine vision system for detecting and counting citrus fruit in images to provide an estimated yield map effectively. The algorithm dealt with two types of images: close-view images containing 2 or 3 oranges and normal-view images including 8 to 10 oranges. The algorithm consisted of thresholding in a hue-saturation plane and counting citrus pixels. The coefficient of determination (R²) was 0.76 between manual counts of citrus in the images and counts by the algorithm. The source of errors was citrus fruit occluded by leaves. Annamalai and Lee (2004) also developed a real-time system for estimating citrus yields, with an algorithm similar to the 2003 study. However, the R² value was 0.42 between counts by the algorithm and actual harvested

PAGE 17

17 citrus, and 0.53 between the yield production model developed in that study and actual data. They pointed out that errors came from a limited number of cameras. Another source of error was the counting method for overlapped citrus: the algorithm could not count the citrus when more than 3 fruits were overlapped. Patel et al. (2012) suggested automatic segmentation for yield estimation of various fruits such as citrus, apple, pomegranate, peach, and plum using color and shape. Images were acquired in RGB color space and then transformed into the L*a*b* color space, where color features were extracted and used to detect fruit regions. Circle fitting was then performed to find fruits in the images. However, the average error on the validation set was high (31%), since small oranges were not detected by the algorithm. Various Fruit Detection Algorithms Stajnko et al. (2003) developed a method to estimate the number and size of apples in an orchard. Thermal images of the temperature gradient between fruit and background were used. For the fruit detection algorithm, simple thresholding was conducted in RGB color space along with a normalized difference index (NDI). The coefficient of determination (R²) ranged from 0.83 to 0.88 between manually counted numbers and estimates by the algorithm. Apple diameter was estimated from the longest segment measurement by multiplying it by the pixel/mm proportion; for the diameter estimation, R² ranged between 0.68 and 0.70. Another apple detection and counting algorithm was developed by Zhao et al. (2005) to detect apple fruit on trees. A stereo vision system was used to acquire images with accurate apple positions. Redness was used to differentiate

PAGE 18

18 apples from the background. Texture analysis was also applied to find edges, which were combined with the redness. After detection, circle fitting was performed to find the positions of the apples in the images. Ling et al. (2004) detected mature tomatoes in grayscale images obtained by dividing the R component by the G component in RGB color space. This method removed differences in illumination conditions when detecting mature tomatoes. Edges were found with a chain code technique, and then the Circular Hough Transform (CHT) and chord construction were applied to find the center locations of the tomatoes. Results from the chord construction algorithm were less accurate than CHT; however, chord construction was faster and more suitable for a real-time system. Chi et al. (2004) developed a chord reconstruction algorithm for detecting tomatoes by circle finding, intended for robotic harvesters. The result was compared with the Circular Hough Transform (CHT). The chord reconstruction was more suitable for a real-time system since its processing time was much faster than CHT, and the algorithm was able to detect occluded tomatoes; however, it was less accurate than CHT. Yang et al. (2007) developed a color segmentation (CLG) method to detect tomatoes in images. The CLG method was based on the color structure code (CSC), which links or connects regions according to color similarity. In CLG, a group of pixels called an island grows or connects if the color distance between same-level islands is within a threshold. After growing, islands under a threshold size were discarded, and the rest were considered tomato objects.
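The Circular Hough Transform referenced by Ling et al. (2004) and Chi et al. (2004) can be sketched for its simplest case: a single, known radius. Each edge pixel votes for every center that would put it on a circle of that radius, and peaks in the vote accumulator mark circle centers. This is a minimal illustration, not the algorithm from either paper; the function name, the fixed-radius restriction and the synthetic edge data are assumptions.

```python
import numpy as np

def hough_circle_centers(edge_points, radius, shape, n_angles=360):
    """Minimal circular Hough transform for a single, known radius.

    Each edge point votes for all candidate centers lying `radius`
    away from it; accumulator peaks correspond to circle centers.
    """
    acc = np.zeros(shape, dtype=int)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for (x, y) in edge_points:
        cx = np.rint(x - radius * np.cos(thetas)).astype(int)
        cy = np.rint(y - radius * np.sin(thetas)).astype(int)
        ok = (cx >= 0) & (cx < shape[0]) & (cy >= 0) & (cy < shape[1])
        np.add.at(acc, (cx[ok], cy[ok]), 1)  # unbuffered vote accumulation
    return acc

# Synthetic edge: points on a circle of radius 10 centered at (30, 40)
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
edges = [(30 + 10 * np.cos(a), 40 + 10 * np.sin(a)) for a in t]
acc = hough_circle_centers(edges, radius=10, shape=(64, 64))
center = np.unravel_index(np.argmax(acc), acc.shape)
print(center)  # peak lands at or next to the true center (30, 40)
```

Extending this to unknown radii adds a third accumulator dimension, which is exactly the computational cost that motivated the faster chord-based methods discussed above.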

PAGE 19

19 Citrus Mass Estimation Algorithm Chinchuluun et al. (2009) developed a machine vision system to identify citrus fruit and its size on a canopy shake and catch harvester. An image acquisition system was developed and mounted on the harvester. A citrus detection and size identification algorithm was developed using a Bayesian classifier. First, color information was collected in the Hue, Saturation and Intensity (HSI) color space and the YIQ (luma and chrominance) color space. After fruit pixels were detected, watershed segmentation was applied and the total area of citrus fruit was measured. R² was 0.962 between actual weight and the sum of the total citrus fruit area, 0.963 between actual weight and the sum of fruit diameters, and 0.892 between citrus fruit count and actual weight. Shin et al. (2012) developed a system for postharvest citrus mass estimation. For citrus detection, color information in the Hue, Saturation and Intensity (HSI) color space and the luminance, blue-chrominance and red-chrominance (YCbCr) color space was used, with a logistic regression classifier. During segmentation, high-saturation pixels in fruit areas were recovered by a highly saturated area recovering algorithm (HSAR). A watershed algorithm using the H-minima transform was used to segment connected fruit in order to identify the number of fruit and fruit sizes. Comparison of the Research in the Thesis and Related Articles The studies by Chinchuluun et al. (2009) and Shin et al. (2012) are similar to this research in terms of their objectives, which were citrus fruit detection and mass estimation. Images in those studies were taken in an enclosed box, so the field of view was narrower and the illumination condition was under control. However, this research dealt with outdoor imaging, which had extremely varying illumination conditions

PAGE 20

20 and a wide field of view. Since the colors of objects in images varied significantly under non-uniform illumination, a more sophisticated algorithm, reliable under changing illumination conditions, was developed for this research. The studies by Annamalai and Lee (2003, 2004), Patel et al. (2012), Stajnko et al. (2003) and Zhao et al. (2005) were conducted in outdoor conditions. Their images were acquired with a close and narrow field of view, so the fruit were relatively large and easy to detect. However, the experiments in this study dealt with distant citrus fruit in images, so the fruit size was very small. Additionally, the images included a more complicated background containing many objects such as soil, grass and irrigation pipe.
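One standard way to reduce the effect of the varying outdoor illumination discussed above is to convert RGB values to chromaticity coordinates, so that a uniform scaling of all three channels (brighter or darker light) cancels out. This is a common normalization shown purely for illustration; the thesis's own illumination normalization method may differ, and the pixel values below are made up.

```python
import numpy as np

def normalize_illumination(rgb):
    """Convert RGB pixels to chromaticity: each channel divided by R+G+B.

    Multiplying all three channels by the same illumination factor
    cancels, so shaded and sunlit pixels of one material map together.
    """
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on pure black pixels
    return rgb / s

sunlit = np.array([[200.0, 120.0, 40.0]])  # hypothetical citrus pixel
shaded = sunlit * 0.35                     # same surface under less light
print(normalize_illumination(sunlit))
print(normalize_illumination(shaded))      # identical chromaticity
```

After such a normalization, a color classifier can use thresholds or decision boundaries that hold for both sunlit and shaded fruit, which is the property outdoor detection depends on.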

PAGE 21

21 CHAPTER 2 MATERIALS AND METHODS Image Acquisition Hardware The machine vision system consisted of two parts. In this chapter, the image acquisition equipment and camera triggering device are described. The machine vision system had two color CCD cameras, each with a microprocessor, two VGA monitors, metal mounting frames for the vehicle, and an encoder. The cameras used were smart cameras (NI 1772C, National Instruments Corporation, Austin, TX), each containing its own microprocessor (1.6 GHz, Intel Atom). The cameras had … images. The cameras were designed to withstand harsh conditions, which was the main reason they were used in this study. The majority of industrial cameras are designed for indoor applications and would fail when used in outdoor situations. This study required acquiring images in a commercial citrus grove, whose sandy, dusty and wet conditions made image acquisition a tough task. Figure 2-1(A) shows a frontal view of the camera. A metal shield was installed on top of the cameras for protection from tree branches and dew. Figure 2-1(B) illustrates a side view of the camera mounted on a pan and tilt frame. The camera had three connection lines: to a terminal block, VGA/USB, and Ethernet. The terminal block had eight isolated input/output channels and a power supply.

PAGE 22

22 Figure 2-1. Camera used in the experiment (NI 1772C Smart Camera, National Instruments Corporation). A) Front side of camera with protection shield on top and connection lines (1: terminal block, 2: Gigabit Ethernet, 3: USB/VGA). B) Side view of camera mounted on the pan and tilt frame. Photo courtesy of author, Daeun Choi.

PAGE 23

23 Figure 2-2. Input and output of cameras. A) Terminal block of the cameras attached on the mounting frame. B) Schematic flow chart of input and output of the camera. These three lines were in charge of transferring and saving data during the experiment. Digital data from an encoder and GPS were transmitted to the camera through the terminal block shown in Figure 2-2(A). Figure 2-2(B) shows a schematic diagram of the input and output data connections of the cameras. The encoder was used as an external triggering device, which helped avoid overlapped areas between images. The

PAGE 24

24 type of encoder used here was an incremental rotary encoder, which generated digital pulses (0 and 1) when its shaft rotated. The encoder was attached to a small tire located at the back of the truck where the camera was installed (Figure 2-3). Images were acquired every 0.9 m, since the shortest horizontal field of view of an image was approximately 0.9 m. When image acquisition was triggered, a GGA sentence from a DGPS receiver was saved to record the time and position where the image was taken. Images and GPS information were saved on USB memory drives inserted in the cameras and were later used for creating a geo-referenced fruit drop map. Communication between the camera and the receiver was through an industrial RS-232 serial port. Figure 2-3. Encoder attached on a small tire on the back side of the truck. Photo courtesy of author, Daeun Choi.
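The distance-based triggering described above can be sketched as a simple pulse counter: convert encoder pulses to traveled distance and fire the camera each time 0.9 m has accumulated. Only the 0.9 m spacing comes from the text; the encoder resolution and trigger-tire diameter below are illustrative assumptions, not values stated in the thesis.

```python
import math

# Assumed parameters (illustrative only; not specified in the thesis)
PULSES_PER_REV = 1000      # encoder pulses per shaft revolution (assumed)
WHEEL_DIAMETER_M = 0.30    # trigger tire diameter in meters (assumed)
TRIGGER_SPACING_M = 0.9    # image spacing taken from the text

meters_per_pulse = math.pi * WHEEL_DIAMETER_M / PULSES_PER_REV
pulses_per_image = round(TRIGGER_SPACING_M / meters_per_pulse)

def run(pulse_stream):
    """Count encoder pulses and fire a trigger every `pulses_per_image`."""
    triggers, count = 0, 0
    for _ in pulse_stream:
        count += 1
        if count >= pulses_per_image:
            triggers += 1  # here the camera capture and GGA logging would fire
            count = 0
    return triggers

# Over 90 m of travel the camera fires roughly once per 0.9 m
total_pulses = int(90 / meters_per_pulse)
print(run(range(total_pulses)))
```

Counting whole pulses between triggers keeps the spacing independent of vehicle speed, which is why an encoder rather than a fixed timer was the right triggering device here.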


Figure 2-4. Final hardware setup. A DGPS receiver and two cameras attached to the mounting frame. Red circle: camera with metal shield; purple circle: terminal block; light blue circle: DGPS receiver. Photo courtesy of author, Daeun Choi.

Figure 2-4 shows the final hardware setup used in the experiments. Two cameras were installed, one on each side of the truck, 0.5 m above the ground. A DGPS receiver was installed along with the cameras.

Image Acquisition Software

The camera had a microprocessor on which a real-time operating system was installed. Executable software for image acquisition was developed using LabVIEW 2012 (National Instruments Corporation, Austin, TX). This image acquisition software, developed as a graphical user interface (GUI) shown in Figure 2-5, had three main purposes: displaying an image, indicating digital input from the encoder, and displaying position information from the DGPS receiver. If a digital pulse was detected, a green status light illuminated to inform the user. An input box displayed the number of pulses from the encoder. If the number of pulses reached the specified number, the


software triggered the camera to acquire an image. The baud rate for RS-232 communication with the DGPS receiver could also be adjusted in the software. Position information sent from the DGPS receiver was displayed on the front panel.

Figure 2-5. Front panel of the image acquisition software. The software includes an image display, an input status indicator and a GPS coordinate indicator.

Field Experiment

The detection and counting algorithm for citrus fruit was developed using images from several field experiments in a Duda & Sons grove (Immokalee, FL) and a Lykes Bros. Inc. grove (Ft. Basinger, FL). The average row spacing for the three citrus groves was 7.3 m and the tree spacing was 4.6 m. Since dropped citrus fruit was the object of interest in the experiments, images covered a wide area of the ground, especially under the canopy. The clearance between the ground and the lowest canopy for hand-harvested rows at the Duda grove was less than 30 cm. While this would not be a problem for hand harvesting, it turned out to be troublesome for image acquisition with the prototype system developed in this study. For the mechanically harvested rows at the Lykes grove, the


canopy was skirted in order to make it accessible by a mechanical harvester, so the lowest canopy was about 46 cm from the ground (Figure 2-6).

Figure 2-6. View of skirted canopy in a commercial citrus grove for mechanical harvesting, and dropped fruit due to disease and CMNP. Photo courtesy of author, Daeun Choi.

Field experiments were conducted in various situations during hand and mechanical harvesting. For hand harvesting, multiple crews picked citrus fruit from the trees manually. Each crew member had a bag for the harvested citrus and, after filling the bag, transferred the fruit to the tub shown in Figure 2-7. The harvested fruit in the tubs was then collected by the goat truck shown in Figure 2-8.

Figure 2-7. Tub for hand harvesting. Photo courtesy of author, Daeun Choi.


Figure 2-8. Goat truck. Photo courtesy of author, Daeun Choi.

Machine Vision Algorithm

After image acquisition, the machine vision algorithm consisted of five main steps: image normalization, classification, texture analysis, circle fitting, and estimation of fruit count and mass. Image normalization was applied to remove the effect of varying illumination conditions. A logistic regression classifier was then used for classification. Entropy values were calculated to remove background objects incorrectly detected during classification. After detection, circles were fitted to measure the diameter and position of each fruit for estimating the mass of dropped fruit in the images.

Figure 2-9. Flow chart of the machine vision algorithm.


Images Used in the Algorithm

A total of 1470 images was used in the experiment, each with a resolution of 480 by 640 pixels. The pixel located at the top-left corner can be expressed as I(1, 1), where I is the image matrix. The resolution represents the quality of an image. Figure 2-10 shows a comparison of high-resolution and low-resolution images: Figure 2-10(A) shows a low-resolution image, while Figure 2-10(B) shows a high-resolution image with a clearer view of objects and more detail.

Figure 2-10. Image quality comparison. A) 640 × 480. B) 3648 × 2736. Photo courtesy of author, Daeun Choi.

The image resolution was chosen to be 480 × 640 to allow faster processing time. Although low-resolution images could be processed faster, they required the algorithm to be more precise, since objects in the images had fewer pixels than in high-quality images.

RGB Color Space and Intensity

After acquisition, images were saved on a USB drive in BMP format. The images were represented in the regular RGB color space, one of the three-channel color representation models. The R channel is the red image, G the green image and B the blue image. Each channel has an 8-bit depth value, which


ranges from 0 to 255 (2^8 levels). A full-color image refers to a composite of the red, green and blue images, with a depth of 24 bits from the combination of the three 8-bit channel values. For convenience, all values of the R, G and B channels can be represented in a normalized format in the range [0, 1] (Equation 2-1):

(r, g, b) = (R/255, G/255, B/255)    (2-1)

Among the original RGB images, 10% were randomly chosen as a training set and the rest were assigned to a validation set. The training set was used to collect color and size information of objects. To obtain RGB color information for each object, the image pixels were classified into four classes based on the objects found in the training set: 1) citrus; 2) leaf; 3) dead leaf, twig and tree; and 4) soil. Each object in the training set was manually cropped and analyzed. The color information was analyzed in the RGB color space and converted into other color spaces: 1) hue, saturation and value (HSV); 2) luminance, blue-difference chroma and red-difference chroma (YCbCr); 3) lightness and the a and b color-opponent dimensions (Lab); and 4) luma and chrominances (YIQ). The conversion formulas are given later in this chapter. Figure 2-11 shows histograms of the R, G and B components in the RGB color space. In Figure 2-11(A), the citrus class had R component values ranging from 50 to 255. Although the average R component for leaf, dead leaf, twig, tree and soil was relatively lower than for citrus, the entire citrus range from 50 to 255 overlapped with the other classes.


Figure 2-11. Histograms of the R, G and B components (A, B and C, respectively) for citrus, leaf, dead leaf, twig, tree and soil. The classes overlap and share similar color values.


Likewise, the citrus class was placed in the mid range of the G component histogram in Figure 2-11(B), overlapping most of its area with the other classes. For the B component in Figure 2-11(C), all classes had lower pixel values and overlapped each other. This overlap means that all the classes shared similar color values. One of the main reasons for the wide overlap among the classes was the differing illumination conditions of the images. The images were taken outdoors in the citrus grove under direct sunlight. Illumination was not controllable, which caused the most critical problem for image processing. In this study, illumination was defined using intensity, which refers to the amount of light in a pixel. Intensity can be defined in various forms, but here it was calculated by Equation 2-2 (ITU, 1995):

I = 0.299R + 0.587G + 0.114B    (2-2)

In Equation 2-2, R, G and B represent the values of each component in the RGB color space. This definition is the same as that of the Y component in both the YIQ and YCbCr color spaces. Using the intensity, the average illumination of an image was calculated as the average of its pixel intensities. The illumination conditions in the training set were divided into three classes, with intensity values ranging over 0-115, 115-164 and 164-255 for the low, mid and high illumination classes, respectively. The varying illumination caused dramatic changes in the color values of the images. For example, citrus in high illumination conditions had a color similar to yellow, while citrus under low illumination had a brown color.
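The intensity definition and the three illumination classes can be expressed directly in code (a sketch: the weights are the BT.601 luma coefficients of Equation 2-2, and the thresholds follow the class ranges above):

```python
def intensity(r, g, b):
    """Pixel intensity (Equation 2-2), ITU-R BT.601 luma weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def illumination_class(mean_intensity):
    """Illumination class of an image from its average intensity:
    0-115 low, 115-164 mid, 164-255 high."""
    if mean_intensity < 115:
        return "low"
    if mean_intensity < 164:
        return "mid"
    return "high"
```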


Figure 2-12. Color variation in the R, G and B components (A, B and C, respectively) according to different average intensity values.


Figure 2-12(A) and (B) display the color variation in the citrus class according to illumination conditions. Depending on the illumination, the range of each component value varied. In particular, the G and B components in Figure 2-12(B) and (C) demonstrate significant differences in color among the low, mid and high illumination classes. In Figure 2-12, the four classes (citrus, leaf, dead leaf and soil) all overlapped with other objects in the histograms. On top of that, the different illumination conditions of the images increased the range of overlap, which made it even more difficult to identify citrus fruit from the background. These extended overlapping areas in the histograms added difficulty to the classification of objects.

Normalization of Illumination

Additionally, the illumination condition changed significantly even within an image, because the images covered a wide area (0.9 m horizontally and 2.1 m vertically) that included the ground under the canopy. Figure 2-13 shows one example image used in the experiments. The ground had shadows in some areas, which made the colors of objects darker. In contrast, in areas without shadow, the soil had an excessive amount of white due to the high intensity. This varying condition made segmentation complex, because multiple segmentation models would be required for the different illumination levels. Even if different segmentation methods were applied depending on the illumination level, correct classification would not be easy, because citrus pixels usually contained higher intensity values than other pixels such as soil and twig. In other words, citrus in a mid illumination condition had higher intensity values than soil or twig pixels in a high illumination condition.


Figure 2-13. Example image of varying illumination. Under the shadow, objects were likely to have darker colors; without the shadow, the pixels had high intensity.

Therefore, a process for removing the effect of the illumination condition was required. In this study, normalization of illumination was proposed to relieve drastic changes in intensity level. The normalization of illumination is defined in Equation 2-3:

(R′, G′, B′) = (R/I, G/I, B/I) × 255    (2-3)

where I is the intensity defined in Equation 2-2. By Equation 2-3, the effect of different illumination was removed and normalized. Dividing each component value by the intensity gave the R, G and B component values per unit intensity, and multiplying by 255 made each class more distinguishable from the others by increasing the color differences. In Figure 2-13, the citrus fruit had a yellow color in the high intensity area and a dark orange color in the shadow area.
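A per-pixel sketch of Equation 2-3, assuming (as the saturated values in Table 2-1 suggest) that results above 255 saturate at the 8-bit maximum:

```python
def normalize_illumination(r, g, b):
    """Illumination normalization (Equation 2-3): divide each channel by
    the pixel intensity (Equation 2-2), scale by 255, saturate at 255."""
    i = 0.299 * r + 0.587 * g + 0.114 * b
    if i == 0:
        return (0, 0, 0)
    return tuple(min(255, round(c / i * 255)) for c in (r, g, b))
```

For example, a bright yellowish pixel such as (200, 200, 0) maps to (255, 255, 0), the distinctive citrus value in Table 2-1, while any gray pixel maps to white regardless of how dark the shadow was.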


Figure 2-14. Normalization of illumination. Citrus fruit under different illumination conditions has similar color values compared to Figure 2-13.

In Figure 2-14, however, the intensity level throughout the image became approximately uniform, so the citrus fruit had an identical yellow color rather than different colors depending on the illumination. This process helped citrus fruit to be segmented more easily from other background objects. Table 2-1 shows the RGB color values of each object in the normalized image. After normalization, citrus had a distinctive value (255, 255, 0) compared to the other objects, which appeared green, cyan and white.

Table 2-1. RGB values of citrus, leaf, dead leaf and soil in the normalized image

Component   Citrus (Yellow)   High-intensity leaf (Green)   Leaf, soil and dead leaf (Cyan)   Highly saturated area (White)
R           255               0                             0                                 255
G           255               255                           255                               255
B           0                 0                             255                               255

Conversion to HSV and YCbCr Color Spaces

Normalized images were converted into the hue, saturation and value (HSV) color space and the luminance, blue-difference and red-difference chroma (YCbCr)


color space. For a regular RGB color image, the conversion formulas are defined in Equations 2-4, 2-5 and 2-6 (Gonzalez, 2008):

H = 60 × ((g − b)/Δ mod 6) if max(r, g, b) = r
H = 60 × ((b − r)/Δ + 2) if max(r, g, b) = g
H = 60 × ((r − g)/Δ + 4) if max(r, g, b) = b    (2-4)

S = Δ / max(r, g, b), with S = 0 when max(r, g, b) = 0    (2-5)

V = max(r, g, b)    (2-6)

where Δ = max(r, g, b) − min(r, g, b). To convert from the RGB color space to the HSV color space, normalized RGB values in the range [0, 1] were used. Note that the normalized RGB values are not the same as the normalization of illumination; the normalized values are defined per channel in Equations 2-7, 2-8 and 2-9:

r = R/255    (2-7)
g = G/255    (2-8)
b = B/255    (2-9)

so that r, g, b ∈ [0, 1]. Also, the RGB to YCbCr color space conversion formulas are defined in Equations 2-10, 2-11 and 2-12 (ITU, 1995).


Y′ = 16 + 65.481r + 128.553g + 24.966b    (2-10)

Cb′ = 128 − 37.797r − 74.203g + 112.000b    (2-11)

Cr′ = 128 + 112.000r − 93.786g − 18.214b    (2-12)

By Equations 2-10, 2-11 and 2-12, the normalized values were converted into the Y′Cb′Cr′ color space, whose values are offset to start from 16 rather than 0. Figure 2-15 and Figure 2-16 show histograms of the components computed from the illumination-normalized images. Unlike regular RGB images, images with normalization of illumination had similar intensity, so each class had smaller overlapping areas. In Figure 2-15, citrus showed distinctive variation from background objects including leaf, dead leaf, twig, soil and trees. The V′ component is another way of defining the amount of light; since each class was processed by normalization of illumination, the classes had similar V′ values. In the histograms in Figure 2-16, the Y′ value represents intensity as defined in Equation 2-2. The intensity was normalized in the previous section; therefore, all classes had similar intensity values (Figure 2-16(A)). In the Cb′ histogram (Figure 2-16(B)), the high intensity (highly saturated) area was distinguishable from the other classes. In Figure 2-16(C), the citrus class is distinguishable from the tree, twig, dead leaf and soil classes.
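The two conversions can be sketched as follows. The hue/saturation/value computation uses Python's standard colorsys module (hue scaled to [0, 1] rather than degrees), and the Y′Cb′Cr′ coefficients are the ITU-R BT.601 values assumed here for Equations 2-10 to 2-12:

```python
import colorsys

def rgb_to_hsv(r, g, b):
    """HSV from normalized RGB in [0, 1] (cf. Equations 2-4 to 2-6).
    colorsys returns hue in [0, 1]; multiply by 360 for degrees."""
    return colorsys.rgb_to_hsv(r, g, b)

def rgb_to_ycbcr(r, g, b):
    """Y'CbCr from normalized RGB in [0, 1] (ITU-R BT.601,
    cf. Equations 2-10 to 2-12); outputs are offset to start at 16."""
    y = 16 + 65.481 * r + 128.553 * g + 24.966 * b
    cb = 128 - 37.797 * r - 74.203 * g + 112.0 * b
    cr = 128 + 112.0 * r - 93.786 * g - 18.214 * b
    return y, cb, cr
```

For example, pure red maps to hue 0 with full saturation and value, and black maps to (16, 128, 128) in Y′Cb′Cr′.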


Figure 2-15. Histograms of the H′, S′ and V′ components (A, B and C, respectively).


Figure 2-16. Histograms of the Y′, Cb′ and Cr′ components (A, B and C, respectively).


These color components were able to separate each class from the others. Therefore, three color components were chosen to compose the training information for a logistic regression classifier.

Classification

The color information demonstrated in the previous section was used for training a classifier. This type of classifier is called a memory-based classifier (Atkeson et al., 1997). There are several well-known memory-based classifiers, such as K-nearest neighbor (KNN), naive Bayesian (simple Bayesian) and logistic regression classifiers (Deng, 1998). The KNN classifier is known for good overall performance; however, it may yield low accuracy near class boundaries, since it does not discriminate between the different classes there, and it is also likely to be affected by data density in the boundary area. A naive Bayesian classifier likewise tends to give less accurate results, since it is based on the strong assumption that the data follow a Gaussian distribution, which is not suitable for most cases. In this algorithm, a logistic regression classifier was used for separating citrus from the background. A logistic regression classifier is straightforward and simple, and it extrapolates well. Figure 2-17 shows a simple model for logistic regression classification. The classifier assigns classes (0 and 1) to input data based on the probability of belonging to one of the categories. The probability is calculated by a function shown as the solid grey line in Figure 2-17. To estimate the function, the logistic model in Equation 2-13 was used.


Figure 2-17. Simple logistic regression classifier model (Deng, 1998).

The logistic model and its parameters are

P(x) = 1 / (1 + e^−(w0 + w·x))    (2-13)

w = [w1, w2, w3]    (2-14)

x = [x1, x2, x3]    (2-15)

where x is an input data vector and w is a coefficient vector. The coefficients w = [w1, w2, w3] and w0 in Equation 2-13 were computed by fitting the color values obtained from the training data set, arranged in the vector x as in Equation 2-15, where x1, x2 and x3 are the values of the three chosen color components. The estimated function was used to classify the pixels in the images. Figure 2-18(A) shows an original image. The logistic regression classifier processed each pixel in the image, and the output value from the classification was assigned to every pixel; that is, each pixel was given the probability that it belongs to the citrus class. The confidence level was set to 0.95, so the classifier assigned the citrus class to pixels with probability equal to or greater than 0.95; pixels with probability less than 0.95 were classified as background objects.
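The classification rule of Equations 2-13 to 2-15 with the 0.95 confidence level can be sketched as follows (the coefficient values used below are placeholders, not the thesis's fitted values):

```python
import math

def citrus_probability(x, w, w0):
    """Logistic model (Equation 2-13): P = 1 / (1 + exp(-(w0 + w.x)))."""
    z = w0 + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

def is_citrus(x, w, w0, confidence=0.95):
    """Assign the citrus class only when the probability is >= 0.95."""
    return citrus_probability(x, w, w0) >= confidence
```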


Figure 2-18. Example of classification result. A) Original image. B) Binary image after classification using logistic regression.

Figure 2-18(B) shows the resulting image after logistic classification; compared to the original image, most of the citrus pixels were classified into the citrus class.


However, not only citrus pixels were detected: pixels of leaf, tree and twig that had high intensity were also classified as citrus.

Citrus Classification Using Entropy Texture Analysis

Using entropy texture analysis, only citrus was extracted. The entropy of a pixel represents the randomness of the pixel with respect to its neighborhood. For example, if an image were pure black with no contrasting objects, every pixel would have very low entropy; if there were white objects on the black background, giving heavy contrast, the boundary pixels of the objects would have high entropy values. In Figure 2-18(B), the citrus had solid boundaries, so it had great contrast with the background, while noise pixels were scattered and their texture did not contain significant contrast. With these features, an entropy filter was applied to analyze the texture of objects. The entropy value of every pixel in the images was calculated using Equation 2-16:

E = −Σᵢ pᵢ log2(pᵢ)    (2-16)

where pᵢ is the probability of the i-th gray level occurring in the neighborhood of the pixel. The resulting image, which visualizes the entropy values, is shown in Figure 2-19; brighter pixels had higher entropy, while darker pixels had lower entropy. After applying the filter, the image retained only pixels with entropy values equal to or greater than 0.95. Figure 2-20 shows the result after the filter and morphological operations such as filling holes inside citrus regions and removing small noise objects.
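Local entropy in the sense of Equation 2-16 can be sketched over a pixel neighborhood as follows (the neighborhood size used in the thesis is not reproduced here):

```python
import math
from collections import Counter

def local_entropy(neighborhood):
    """Shannon entropy of the gray levels in a pixel's neighborhood:
    -sum(p_i * log2(p_i)). Uniform regions give 0; high-contrast
    boundaries give large values."""
    n = len(neighborhood)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(neighborhood).values())
```

A flat patch yields entropy 0, while a half-black, half-white patch (maximum contrast) yields entropy of 1 bit.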


Figure 2-19. Visualization of entropy values. Brighter pixels represent greater values.

Figure 2-20. Morphological operations after the entropy filter: filling holes and removing small objects.

Counting and Mass Estimation

After detecting the citrus fruit, count and mass were estimated using least square circle fitting. The number of fitted circles became the count of detected citrus fruit. The circle fitting method provided the position and diameter of each detected citrus, which were important information for estimating the mass. The least square circle fitting was based on the circle equation shown in Equation 2-17.


(x − a)² + (y − b)² = r²    (2-17)

where x is the coordinate in the column axis of the boundary pixels of detected fruit, y is the coordinate in the row axis of the boundary pixels of detected fruit, (a, b) is the circle center and r is the radius. With the coordinates of the boundary pixels of detected citrus, the coefficients (a, b and r) that minimized the norm of Equation 2-17 were estimated. In the algorithm, the 2-norm was used to estimate the coefficients; it is defined in Equation 2-18:

‖e‖₂ = (Σᵢ eᵢ²)^(1/2)    (2-18)

Conversion to the mass of detected citrus involved a couple of steps. First, the position and diameter of a detected citrus fruit calculated from the least square circle fitting were converted into the actual size of the citrus. In images, objects appear bigger at close distance, while farther objects appear smaller. For accurate estimation of the actual size, the position was therefore used as well as the diameter. A bilinear relationship between the actual size of citrus and the position and diameter in images was modeled from a calibration data set. In Figure 2-21(B), Styrofoam balls of three different sizes (6.4, 7.6 and 10.2 cm) were measured for their diameter in pixels according to their position in the image. The camera angle and height were fixed at the same angle and height as in the field experiment. A conversion table was created to find the actual size using the diameter in pixels and the position in the image.
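A least-squares circle fit in the sense of Equations 2-17 and 2-18 can be sketched with the algebraic (Kasa) formulation, which rewrites the circle as x² + y² + Ax + By + C = 0 and solves the 3 × 3 normal equations. This is one common way to realize the least-square fit; the thesis's exact formulation may differ.

```python
def fit_circle(points):
    """Least-squares circle through boundary points [(x, y), ...].
    Returns (center_x, center_y, radius)."""
    # Accumulate the normal equations M u = rhs for rows [x, y, 1]
    # and target v = -(x^2 + y^2).
    sxx = sxy = syy = sx = sy = 0.0
    svx = svy = sv = 0.0
    for x, y in points:
        v = -(x * x + y * y)
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y
        svx += v * x; svy += v * y; sv += v
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, float(len(points))]]
    rhs = [svx, svy, sv]
    # Gaussian elimination (no pivoting; the normal matrix is well
    # conditioned for non-degenerate boundary point sets).
    for i in range(3):
        for j in range(i + 1, 3):
            f = m[j][i] / m[i][i]
            for k in range(3):
                m[j][k] -= f * m[i][k]
            rhs[j] -= f * rhs[i]
    u = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        u[i] = (rhs[i] - sum(m[i][k] * u[k] for k in range(i + 1, 3))) / m[i][i]
    a, b, c = u
    cx, cy = -a / 2.0, -b / 2.0
    return cx, cy, (cx * cx + cy * cy - c) ** 0.5
```

Points sampled exactly on a circle recover its center and radius; noisy boundary pixels give the best fit in the 2-norm sense of Equation 2-18.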


Figure 2-21. Calibration images. A) Measuring the relationship between the diameter and the mass of actual fruit. B) Measuring the diameter in pixels according to ball size (6.4, 7.6 and 10.2 cm).

For the second step, the estimated actual size of the citrus was converted into mass. This step involved another calibration data set relating the diameter and the mass of citrus. Figure 2-21 shows the citrus used in the mass and diameter measurements. A total of 43 citrus fruit was measured for mass and diameter. Two diameters were measured, major (long) and minor (short), and their average was calculated. The relationship between mass and average diameter was then estimated by second-order polynomial curve fitting. To sum up, after fitting circles on the detected citrus, the number of circles was taken as the fruit count, the diameter and position were converted to the actual size of the citrus, and the actual size was converted to the mass of the fruit in the images.
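The first conversion step can be sketched as a bilinear lookup; the calibration numbers below are a small subset taken from Table 3-2 in the next chapter, used here for illustration only:

```python
# Calibration subset (from Table 3-2): pixel diameter by image row for
# three known ball sizes.
SIZES_CM = [6.4, 7.6, 10.2]
ROWS = [160, 200]
PIXEL_DIAMETERS = {160: [25.1, 29.0, 39.7], 200: [29.2, 34.6, 46.8]}

def _interp(x, xs, ys):
    """Piecewise-linear interpolation with linear extrapolation."""
    hi = next((i for i, v in enumerate(xs) if v >= x), len(xs) - 1)
    hi = max(hi, 1)
    lo = hi - 1
    t = (x - xs[lo]) / (xs[hi] - xs[lo])
    return ys[lo] + t * (ys[hi] - ys[lo])

def actual_size_cm(row, diameter_px):
    """Interpolate the calibration columns to the query row, then invert
    pixel diameter -> actual diameter in cm (bilinear lookup)."""
    px_at_row = [_interp(row, ROWS, [PIXEL_DIAMETERS[r][i] for r in ROWS])
                 for i in range(len(SIZES_CM))]
    return _interp(diameter_px, px_at_row, SIZES_CM)
```

Matching the worked example in the next chapter, a fruit at row 160 with a 29.0-pixel diameter maps to 7.6 cm.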


CHAPTER 3
RESULTS AND DISCUSSION

First, results of measuring the actual diameter of citrus and the diameter in images are shown in Table 3-1. Balls of three different sizes (6.4, 7.6 and 10.2 cm) were measured for their diameter in images according to their position (row number) in the images.

Table 3-1. Measured diameters, in pixels, of three different sizes of white balls

Actual size (cm)   Row number in image   Radius (pixels)   Diameter (pixels)
6.4                223                   15.5              30.9
6.4                95                    9.9               19.8
6.4                167                   12.6              25.1
6.4                73                    8.6               17.2
6.4                124                   11.7              23.4
7.6                218                   18.5              36.9
7.6                161                   14.5              29.0
7.6                94                    10.6              21.2
7.6                124                   12.3              24.7
7.6                73                    9.7               19.3
10.2               212                   24.9              49.9
10.2               158                   19.9              39.7
10.2               91                    14.1              28.2
10.2               121                   16.6              33.2
10.2               71                    12.1              24.2

The row number ranged from 1 (the farthest row from the camera) to 480 (the closest row to the camera). At the same distance from the camera, the diameter in images increased linearly as the actual ball size increased. Additionally, for balls of the same size, the diameter in images decreased linearly with distance from the camera. Based on the measurements in Table 3-1, linear interpolation and extrapolation were applied to estimate the bilinear relationship between the diameter and position in images and the actual size of the citrus fruit. Table 3-2 shows part of the


conversion table that was created to find the actual size of detected citrus. With the table, given the diameter and the position, the actual diameter in cm was calculated. For example, if a detected citrus was located in the 160th row and its diameter in the image was 29.0 pixels, then the actual size of the detected fruit would be 7.6 cm.

Table 3-2. 2D (bilinear) lookup table for estimating the actual size of detected fruit: diameter in pixels by row number and actual size

Row No.   6.4 cm   7.6 cm   10.2 cm
40        12.9     16.2     17.5
80        18.1     20.0     25.6
120       22.9     24.3     32.7
160       25.1     29.0     39.7
200       29.2     34.6     46.8
240       33.2     40.1     54.0
280       37.3     45.7     61.1
320       41.3     51.3     68.3
360       45.4     56.9     75.4
400       49.4     62.5     82.5
440       53.5     68.1     89.7
480       57.5     73.7     96.8

After estimating the actual size of the fruit, the mass was estimated. Table 3-3 shows the measured average diameter and mass of the 43 citrus in the calibration set, along with the estimated mass. The estimated mass was calculated by the regression in Equation 3-1.

Table 3-3. Measured diameter (cm), actual mass (g) and estimated mass (g)

Average diameter (cm)   Actual mass (g)   Estimated mass (g)
5.53                    152               117.8
5.97                    121               130.9
6.12                    121               137.1
6.14                    131               138.0
6.23                    135               142.2
6.25                    140               143.2
6.30                    142               145.8
6.38                    146               149.6
6.38                    141               149.6
6.50                    155               156.5


Table 3-3. Continued.

Average diameter (cm)   Actual mass (g)   Estimated mass (g)
6.75                    165               172.2
6.75                    173               172.2
6.75                    186               172.2
6.88                    170               181.0
6.88                    186               181.0
6.88                    187               181.0
6.88                    173               181.0
6.88                    175               181.0
7.13                    187               200.2
7.13                    200               200.2
7.13                    202               200.2
7.25                    203               210.8
7.25                    229               210.8
7.25                    218               210.8
7.38                    229               221.9
7.50                    243               233.6
7.50                    230               233.6
7.50                    245               233.6
7.63                    241               245.8
7.63                    240               245.8
7.63                    258               245.8
7.75                    248               258.7
7.75                    264               258.7
7.75                    254               258.7
7.75                    264               258.7
8.00                    287               286.2
8.00                    300               286.2
8.13                    310               300.9
8.13                    315               300.9
8.38                    316               331.9
8.50                    350               348.3
8.75                    363               382.9
8.75                    385               382.9

Equation 3-1 gives the relationship between the actual mass and the average actual diameter of citrus, estimated by second-order curve fitting:

m = c₂d² + c₁d + c₀    (3-1)

where m is the estimated mass, d is the average fruit diameter, and c₂, c₁ and c₀ are the fitted coefficients. Figure 3-1 shows the fitted curve and the data points from Table 3-3. The scattered dots represent the actual fruit diameters and masses, and the solid blue line shows the fitted


curve. For the curve fitting, the coefficient of determination (R²) between the average diameter and the mass was 0.94.

Figure 3-1. Regression of the size and the mass of actual citrus measurements (mass in g versus diameter in cm; actual mass as scattered dots, estimated mass as the fitted curve).

Figure 3-2 shows an example of fruit count and mass estimation. Each detected citrus was marked with a red circle in Figure 3-2(B). The total count of citrus was eight in this image, because the algorithm was programmed to ignore unhealthy or defective citrus fruit. The estimated mass was 1530.5 g. The performance of the algorithm was analyzed by two criteria: accuracy and false positives. First, the accuracy, which represents the ability of the algorithm to detect citrus fruit without any missed fruit, was analyzed by comparing the actual citrus count from manual counting with the count of citrus correctly identified by the machine vision algorithm. Missed citrus fruit, i.e., fruit present in the image but not counted by the algorithm, were counted as well.
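The second-order fit of Equation 3-1 can be reproduced in form with an ordinary least-squares sketch (the thesis's fitted coefficients are not reproduced here; the demonstration recovers a known quadratic exactly):

```python
def fit_quadratic(diameters, masses):
    """Least-squares fit of m = c2*d^2 + c1*d + c0 (the form of
    Equation 3-1). Returns [c2, c1, c0]."""
    # Normal equations for the basis [d^2, d, 1].
    a = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for d, m in zip(diameters, masses):
        phi = [d * d, d, 1.0]
        for i in range(3):
            b[i] += phi[i] * m
            for j in range(3):
                a[i][j] += phi[i] * phi[j]
    # Solve the 3x3 system by Gaussian elimination.
    for i in range(3):
        for j in range(i + 1, 3):
            f = a[j][i] / a[i][i]
            for k in range(3):
                a[j][k] -= f * a[i][k]
            b[j] -= f * b[i]
    c = [0.0] * 3
    for i in (2, 1, 0):
        c[i] = (b[i] - sum(a[i][k] * c[k] for k in range(i + 1, 3))) / a[i][i]
    return c
```

Fitting the full 43-point calibration set of Table 3-3 in this way would yield the curve of Figure 3-1.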


Figure 3-2. Final result example. A) Original image. B) Final result. Fruit count: 8; estimated mass: 1530.5 g.

The accuracy for seven trials in the skirted canopy at the Lykes grove is shown in Table 3-4. A total of 1470 images was collected, and 1352 images were used as the validation set. Trial 1 had the largest number of actual dropped fruit (1650 citrus), while trial 3 had the fewest (430). The mean actual fruit drop over the seven trials was 1019. The accuracy is shown in the fifth and sixth columns. The highest accuracy, 89.5 percent (with 10.5 percent missed fruit), was in trial 6, because the images in trial 6 were clearer and had greater contrast


compared to the images in the other trials. Trial 2 had the lowest accuracy, 59.3 percent. The source of error was that the images were dark and unclear, with less color variation between citrus and background. Also, the citrus in trial 2 were located farther away than in the other trials, so some of the fruit were too small to be detected. The average accuracy in the skirted canopy was 74.8 percent.

Table 3-4. Performance of the algorithm for accuracy and estimated mass (skirted canopy, Lykes grove)

Trial     Fruit count by manual counting   Correctly identified by algorithm   Missed fruit   Correctly identified (%)   Missed fruit (%)   Estimated mass (kg)
Trial 1   1650     1322    328     80.1    19.9    316.5
Trial 2   1448     859     589     59.3    40.7    282.3
Trial 3   430      330     100     76.7    23.3    142.0
Trial 4   1102     784     318     71.1    28.9    179.0
Trial 5   885      707     178     79.9    20.1    201.5
Trial 6   618      553     65      89.5    10.5    157.2
Trial 7   999      782     217     78.3    21.7    224.8
Sum       7132     5337    1795                    1503.2
Mean      1018.9   762.4   256.4   74.8    25.2    214.7

For the mass estimation, trial 1 had the highest mass (316.5 kg) and trial 3 the lowest (142.0 kg). This result corresponded to the drop counts, which were also highest in trial 1 and lowest in trial 3. However, the mass in trial 4 was estimated to be less than in trial 5, although the citrus count by the algorithm was higher in trial 4. This error came from the smaller size of the fruit in the images: the images in trial 4 were taken from farther away than in trial 5, so the mass of the citrus was underestimated. The second performance analysis concerned false positives. A false positive is a background object incorrectly detected as citrus. The false positive error was calculated by


dividing the false positive count by the citrus fruit count of the algorithm. The fourth and fifth columns in Table 3-5 show the false positive count and its percentage. Most false positives came from soil and leaf pixels in high illumination areas; these pixels had bright, yellowish colors similar to citrus. The highest error was in trial 3, because the images had an unclear distinction between unhealthy and healthy citrus in dark areas: the algorithm counted them as healthy fruit, while manual counting considered them unhealthy.

Table 3-5. Performance analysis for false positives (skirted canopy, Lykes grove)

Trial     Citrus fruit count by algorithm   Correctly identified by algorithm   False positive count   False positive (%)
Trial 1   1466    1322    144    9.8
Trial 2   881     859     22     2.5
Trial 3   473     330     143    30.2
Trial 4   815     784     31     3.8
Trial 5   766     707     59     7.7
Trial 6   652     553     99     15.2
Trial 7   932     782     150    16.1
Sum       5985    5337    648
Mean      855     762.4   92.6   10.8

The performance of the algorithm for the unskirted canopy images from the Duda grove was also analyzed. Accuracy and estimated mass are shown in Table 3-6. The accuracy was 76.4 percent in the unskirted canopy, similar to the result from the skirted canopy.

Table 3-6. Performance analysis for accuracy and estimated mass (unskirted canopy, Duda grove)

Fruit count by manual counting   Correctly identified by algorithm   Missed fruit   Correctly identified (%)   Missed fruit (%)   Estimated mass (kg)
3580                             2735                                845            76.4                       23.6               835.7
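The two performance criteria used in Tables 3-4 to 3-7 reduce to simple ratios; a sketch, checked against the Trial 1 rows of Tables 3-4 and 3-5:

```python
def accuracy_pct(manual_count, correctly_identified):
    """Accuracy: correctly identified fruit / manual count, as a percent."""
    return 100.0 * correctly_identified / manual_count

def false_positive_pct(algorithm_count, correctly_identified):
    """False positives: (algorithm count - correct) / algorithm count."""
    return 100.0 * (algorithm_count - correctly_identified) / algorithm_count
```

For Trial 1: accuracy 1322/1650 = 80.1%, and false positives (1466 − 1322)/1466 = 9.8%.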


The false positive rate for the unskirted canopy, shown in Table 3-7, was relatively higher than at the Lykes grove (average 10.8 percent). This is because many images from the unskirted canopy included more occluded citrus fruit.

Table 3-7. Performance analysis for ability to avoid false positive error (unskirted canopy, Duda grove)

Fruit count by algorithm   Correctly identified by algorithm   False positive count   False positive (%)
3294                       2735                                559                    17.0

Different results might be obtained if the images contained more extreme illumination conditions. The normalization of illumination in the machine vision algorithm compensated for most of the varying illumination effect; however, overexposed or excessively dark images might not be normalized successfully. Such extreme illumination conditions may lower the accuracy and cause more false positives, so staying within a certain range of illumination is important for reliable normalization. Also, in the experiments, canopy shakers and catch frames were used for mechanical harvesting, so the fruit dropped during mechanical harvesting was not piled up. Another type of mechanical harvester, the trunk shaker, causes dropped fruit to pile up in multiple layers on the ground. In that situation the machine vision algorithm would perform poorly at detecting all of the citrus, especially in the bottom layer, so a different estimation process would need to be developed for trunk shakers. The accuracy and false positive rates could be improved by adopting correction factors based on the conditions of the grove, trees and images, which will be studied in future work.


The experimental results also show that each area in the citrus groves had a different amount of fruit drop. A spatial variation map of fruit drop was created using the DGPS coordinates obtained during image acquisition. The mass of the fruit drop was converted to mass per unit area: the area of each image was calculated by multiplying the distance between the locations where images were taken by the average row spacing in the citrus grove, with the distance between images computed from the DGPS coordinates. Figure 3-3 and Figure 3-5 show full views of the citrus drop maps, with entire rows of data points shown. Figure 3-4 and Figure 3-6 present zoomed views of the fruit drop maps, in which each data point is more distinguishable so that readers can recognize the variation of fruit drop. A possible reason for the variation at each point is that each area had different spatial variability factors such as canopy size, nutrient level, soil pH and disease. Among those factors, spraying CMNP caused substantial fruit drop. In the Duda grove, citrus drop was much higher than in the Lykes grove, even though the images in the Duda grove were acquired before harvesting while the images in the Lykes grove were acquired after mechanical harvesting. The most suspected reason was that the trees in the Duda grove were sprayed with CMNP several days before image acquisition, so the CMNP caused the trees to drop fruit before harvesting. However, no specific effect was observed from CMNP sprayed during the past couple of years: trial 6 in the Lykes grove had been sprayed with CMNP in past years, yet its fruit drop count was lower compared to other areas not sprayed in past years.
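The conversion to mass per unit area can be sketched as follows, with the great-circle (haversine) formula assumed for the distance between consecutive DGPS fixes and the 7.3 m average row spacing reported earlier:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance between two DGPS fixes (haversine)."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def mass_per_area(mass_kg, lat1, lon1, lat2, lon2, row_spacing_m=7.3):
    """Fruit-drop mass per unit area (kg/m^2): image mass divided by
    (distance between image positions x average row spacing)."""
    return mass_kg / (distance_m(lat1, lon1, lat2, lon2) * row_spacing_m)
```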


Figure 3-3. Full view of fruit drop map in Lykes grove.


Figure 3-4. Zoomed view of fruit drop map in Lykes grove.


Figure 3-5. Full view of fruit drop map in Duda grove.


Figure 3-6. Zoomed view of fruit drop map in Duda grove.


CHAPTER 4
CONCLUSION

A rugged hardware system was developed for outdoor imaging in a commercial citrus grove. The system included two cameras with a microprocessor, an encoder, a DGPS receiver, and mounting frames. The cameras were installed on a moving vehicle and triggered by the encoder, which measured the distance between the positions where images were taken. This method enabled images to be taken automatically while moving with the vehicle, avoiding overlapping areas between images. Taking images from the moving vehicle significantly reduced image acquisition time. After image acquisition, DGPS coordinates were used to create a geo-referenced citrus fruit drop map.

The machine vision algorithm included normalization of illumination, citrus detection by a logistic regression classifier, and least square circle fitting for estimating the count and mass of citrus fruit drop. The normalization of illumination reduced the effect of varying illumination conditions. After the normalization, a color space conversion was conducted, and the logistic regression classifier assigned objects to the citrus class based on the similarity of colors. A texture filter using entropy successfully removed background objects that were incorrectly detected as citrus. Each detected citrus fruit was fitted to a circle using the least square circle fitting method, which calculated the position and diameter of the fruit. With the calibration data, the actual size and mass of the citrus fruit were estimated.

Field experiments were conducted in unskirted canopy in the Duda & Sons grove (Immokalee, FL) and skirted canopy in the Lykes Bros. Inc. grove (Ft. Basinger, FL). The citrus fruit dropped on the ground during the harvesting season was considered in the experiments.
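The least square circle fitting step can be sketched as below. This uses the standard Kasa formulation (solving a linear system for the circle parameters); the thesis's exact formulation may differ, and the power-law calibration coefficients mapping diameter to mass are hypothetical placeholders, not values from the thesis.

```python
import numpy as np

def fit_circle(xs, ys):
    """Kasa least-squares circle fit: solve x^2 + y^2 + a*x + b*y + c = 0
    for (a, b, c) in the least-squares sense; works even on a partial arc,
    as with partially occluded fruit. Returns center (cx, cy) and radius r."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    rhs = -(xs ** 2 + ys ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c)
    return cx, cy, r

def estimate_mass_g(diameter_px, px_to_cm, k=0.55, p=3.0):
    """Hypothetical calibration curve: mass (g) = k * diameter_cm ** p.
    k and p stand in for the calibration data described in the text."""
    return k * (diameter_px * px_to_cm) ** p
```

Because the fit is linear in (a, b, c), it needs no iteration and recovers the center and radius exactly when the boundary pixels lie on a true circle, even if only an arc of the fruit outline is visible.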


The performance of the algorithm was analyzed using accuracy and false positives. The highest accuracy was 89.5 percent, while the lowest accuracy was 59 percent in the skirted canopy and 76.4 percent in the unskirted canopy. For the false positive error, the percentage of false positives also varied between the trial sets; the highest error was 30 percent and the lowest error was 2.6 percent in the skirted canopy, and 17.0 percent in the unskirted canopy.

The performance of the algorithm can be improved with alternative imaging devices. In the experiments, two cameras were used, and each camera covered the entire area under the canopy at a time to expedite image acquisition and processing. Consequently, the field of view was wide, so the citrus in the images was small. Small objects in the images caused missed fruit because they did not contain enough information to be detected. This problem can be mitigated by using a higher resolution camera or multiple cameras, so that the objects contain enough information such as color and shape.

Also, image acquisition time can be improved by using a video camera. The two cameras were triggered by encoder pulses in order to acquire images; however, this method added unnecessary image acquisition time for encoder pulse detection, counting, and triggering the cameras. Using a video camera and image stitching techniques, the image acquisition process can be simplified.

By increasing image quality and the number of cameras, the machine vision algorithm can be modified for advanced applications. Future work will include immature citrus fruit drop detection during mechanical harvesting and early yield estimation. Immature citrus fruit drop detection during mechanical harvesting will help ascertain the optimal speed of the mechanical harvester, which would reduce future yield loss.


Also, modifying the image acquisition system to acquire images of an entire tree will enable early estimation of citrus production.


BIOGRAPHICAL SKETCH

Daeun Choi is a master's student and research assistant in precision agriculture. She received bachelor's degrees in bio-mechatronic engineering and economics from Sungkyunkwan University, Seoul, Korea, in 2011. After finishing her undergraduate programs, she moved to the United States to pursue higher education at the University of Florida, Gainesville, in the graduate program of the Department of Agricultural and Biological Engineering. Currently, her research is focused on developing machine vision systems for agricultural applications.