John F. Reid
John Deere
Publications
Featured research published by John F. Reid.
Computers and Electronics in Agriculture | 2000
John F. Reid; Qin Zhang; Noboru Noguchi; Monte Andre Dickson
A review of recent research in agricultural vehicle guidance automation in North America is presented. A conceptual framework for an agricultural vehicle guidance automation system includes navigation sensors, a navigation planner, vehicle motion models, and steering controllers.
Transactions of the ASABE | 1999
Lei F. Tian; John F. Reid; J. W. Hummel
A machine-vision-system-guided precision sprayer was developed and tested. The long-term objectives of this project were to develop new technologies to estimate weed density and size in real time, realize site-specific weed control, and effectively reduce herbicide application amounts for corn and soybean fields. This research integrated a real-time machine-vision sensing system with an automatic herbicide sprayer to create an intelligent sensing and spraying system. Multiple video images were used to cover the target area. To increase the accuracy, each individual spray nozzle was controlled separately. Instead of trying to identify each individual plant in the field, weed infestation zones (0.254 m × 0.34 m) were detected. The integrated system was tested to evaluate the effectiveness and performance under varying field conditions. With the current system design, and using 0.5% weed coverage as the control zone threshold, herbicide savings of 48% could be realized.
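The zone-based spray decision described above can be sketched in a few lines. This is an illustrative reconstruction, not the published controller: the function name, the per-zone coverage inputs, and the on/off output format are assumptions; only the threshold idea (spray a zone when estimated weed coverage exceeds 0.5%) comes from the abstract.

```python
# Hypothetical sketch of per-nozzle threshold control: each detection zone
# is sprayed only if its estimated weed coverage exceeds the control-zone
# threshold (0.5% in the reported tests). Names are illustrative.

def nozzle_commands(zone_coverages, threshold=0.005):
    """Return one on/off command per spray nozzle.

    zone_coverages: fraction of weed pixels in each nozzle's zone (0..1).
    """
    return [coverage > threshold for coverage in zone_coverages]

# Only zones above 0.5% coverage trigger their nozzle.
print(nozzle_commands([0.0, 0.004, 0.02, 0.15]))
# -> [False, False, True, True]
```

Controlling each nozzle from its own zone, rather than switching the whole boom, is what allows the reported herbicide savings without identifying individual plants.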
IEEE Transactions on Image Processing | 1996
Young-Chang Chang; John F. Reid
A color calibration method for correcting the variations in RGB color values caused by vision system components was developed and tested in this study. The calibration scheme concentrated on comprehensively estimating and removing the RGB errors without specifying error sources and their effects. The algorithm for color calibration was based on the use of a standardized color chart and was developed as a preprocessing tool for color image analysis. According to the theory of image formation, RGB errors in color images were categorized into multiplicative and additive errors. These errors arose from various sources: gray-level shift, variation in amplification and quantization in the camera electronics or frame grabber, changes in the color temperature of the illumination over time, and related factors. The RGB errors of arbitrary colors in an image were estimated from the RGB errors of standard colors contained in the image. The color calibration method also contained an algorithm for correcting nonuniformity of illumination in the scene. The algorithm was tested under two conditions: uniform and nonuniform illumination in the scene. The RGB errors of arbitrary colors in test images were almost completely removed after color calibration. The maximum residual error was seven gray levels under uniform illumination and 12 gray levels under nonuniform illumination. Most residual RGB errors were caused by residual nonuniformity of illumination in the images. The test results showed that the developed method was effective in correcting the variations in RGB color values caused by vision system components.
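The multiplicative/additive error model above can be illustrated with a one-channel sketch: observed = gain × true + offset, with gain and offset fitted from standard chart patches and then inverted for arbitrary pixels. This is a minimal stand-in under that assumed model, not the paper's algorithm; function names and the sample values are invented for illustration.

```python
# Illustrative one-channel version of a multiplicative/additive error model:
# observed = gain * true + offset. Gain and offset are estimated from known
# standard-chart colors by least squares, then inverted to calibrate pixels.

def fit_channel(true_vals, observed_vals):
    """Least-squares fit of observed = gain * true + offset for one channel."""
    n = len(true_vals)
    mean_t = sum(true_vals) / n
    mean_o = sum(observed_vals) / n
    cov = sum((t - mean_t) * (o - mean_o) for t, o in zip(true_vals, observed_vals))
    var = sum((t - mean_t) ** 2 for t in true_vals)
    gain = cov / var
    offset = mean_o - gain * mean_t
    return gain, offset

def calibrate(pixel, gain, offset):
    """Invert the error model to recover an estimate of the true value."""
    return (pixel - offset) / gain

# Standard patches: known true gray levels vs. values measured in the image
# (hypothetical camera with gain below 1 and a positive offset).
true_gray = [0, 64, 128, 192, 255]
observed = [10, 66, 122, 178, 234]
g, b = fit_channel(true_gray, observed)
print(round(calibrate(150, g, b)))
```

In the paper the correction is applied per channel (R, G, B) and combined with an illumination-nonuniformity correction; the fit above shows only the core error-removal step.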
IEEE Control Systems Magazine | 1987
John F. Reid; Stephen W. Searcy
The ordered structure of agricultural row crops can provide useful guidance information for tractor control. A description of research on coupling a machine vision system and a solid-state camera to derive vehicle guidance parameters for a tractor is presented. Image segmentation is enhanced by optical filtering and by controlling the light intensity reaching the image sensor. An analysis of camera placement and of the steering errors that can be determined from the row crops is performed by simulating the geometric relationships between the crop canopy and the image plane.
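The kind of geometric simulation mentioned above can be sketched with a pinhole camera model: ground points on a crop row are projected onto the image plane, so a lateral steering error appears as a horizontal shift of the row in the image. The camera height, tilt, and focal length here are illustrative assumptions, not values from the study.

```python
# Sketch of projecting a ground point into a forward-looking, downward-tilted
# pinhole camera. A lateral offset of the vehicle relative to the crop row
# shows up as a horizontal image shift, which is the guidance cue.
import math

def project(point_xyz, cam_height=2.0, tilt_rad=math.radians(30), focal=0.01):
    """Project a ground point (x ahead, y lateral, z = 0) to the image plane."""
    x, y, _ = point_xyz
    # Camera frame: origin at the lens, optical axis pitched down by tilt_rad.
    zc = x * math.cos(tilt_rad) + cam_height * math.sin(tilt_rad)  # depth
    yc = -x * math.sin(tilt_rad) + cam_height * math.cos(tilt_rad)
    return (focal * y / zc, focal * yc / zc)  # (u, v) image coordinates

# A 0.5 m lateral steering error shifts the projected row horizontally.
u0, _ = project((10.0, 0.0, 0.0))
u1, _ = project((10.0, 0.5, 0.0))
print(u1 - u0 > 0)
```

Simulating this relationship over a range of camera placements is one way to analyze which steering errors are observable from the imaged rows.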
Transactions of the ASABE | 1993
K. Liao; M. R. Paulsen; John F. Reid; B. Ni; E.P. Bonifacio-Maghirang
A machine vision system was developed to identify corn kernel breakage based on the kernel shape profile for automated corn quality inspection. The profile of a corn kernel was sampled into a sequence of one-dimensional digital signals based on its binary image. Shape parameters were selected by analyzing the kernel profile and were fed into a machine learning algorithm to train a shape membership function for broken versus whole kernels. The system correctly classified 99% of 720 whole kernels and 96% of 720 broken flat kernels, and 91% of 720 whole kernels and 95% of 720 broken round kernels.
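One plausible reading of the "shape profile" above is the sequence of boundary-point distances from the kernel centroid. The sketch below uses that assumption (the paper's exact signal and features are not specified here) to show how a 1-D profile and a simple shape parameter could separate smooth from broken outlines; all names are illustrative.

```python
# Hypothetical shape-profile sketch: sample a closed boundary into a 1-D
# radial signal, then compute a roughness parameter. Abrupt profile changes
# are one cue that an outline is broken rather than smooth.
import math

def radial_profile(boundary_points):
    """1-D shape signal: distance of each boundary point from the centroid."""
    n = len(boundary_points)
    cx = sum(x for x, _ in boundary_points) / n
    cy = sum(y for _, y in boundary_points) / n
    return [math.hypot(x - cx, y - cy) for x, y in boundary_points]

def profile_roughness(profile):
    """Mean absolute change between neighboring profile samples."""
    n = len(profile)
    return sum(abs(profile[i] - profile[i - 1]) for i in range(n)) / n

# A smooth (whole) outline: points on a circle give a nearly flat profile.
circle = [(10 * math.cos(2 * math.pi * i / 36), 10 * math.sin(2 * math.pi * i / 36))
          for i in range(36)]
# A "broken" outline: part of the boundary pulled toward the centroid.
broken = circle[:24] + [(0.4 * x, 0.4 * y) for x, y in circle[24:]]
print(profile_roughness(radial_profile(circle)) < profile_roughness(radial_profile(broken)))
```

In the published system, parameters like this were inputs to a learned membership function rather than thresholded directly.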
Plant Disease | 1999
Irfan S. Ahmad; John F. Reid; M. R. Paulsen; J. B. Sinclair
Symptoms associated with fungal damage, viral diseases, and immature soybean (Glycine max) seeds were characterized using image processing techniques. A Red, Green, Blue (RGB) color feature-based multivariate decision model discriminated between asymptomatic and symptomatic seeds for inspection and grading. The color analysis showed distinct color differences between the asymptomatic and symptomatic seeds. A model comprising six color features, including averages, minimums, and variances of RGB pixel values, was developed for describing the seed symptoms. The color analysis showed that color alone did not adequately describe some of the differences among symptoms. An overall classification accuracy of 88% was achieved using a linear discriminant function with unequal priors, assigning each seed to the asymptomatic or symptomatic class with the highest probability of occurrence. Individual classification accuracies were asymptomatic 97%, Alternaria spp. 30%, Cercospora spp. 83%, Fusarium spp. 62%, green immature seeds 91%, Phomopsis spp. 45%, soybean mosaic potyvirus (black) 81%, and soybean mosaic potyvirus (brown) 87%. The classifier performance was independent of the year the seed was sampled. The study was successful in developing a color classifier and a knowledge domain based on color for future development of intelligent automated grain grading systems.
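A linear discriminant with unequal priors, as used above, can be shown in a one-feature toy form: each class gets a Gaussian score plus a log-prior term, so a more common class claims more of the feature space. This is a generic textbook sketch, not the paper's six-feature model; the class names, means, and priors below are invented.

```python
# Toy 1-D Gaussian discriminant with shared variance and unequal priors.
# The log-prior term shifts the decision boundary toward the rarer class.
import math

def lda_class(x, means, variance, priors):
    """Pick the class maximizing the Gaussian discriminant score."""
    def score(k):
        return -((x - means[k]) ** 2) / (2 * variance) + math.log(priors[k])
    return max(means, key=score)

# Hypothetical brightness feature: asymptomatic seeds brighter and more common.
means = {"asymptomatic": 180.0, "symptomatic": 120.0}
result = lda_class(150.0, means, variance=400.0,
                   priors={"asymptomatic": 0.8, "symptomatic": 0.2})
print(result)
# -> asymptomatic (x = 150 is equidistant from both means, so the prior decides)
```

With equal priors the same midpoint input would be a toss-up; the unequal priors resolve it in favor of the more frequent class, which matches the decision rule described in the abstract.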
Proceedings of the Institution of Mechanical Engineers, Part D: Journal of Automobile Engineering | 2005
Francisco Rovira-Más; Qin Zhang; John F. Reid; Jeffrey D. Will
Finding a pathway between crop rows is essential for automated guidance of some agricultural vehicles. The research reported in this paper developed a vision-based method for detecting crop rows. The method applied the Hough transform and connectivity analysis to process images of a vehicle's forward view and used them to find the appropriate pathway in the field. The Hough transform was used to detect crop rows, and the connectivity analysis was applied to identify the most suitable path from all possible choices. The system was implemented on an agricultural tractor and tested in both laboratory and field experiments. The methodology devised overcame image noise problems and successfully determined the proper trajectory for the tractor.
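The Hough-transform step above can be sketched directly: binary "crop" pixels vote for line parameters (theta, rho), and the accumulator peak gives a candidate row. The connectivity analysis that selects the final path is omitted, and the function names and discretization are illustrative assumptions.

```python
# Minimal Hough transform over line parameters (theta, rho): each crop pixel
# votes for all lines through it; the most-voted cell is the detected row.
import math
from collections import Counter

def hough_peak(points, n_theta=180, rho_step=1.0):
    votes = Counter()
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            votes[(t, round(rho / rho_step))] += 1
    (t, r), _ = votes.most_common(1)[0]
    return math.pi * t / n_theta, r * rho_step  # (theta, rho) of best line

# Points along a vertical crop row at x = 5: peak near theta = 0, rho = 5.
theta, rho = hough_peak([(5, y) for y in range(20)])
print(theta, rho)
```

Real images yield several strong peaks (one per visible row); the paper's connectivity analysis chooses among the resulting candidate pathways.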
Transactions of the ASABE | 2000
Qin Zhang; John F. Reid; D. Wu
The design of electrohydraulic (E/H) steering control systems for off-road vehicles should consider the characteristics of physical components, steering system dynamics, and a variety of other factors including vehicle dynamics and soil stiffness. A computer-controlled hardware-in-the-loop (HIL) E/H steering simulator has been developed as a tool for steering dynamics and controller research. The HIL simulator integrated a dynamic model of the steering system, an actual E/H steering actuator and control interface, and an independently controlled steering loader. This article describes the design of the HIL simulator, the development of a dynamic model of a tractor E/H steering system, and the validation of this system. Results indicated that the HIL simulator was a valuable tool for investigating the dynamics and interface characteristics of E/H steering systems and for evaluating real-time steering controller performance in a laboratory-recreated environment, with loading conditions representative of field applications for off-road vehicles. The HIL simulator provided an effective tool for E/H steering dynamics research, E/H steering system design, and E/H steering controller development.
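The closed loop at the heart of an HIL setup can be sketched in software alone: a plant model stands in for the steering dynamics while a controller under test drives it each time step (in real HIL, the actual E/H actuator and loader replace parts of the model). The first-order dynamics, the proportional controller, and all parameters below are illustrative assumptions, not the published simulator.

```python
# Minimal software-only stand-in for an HIL loop: a first-order model of
# steering-angle dynamics plus a proportional controller under test.

def steering_model(angle, valve_cmd, dt=0.01, tau=0.2):
    """First-order E/H steering response: angle chases the commanded input."""
    return angle + dt * (valve_cmd - angle) / tau

def p_controller(target, angle, kp=2.0):
    """Proportional steering controller under evaluation."""
    return kp * (target - angle)

angle = 0.0
for _ in range(1000):                 # 10 s of simulated time at 100 Hz
    cmd = p_controller(0.1, angle)
    angle = steering_model(angle, cmd)
print(round(angle, 3))                # settles at kp*target/(1+kp) = 0.067
```

Swapping the `steering_model` call for reads and writes to real actuator hardware, with a loader imposing field-like resistance, is what turns this software loop into the hardware-in-the-loop arrangement the article describes.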
2001 Sacramento, CA, July 29-August 1, 2001 | 2001
Noboru Noguchi; John F. Reid; Qin Zhang; Jeffrey D. Will; Kazunobu Ishii
This study developed a field robot for an agricultural operating environment. The navigation sensor suite consisted of an RTK-GPS, a fiber optic gyroscope (FOG), and an inertial measurement unit (IMU). A sensor fusion algorithm was used to identify the FOG bias and compensate for location error in real time, providing sufficient navigation information to support accurate robot guidance in the field. The guidance system could automatically guide the agricultural robot along either straight or curved paths, including crop rows, at a speed of 2.5 m/s. The RMS position error relative to the desired pathway in the field was less than 3 cm. The results indicated that the navigation system was capable of guiding an agricultural robot accurately and robustly under normal agricultural operations.
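The bias-identification idea above can be illustrated with a first-order estimator: the slowly varying FOG bias is estimated by low-pass filtering the disagreement between the gyro rate and the heading rate implied by RTK-GPS fixes. This is a minimal sketch in the spirit of that fusion, not the published algorithm; the function name, filter gain, and rates are assumptions.

```python
# Hypothetical first-order bias estimator: the gyro-vs-GPS rate disagreement
# is assumed to equal the FOG bias plus noise, and the estimate is nudged
# toward each new observation by a small gain alpha.

def update_bias(bias, gyro_rate, gps_rate, alpha=0.02):
    """One estimator step toward the observed rate disagreement."""
    return bias + alpha * ((gyro_rate - gps_rate) - bias)

# With a constant true bias of 0.01 rad/s, the estimate converges toward it.
bias = 0.0
for _ in range(500):
    bias = update_bias(bias, gyro_rate=0.11, gps_rate=0.10)
print(round(bias, 4))
```

Once the bias estimate converges, it can be subtracted from the raw FOG output between GPS fixes, which is what lets the fused heading stay accurate at the reported centimeter-level path tracking.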
Springer Handbook of Robotics, 2nd Ed. | 2016
Marcel Bergerman; John Billingsley; John F. Reid; Eldert J. van Henten
Robotics for agriculture and forestry (A&F) represents the ultimate application of one of our society's latest and most advanced innovations to its most ancient and important industries. Over the course of history, mechanization and automation increased crop output by several orders of magnitude, enabling a geometric growth in population and an increase in quality of life across the globe. Rapid population growth and rising incomes in developing countries, however, require ever larger amounts of A&F output. This chapter addresses robotics for A&F in the form of case studies where robotics is being successfully applied to solve well-identified problems. With respect to plant crops, the focus is on the in-field or in-farm tasks necessary to guarantee a quality crop and, generally speaking, end at harvest time. In the livestock domain, the focus is on breeding and nurturing, exploiting, harvesting, and slaughtering and processing. The chapter is organized in four main sections. The first one explains the scope, in particular, what aspects of robotics for A&F are dealt with in the chapter. The second one discusses the challenges and opportunities associated with the application of robotics to A&F. The third section is the core of the chapter, presenting twenty case studies that showcase (mostly) mature applications of robotics in various agricultural and forestry domains. The case studies are not meant to be comprehensive but instead to give the reader a general overview of how robotics has been applied to A&F in the last 10 years. The fourth section concludes the chapter with a discussion on specific improvements to current technology and paths to commercialization.