Publication


Featured research published by Wayne Daley.


International Conference on Image Processing | 2010

Visual tracking and segmentation using Time-of-Flight sensor

Omar Arif; Wayne Daley; Patricio A. Vela; Jochen Teizer; John M. Stewart

Time-of-Flight (TOF) sensors provide range information at each pixel in addition to intensity information, and they are becoming more widely available and more affordable. This paper examines the utility of dense TOF range data for image segmentation and tracking. Energy-based formulations for image segmentation are used, consisting of a data term and a smoothness term. The paper proposes novel methods to incorporate range information obtained from the TOF sensor into the data and smoothness terms of the energy, and graph cuts are used to minimize the energy.
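
A minimal sketch of this kind of energy-based segmentation is shown below. It is not the paper's formulation: the specific data cost (squared distance to assumed foreground/background depths) and the depth-gradient smoothness weighting are illustrative assumptions, and the PyMaxflow package stands in for a generic graph-cut solver.

```python
# Illustrative binary segmentation of a TOF frame: the data term uses range
# distance to assumed foreground/background depths, and the smoothness term is
# weakened across large depth discontinuities. Requires PyMaxflow
# (pip install PyMaxflow); fg_depth/bg_depth are hypothetical inputs.
import numpy as np
import maxflow

def segment_tof(depth, fg_depth, bg_depth, lam=2.0):
    """depth: 2-D float array of range values from the TOF sensor."""
    # Data term: per-pixel cost of each label based on range alone.
    cost_fg = (depth - fg_depth) ** 2
    cost_bg = (depth - bg_depth) ** 2

    # Smoothness term: strong where depth is locally smooth, weak at depth edges.
    gy, gx = np.gradient(depth)
    grad = np.hypot(gx, gy)
    weights = lam * np.exp(-grad / (grad.mean() + 1e-6))

    g = maxflow.Graph[float]()
    nodeids = g.add_grid_nodes(depth.shape)
    g.add_grid_edges(nodeids, weights=weights, symmetric=True)  # pairwise term
    g.add_grid_tedges(nodeids, cost_bg, cost_fg)                # unary term
    g.maxflow()
    # Boolean mask separating the two labels (labeling convention per PyMaxflow).
    return g.get_grid_segments(nodeids)
```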


Optics in Agriculture, Forestry, and Biological Processing | 1995

Real-time color grading and defect detection of food products

Wayne Daley; Richard Carey; Chris Thompson

Manufacturing processes that use natural products as raw materials face additional challenges in quality control and inspection, owing to the natural variability of the products. Systems to automate this activity have been difficult to design and implement because of algorithm complexity, which drives up the computational requirements for real-time execution. This paper describes a technique for recognizing global or systematic defects on poultry carcasses, along with a method for implementing the technique that is capable of executing at a rate of about 180 birds per minute.


International Conference on Advanced Intelligent Mechatronics | 1999

Modeling of the natural product deboning process using biological and human models

Wayne Daley; Tian He; Kok-Meng Lee; Melissa Sandlin

One critical area in automation for commercial deboning systems for meat processing is the inability of existing equipment to adapt to varying sizes and shapes of products. This usually results in less than desirable outcomes when measured in terms of the yield of the operations. In poultry processing, for example, the initial cut of the wing-shoulder joint is the most critical step in the deboning process. Two approaches for determining a trajectory for the cut are presented. The first is a technique using X-ray and visual images to obtain a 2D model that locates the shoulder joint with respect to the surface features of the product. The second approach determines a 3D cutting trajectory and the associated forces/torques using a motion analysis system and a force/torque sensor incorporated with a knife. We then discuss the potential application of these results in the design of an automated cutting system that uses the obtained trajectory as a nominal cutting path. The system would make adjustments during the cut using force feedback so as to emulate the manual cutting process, as sketched below.
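
The sketch below illustrates only the control idea in the last sentence: a nominal cutting path nudged online by force feedback. The gains, the single-axis correction, and the simulated force model are assumptions for illustration, not the authors' implementation.

```python
# Illustrative only: follow a nominal 3-D cutting trajectory and apply an
# admittance-style correction when the measured cutting force deviates from a
# reference. Force readings are simulated here; in the described system they
# would come from a knife-mounted force/torque sensor.
import numpy as np

def cut_with_force_feedback(nominal_path, read_force, f_ref=5.0, k_admit=0.002):
    """nominal_path: (N, 3) way-points; read_force: callable giving force (N)."""
    adjusted = []
    offset = np.zeros(3)
    for p in nominal_path:
        f = read_force(p + offset)
        # Back the knife off (assumed +z) when force exceeds the reference,
        # ease back toward the nominal path when it drops below it.
        offset[2] += k_admit * (f - f_ref)
        adjusted.append(p + offset)
    return np.array(adjusted)

# Demonstration with a fabricated force model (purely hypothetical):
path = np.stack([np.linspace(0, 0.1, 50), np.zeros(50), np.zeros(50)], axis=1)
fake_force = lambda p: 5.0 + 40.0 * max(0.0, 0.05 - p[0])  # stiff region early on
print(cut_with_force_feedback(path, fake_force)[:3])
```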


International Conference on Machine Learning and Applications | 2010

Pre-image Problem in Manifold Learning and Dimensional Reduction Methods

Omar Arif; Patricio A. Vela; Wayne Daley

Manifold learning and dimensional reduction methods provide a low-dimensional embedding for a collection of training samples. These methods are based on the eigenvalue decomposition of the kernel matrix formed from the training samples. In [2] the embedding is extended to new test samples using the Nyström approximation method. This paper addresses the pre-image problem for these methods: finding the mapping back from the embedding space to the input space for new test points. The relationship of these learning methods to kernel principal component analysis [6] and the connection of the out-of-sample problem to the pre-image problem [1] are used to provide the pre-image.
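
The sketch below shows the two generic ingredients the abstract refers to, under simplifying assumptions (Gaussian kernel, kernel centering omitted): a Nyström-style out-of-sample extension of a kernel PCA embedding, and a simple distance-weighted pre-image heuristic. It is illustrative, not the paper's construction.

```python
# Kernel PCA on training data, Nystrom out-of-sample embedding, and a heuristic
# pre-image as a weighted average of training samples whose embeddings are
# nearest to the query. Kernel choice and pre-image heuristic are assumptions.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kpca_fit(X, n_components=2, sigma=1.0):
    K = gaussian_kernel(X, X, sigma)
    vals, vecs = np.linalg.eigh(K)                  # eigendecomposition of kernel matrix
    idx = np.argsort(vals)[::-1][:n_components]     # keep leading components
    return vals[idx], vecs[:, idx]

def embed_out_of_sample(X, x_new, vals, vecs, sigma=1.0):
    # Nystrom extension: project the new point's kernel row onto the eigenvectors.
    k = gaussian_kernel(x_new[None, :], X, sigma)[0]
    return (k @ vecs) / np.sqrt(vals)

def pre_image(X, Z_train, z_new, n_neighbors=10):
    # Heuristic pre-image: weighted average of training inputs whose embeddings
    # lie closest to z_new in the embedding space.
    d = np.linalg.norm(Z_train - z_new, axis=1)
    idx = np.argsort(d)[:n_neighbors]
    w = 1.0 / (d[idx] + 1e-9)
    return (w[:, None] * X[idx]).sum(0) / w.sum()

X = np.random.default_rng(0).normal(size=(100, 5))
vals, vecs = kpca_fit(X)
Z_train = vecs * np.sqrt(vals)                      # embeddings of training points
z = embed_out_of_sample(X, X[0] + 0.1, vals, vecs)
x_rec = pre_image(X, Z_train, z)
```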


31st Mechanisms and Robotics Conference, presented at the 2007 ASME International Design Engineering Technical Conferences and Computers and Information in Engineering Conference (IDETC/CIE 2007) | 2007

Endoscope geometrical analysis and kinematic control

Harvey Lipkin; Jomkwun Munnae; Gary McMurray; Debao Zhou; Wayne Daley

Endoscopes are used in medical practice to effect minimally invasive diagnostics and treatments through a natural or surgical orifice. The endoscope is a snakelike device with a two degree-of-freedom articulated tip that bends in any direction using internal cables actuated by knobs. In this paper we use a serial robot model of the tip to show that the tip motions are not decoupled with respect to the knob inputs, nor do they have constant gains. Further, a geometrical analysis shows that the articulated tip always lies along a circle. A tip kinematic control strategy based on small motions is developed that decouples the output motions from the input motions and provides constant gains. This allows the surgeon to control the endoscope in an intuitive and efficient manner.
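
The small-motion decoupling idea can be illustrated generically as below: estimate the Jacobian from the two knob angles to the tip's pan/tilt and command knob increments through its inverse, so a requested tip motion is produced with locally unit gain. The toy forward model is an assumption, not the paper's cable-driven tip kinematics.

```python
# Generic decoupling of a coupled two-input, two-output mechanism by inverting a
# finite-difference Jacobian. tip_direction() is a stand-in forward model.
import numpy as np

def tip_direction(knobs):
    """Toy coupled model: knob angles -> tip (pan, tilt) in radians."""
    k1, k2 = knobs
    return np.array([0.8 * k1 + 0.3 * k2,
                     0.2 * k1 + 0.9 * k2 + 0.05 * k1 * k2])

def knob_step_for(desired_tip_delta, knobs, eps=1e-4):
    # Finite-difference Jacobian of the tip direction w.r.t. the knob inputs.
    J = np.zeros((2, 2))
    base = tip_direction(knobs)
    for i in range(2):
        dk = np.zeros(2); dk[i] = eps
        J[:, i] = (tip_direction(knobs + dk) - base) / eps
    return np.linalg.solve(J, desired_tip_delta)   # decoupled knob increment

knobs = np.array([0.2, -0.1])
dk = knob_step_for(np.array([0.01, 0.0]), knobs)   # request a pure pan motion
print(dk, tip_direction(knobs + dk) - tip_direction(knobs))
```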


Precision Agriculture and Biological Quality | 1999

Machine vision algorithm generation using human visual models

Wayne Daley; Theodore J. Doll; Shane W. McWhorter; Anthony A. Wasilewski

The design of robust machine vision algorithms is one of the most difficult parts of developing and integrating automated systems. Historically, most techniques have been developed using ad hoc methodologies. The problem is more severe for natural and biological products, where it has been difficult to capture and model the natural variability to be expected in the products. This presents difficulty in performing quality and process control in the meat, fruit, and vegetable industries; while some systems have been introduced, they do not adequately address the wide range of needs. This paper proposes an algorithm development technique that utilizes models of the human visual system, addressing the subset of problems that humans perform well but that have proven difficult to automate with standard machine vision techniques. The basis of the technique evaluation is the Georgia Tech Vision model. The approach demonstrates a high level of accuracy in its ability to solve difficult problems. The paper presents the approach, the results, and possibilities for implementation.


2015 ASABE Annual International Meeting | 2015

A Study on Quantitative Metrics for Evaluating Animal Behavior in Confined Environments

Colin Usher; Wayne Daley; A. Bruce Webster; Casey W. Ritz

Researchers at the Georgia Tech Research Institute and the University of Georgia recently concluded an experiment studying animal reaction to robotic systems. The purpose of this study was to determine whether the operation of robots in a poultry grow-out house environment is feasible from an animal behavior perspective. To determine this, an experiment was conducted operating both an aerial and a ground robot in a small-scale grow-out house housing broiler chickens for a typical growth cycle (6 weeks). Humans also interacted with the flock daily. The environment and robots were equipped with cameras and other sensors to record data for the entire duration of the experiment. As a part of this research effort, the team had to establish a set of measurable metrics with which to quantify changes in animal behavior. Given the unique scenario and data collection abilities, three metrics for quantitatively assessing the impact of operating the robots were created: avoidance distance, speed when avoiding, and recovery time. Each of the newly defined metrics can be evaluated using statistical and mathematical analysis. The avoidance distance metric is defined as the average distance between the chicken and the external stimulus, whether human or robot, as it moves throughout the house. The flight response metric is defined as the animal's response of running away from the external stimulus; the mass of movement, average speed, and distance of travel can all be quantitatively characterized. Finally, the recovery time metric is defined as the average time it takes before the chickens resume "normal" activity after the stimulus leaves the environment. A software tool was developed to assist in the analysis of these metrics using video recordings. The analysis shows that there are statistical differences in the average avoidance distance metric, but no significant difference in the average speeds or the recovery time metric, indicating that operating robots in the environment is no more stressful to the chickens than the presence of a human. A study of the ability of these metrics to identify statistically significant changes in animal behavior, and of what such changes potentially mean, is presented herein.
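
The sketch below shows one way such metrics could be computed from tracked 2-D positions sampled at a fixed frame rate. The data layout, thresholds, and the simple "calm again" criterion are illustrative assumptions, not the study's analysis software.

```python
# Compute avoidance distance, avoidance speed, and recovery time from tracked
# bird and stimulus positions (T frames, metres). Thresholds are hypothetical.
import numpy as np

def behavior_metrics(bird_xy, stimulus_xy, fps=15.0, avoid_thresh=0.5):
    """bird_xy, stimulus_xy: (T, 2) arrays of positions per frame."""
    dist = np.linalg.norm(bird_xy - stimulus_xy, axis=1)
    speed = np.linalg.norm(np.diff(bird_xy, axis=0), axis=1) * fps

    avoiding = speed > avoid_thresh            # frames where the bird is fleeing
    avoidance_distance = dist[1:][avoiding].mean() if avoiding.any() else np.nan
    avoidance_speed = speed[avoiding].mean() if avoiding.any() else 0.0

    # Recovery time: seconds from the last fleeing frame until speed stays low.
    flee_frames = np.flatnonzero(avoiding)
    if flee_frames.size:
        calm = np.flatnonzero(~avoiding[flee_frames[-1]:])
        recovery_time = (calm[0] / fps) if calm.size else np.nan
    else:
        recovery_time = 0.0
    return avoidance_distance, avoidance_speed, recovery_time
```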


Proceedings of SPIE | 2013

Detection of vehicle occupants in HOV lanes: exploration of image sensing for detection of vehicle occupants

Wayne Daley; Colin Usher; Omar Arif; John M. Stewart; Jack W. Wood; John Turgeson; Erin Hanson

One technique to better utilize existing roadway infrastructure is the use of HOV and HOT lanes. Technology to monitor the use of these lanes would assist managers and planners in efficient roadway operation. There are no available occupancy detection systems that perform at acceptable levels of accuracy in permanent field installations. The main goal of this research effort is to assess the possibility of determining passenger use with imaging technology. This is especially challenging because of recent changes in the glass types used by car manufacturers to reduce the solar heat load on the vehicles. We describe in this research a system to use multi-plane imaging with appropriate wavelength selection for sensing passengers in the front and rear seats of vehicles travelling in HOV/HOT lanes. The process of determining the geometric relationships needed, the choice of illumination wavelengths, and the appropriate sensors are described, taking into account driver safety considerations. The paper will also cover the design and implementation of the software for performing the window detection and people counting utilizing both image processing and machine learning techniques. The integration of the final system prototype will be described along with the performance of the system operating at a representative location.
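
A skeleton of the two-stage processing chain described above is sketched below. Everything in it is an illustrative assumption: a crude intensity threshold stands in for the actual window-detection step, and a generic SVM on raw pixel patches stands in for the system's learned occupant classifier.

```python
# Skeleton of a window-detection + occupant-classification chain. Placeholder
# logic throughout; patches are assumed to be resampled to one fixed size.
import numpy as np
from sklearn.svm import SVC

def find_window(frame, dark_thresh=0.25):
    """Return the bounding box (r0, r1, c0, c1) of the darkest region, or None."""
    mask = frame < dark_thresh
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0 or cols.size == 0:
        return None
    return rows[0], rows[-1], cols[0], cols[-1]

class OccupantCounter:
    """Generic learned classifier applied per seat region within the window."""
    def __init__(self):
        self.clf = SVC(kernel="rbf")

    def fit(self, patches, labels):
        # patches: list of equal-sized 2-D arrays; labels: 1 = occupied, 0 = empty.
        self.clf.fit(np.array([p.ravel() for p in patches]), labels)

    def count(self, frame, seat_boxes):
        occupied = 0
        for r0, r1, c0, c1 in seat_boxes:
            patch = frame[r0:r1, c0:c1]          # must match training patch size
            occupied += int(self.clf.predict(patch.ravel()[None, :])[0])
        return occupied
```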


Computer Vision Technology in the Food and Beverage Industries | 2012

Automated cutting in the food industry using computer vision

Wayne Daley; Omar Arif

The processing of natural products has posed a significant problem to researchers and developers involved in the development of automation. The challenges have come from areas such as sensing, grasping, and manipulation, as well as product-specific areas such as cutting and handling of meat products. Meat products are naturally variable, and fixed automation has reached its limit in its ability to accommodate these products. Intelligent automation systems (such as robots) are also challenged, mostly because of a lack of knowledge of the physical characteristics of the individual products. Machine vision has helped to address some of these shortcomings but underperforms in many situations. Developments in sensors, software, and processing power are now offering capabilities that will help make more of these problems tractable. In this chapter we describe some of the developments under way in computer vision for meat product applications, the problems they address, and potential future trends.


2012 ASABE Annual International Meeting, Dallas, Texas, July 29 - August 1, 2012 | 2012

Development of an Audio and Video Observation and Recording Platform for Data Collection in a Broiler Growout Environment

Simeon D. Harbert; Douglas F. Britton; Wayne Daley; David V. Anderson; Guillermo Colon; Matthew Giannelli; Erin Hanson

While sensors exist for monitoring the environmental conditions within a poultry growout house, there are currently no systems that directly use bird vocalizations or behavior to indicate the health and welfare of a flock. As part of a research effort to explore this possibility, an experimental system was designed to collect audio, video, temperature, and humidity data for the entire six-week growout cycle of a flock of broiler chickens. The system consists of a Linux-based personal computer platform, two Shure condenser microphones, four video cameras, two temperature sensors, and a relative humidity sensor. A custom software application was developed to manage, record, and compress the data from the sensors, and it includes a graphical user interface where ammonia levels and other activities within the growout house can be logged manually. This paper describes in detail the design, configuration, and operation of the integrated system, along with considerations for data management and analysis. Overall, the design has proven to be robust and the operation relatively straightforward. Since the research is ongoing, the system continues to be improved for use in future data collection activities.
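
A minimal sketch of the kind of logging loop such a platform needs is shown below: timestamped temperature/humidity samples appended to a CSV file, plus a hook for manually entered ammonia readings. The file names, sample interval, and read_sensors() stub are assumptions, not the system's actual software.

```python
# Timestamped environmental logging with a manual-event hook. Sensor access is
# stubbed with random values for illustration.
import csv, time, random
from datetime import datetime

def read_sensors():
    """Stand-in for the real temperature (C) / relative-humidity (%) drivers."""
    return round(24 + random.uniform(-1, 1), 2), round(55 + random.uniform(-5, 5), 1)

def log_environment(path="growout_env.csv", interval_s=60, samples=5):
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(samples):
            temp_c, rh = read_sensors()
            writer.writerow([datetime.now().isoformat(timespec="seconds"), temp_c, rh])
            f.flush()
            time.sleep(interval_s)

def log_manual_event(note, path="growout_events.csv"):
    """e.g. log_manual_event('ammonia 20 ppm, feed line adjusted')"""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(timespec="seconds"), note])

if __name__ == "__main__":
    log_environment(interval_s=1, samples=3)
    log_manual_event("ammonia 20 ppm (manual reading)")
```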

Collaboration


Dive into Wayne Daley's collaborations.

Top Co-Authors

Colin Usher, Georgia Institute of Technology
Omar Arif, Georgia Institute of Technology
John M. Stewart, Georgia Institute of Technology
Douglas F. Britton, Georgia Tech Research Institute
Erin Hanson, Georgia Institute of Technology
Patricio A. Vela, Georgia Institute of Technology
Richard Carey, Georgia Institute of Technology
Wiley Holcombe, Georgia Tech Research Institute