
Publications


Featured research published by David Coombs.


International Journal of Computer Vision | 1993

Real-time binocular smooth pursuit

David Coombs; Christopher M. Brown

This article examines the problem of a moving robot tracking a moving object with its cameras, without requiring the ability to recognize the target to distinguish it from distracting surroundings. A novel aspect of the approach taken is the use of controlled camera movements to simplify the visual processing necessary to keep the cameras locked on the target. A gaze-holding system implemented on a robot's binocular head demonstrates this approach. Even while the robot is moving, the cameras are able to track an object that rotates and moves in three dimensions. The central idea is that localizing attention in 3-D space makes precategorical visual processing sufficient to hold gaze. Visual fixation can help separate the target object from distracting surroundings. Converged cameras produce a horopter (surface of zero stereo disparity) in the scene. Binocular features with no disparity can be located with a simple filter, showing the object's location in the image. Similarly, an object that is being tracked is imaged near the center of the field of view, so spatially localized processing helps concentrate visual attention on the target. Instead of requiring a way to recognize the target, the system relies on active control of camera movements and binocular fixation segmentation to locate the target.
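As a concrete illustration of the zero-disparity idea, the following minimal sketch (Python/NumPy; the function name, patch size, and threshold are assumptions for illustration, not the paper's implementation) marks binocular features that lie near the horopter of a converged camera pair:

```python
import numpy as np

def zero_disparity_mask(left, right, patch=7, thresh=10.0):
    """Mark pixels where the converged left/right views agree locally,
    i.e. where stereo disparity is approximately zero (on the horopter).
    left, right: 2-D float grayscale arrays of identical shape."""
    half = patch // 2
    h, w = left.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(half, h - half):
        for x in range(half, w - half):
            lp = left[y - half:y + half + 1, x - half:x + half + 1]
            rp = right[y - half:y + half + 1, x - half:x + half + 1]
            # A small mean absolute difference means the feature sits at
            # (near) zero disparity and likely belongs to the fixated target.
            if np.mean(np.abs(lp - rp)) < thresh:
                mask[y, x] = True
    return mask
```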


International Journal of Computer Vision | 1991

Real-time vergence control for binocular robots

Thomas J. Olson; David Coombs

In binocular systems, vergence is the process of adjusting the angle between the eyes (or cameras) so that both eyes are directed at the same world point. Its utility is most obvious for foveate systems such as the human visual system, but it is a useful strategy for nonfoveate binocular robots as well. Here, we discuss the vergence problem and outline a general approach to vergence control, consisting of a control loop driven by an algorithm that estimates the vergence error. As a case study, this approach is used to verge the eyes of the Rochester Robot in real time. Vergence error is estimated with the cepstral disparity filter. The cepstral filter is analyzed, and it is shown in this application to be equivalent to correlation with an adaptive prefilter; carrying this idea to its logical conclusion converts the cepstral filter into phase correlation. The demonstration system uses a PD controller in cascade with the error estimator. An efficient real-time implementation of the error estimator is discussed, and empirical measures of the performance of both the disparity estimator and the overall system are presented.
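The reduction of the cepstral filter to phase correlation, cascaded with a PD controller, can be sketched as follows. This is a minimal illustration, not the Rochester implementation; the epsilon regularizer and controller gains are assumptions:

```python
import numpy as np

def vergence_error_phase_correlation(left, right):
    """Estimate horizontal disparity (vergence error) between the two
    views by phase correlation, the limiting case of the cepstral
    filter.  Returns the shift of `right` relative to `left` in pixels."""
    F = np.fft.fft2(left)
    G = np.fft.fft2(right)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-9              # whiten: keep phase only
    corr = np.fft.ifft2(cross).real
    _, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dx > left.shape[1] // 2:                 # unwrap to a signed shift
        dx -= left.shape[1]
    return int(dx)

class PDController:
    """Proportional-derivative controller cascaded with the estimator;
    the gains are application-specific tuning parameters."""
    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_err = 0.0
    def step(self, err):
        d_err = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.kd * d_err  # vergence velocity command
```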


International Conference on Pattern Recognition | 1996

Real-time single-workstation obstacle avoidance using only wide-field flow divergence

Theodore (Ted) Camus; David Coombs; Martin Herman; Tsai Hong Hong

A real-time robot vision system is described which uses only the divergence of the optical flow field for both steering control and collision detection. The robot has wandered about the lab at 20 cm/s for as long as 26 minutes without collision. The entire system is implemented on a single ordinary UNIX workstation without the benefit of real-time operating system support. Dense optical flow data are calculated in real-time across the entire wide-angle image. The divergence of this optical flow field is calculated everywhere and used to control steering and collision behavior. Divergence alone has proven sufficient for steering past objects and detecting imminent collision. The major contribution is the demonstration of a simple, robust, minimal system that uses flow-derived measures to control steering and speed to avoid collision in real time for extended periods. Such a system can be embedded in a general, multi-level perception/control system.
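A minimal sketch of the flow-divergence strategy, assuming a dense flow field (u, v) is already available; the gains, threshold, and half-image split here are illustrative choices, not the paper's parameters:

```python
import numpy as np

def flow_divergence(u, v, spacing=1.0):
    """Divergence of a dense optical flow field (u, v):
    div = du/dx + dv/dy, computed with central differences.
    Large positive divergence signals a looming (approaching) surface."""
    du_dx = np.gradient(u, spacing, axis=1)
    dv_dy = np.gradient(v, spacing, axis=0)
    return du_dx + dv_dy

def steer_and_brake(div, steer_gain=0.5, collision_thresh=0.2):
    """Steer away from the image half with larger divergence (nearer
    looming surface); flag imminent collision when divergence in the
    central region exceeds a threshold."""
    h, w = div.shape
    left_div = div[:, : w // 2].mean()
    right_div = div[:, w // 2 :].mean()
    steer = steer_gain * (left_div - right_div)        # + => turn right
    collision = div[:, w // 4 : 3 * w // 4].mean() > collision_thresh
    return steer, collision
```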


IEEE Control Systems Magazine | 1991

Cooperative gaze holding in binocular vision

David Coombs; Christopher M. Brown

Vision systems that hold their gaze on a visual target using binocular, maneuverable computer vision hardware are discussed. The benefits of gaze holding are identified, and the role of binocular cues and vergence in implementing a gaze-holding system is addressed. A combination of Smith prediction and optimal signal estimation that allows a system of several simulated interacting gaze-holding controls to act together coherently, despite delays and shared output variables, to produce zero-latency tracking of a moving target is described. It is shown how tracking the object increases its signal by improving its image quality and decreases the signal of competing objects by decreasing their image quality. The implementation of a subset of the gaze-holding capabilities (gross vergence and smooth tracking) on real-time computer vision hardware is described.
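The Smith-prediction idea can be sketched in discrete time as follows. This is a minimal illustration with an assumed integrator plant model (output accumulates gain times command); the paper combines the predictor with optimal signal estimation, which is omitted here:

```python
from collections import deque

class SmithPredictor:
    """Discrete-time Smith-prediction sketch: feed the controller an
    undelayed model prediction plus the delayed model-vs-measurement
    mismatch, so a known plant delay stops destabilizing the loop.
    Assumes delay_steps >= 1."""
    def __init__(self, delay_steps, model_gain=1.0):
        self.delayed = deque([0.0] * delay_steps, maxlen=delay_steps)
        self.model_gain = model_gain
        self.model_out = 0.0

    def feedback(self, command, measured):
        self.model_out += self.model_gain * command   # undelayed model
        old_prediction = self.delayed[0]              # model output delay_steps ago
        self.delayed.append(self.model_out)
        # The controller sees the fresh model output plus a correction for
        # how the real (delayed) measurement disagrees with the delayed model.
        return self.model_out + (measured - old_prediction)
```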


Intelligent Robots and Computer Vision XI: Algorithms, Techniques, and Active Vision | 1992

'Bee-bot': using peripheral optical flow to avoid obstacles

David Coombs; Karen Roberts

The bee-bot demonstrates the ability to use low-resolution motion vision over large fields of view to steer safely between obstacles. The system uses one receptive field for each of the left and right peripheral visual fields. This is implemented with a camera looking obliquely to each side of the robot. The largest optical flow in a receptive field indicates the proximity of the nearest object. The left and right proximities are easily compared to steer through the gap. Negative feedback control of steering is able to tolerate inaccuracies in this signal estimation. The low cost of such basic navigation competence can free additional resources for attending to the environment.
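A minimal sketch of the bee-bot steering law, assuming per-field flow magnitudes have already been computed; the function name and gain are illustrative assumptions:

```python
import numpy as np

def beebot_steering(left_flow_mag, right_flow_mag, gain=1.0):
    """Bee-inspired steering sketch: the largest optical flow magnitude
    in each peripheral receptive field serves as a proximity signal for
    the nearest object on that side; negative-feedback steering turns
    away from the nearer side."""
    prox_left = float(np.max(left_flow_mag))
    prox_right = float(np.max(right_flow_mag))
    return gain * (prox_left - prox_right)  # + => steer right, away from left
```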


Computer Vision and Pattern Recognition | 1992

Real-time smooth pursuit tracking for a moving binocular robot

David Coombs; Christopher M. Brown

The problem of a moving robot tracking a moving object with its cameras, without requiring the ability to recognize the target to distinguish it from distracting surroundings, is examined. A novel aspect of the approach taken is the use of controlled camera movements to simplify the visual processing necessary to keep the cameras locked on the target. A gaze-holding system implemented on a robot's binocular head demonstrates this approach. Even while the robot is moving, the cameras are able to track an object that rotates and moves in three dimensions. The central idea is that localizing attention in 3-D space makes simple precategorical visual processing sufficient to hold gaze.


Computer Vision and Pattern Recognition | 1993

Centering behavior using peripheral vision

David Coombs; Karen Roberts

The ability to control egomotion using low-resolution peripheral vision is crucial for enabling a small high-resolution fovea to attend to features that require detailed examination. The robot described demonstrates the ability to use low-resolution motion vision over large fields of view to steer between obstacles. The system uses the maximum flow observed in the left and right peripheral visual fields to indicate obstacle proximity. Each peripheral field covers one-third of the wide-angle lens's field of view. The left and right proximities are compared to steer through the gap. Negative feedback control of steering is able to tolerate inaccuracies in the signal estimation. This interpretation of the flows is based on the assumption that the camera is translating along the gaze vector. This condition is maintained under egomotion by active gaze stabilization. Head rotation is countered by eye rotation, and gaze is returned to the heading by rapid camera movements when necessary.
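The translation-along-gaze assumption can also be made explicit in software by subtracting the rotational flow predicted from a known yaw rate. The sketch below assumes a pinhole model with the standard instantaneous motion-field equations (signs per the chosen axis convention); the robot above enforces the same condition mechanically instead:

```python
import numpy as np

def derotate_flow(u, v, omega_yaw, f, xs, ys):
    """Subtract the image flow induced by a known camera yaw rate
    (rad/s) so the residual flow reads as translation along the gaze
    vector.  Pinhole model, focal length f in pixels; xs, ys are pixel
    coordinate grids centered on the principal point."""
    u_rot = omega_yaw * (f + xs ** 2 / f)  # horizontal flow from yaw
    v_rot = omega_yaw * (xs * ys / f)      # vertical flow from yaw
    return u - u_rot, v - v_rot
```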


International Symposium on Intelligent Control | 1990

Intelligent gaze control in binocular vision

David Coombs; Christopher M. Brown

Robotic vision systems that incorporate multiple cooperating sensors, mounted on maneuverable platforms, and their control are addressed. Assuming binocular, maneuverable (approximately anthropomorphic) computer vision hardware, the authors explore how sensorimotor control algorithms can contribute to visual tasks and behaviors. The authors discuss issues in the organization of robotic gaze stabilization, the implementation and application of vergence in binocular systems, and the use of nonvisual cues in stabilizing gaze. Stabilizing its gaze can enable an animate vision system to interpret and interact with its environment more effectively. Binocular cues and vergence contribute a precategorical segmentation of an object of interest that permits gaze stabilization to be preattentive and therefore very general. Nonvisual cues offer the potential for improving gaze stabilization performance by enabling the system to sense and compensate for its own motion, thereby allowing the visual signals to describe primarily target motion.
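A minimal sketch of the nonvisual-cue idea: a vestibular-style feedforward path cancels self-rotation sensed nonvisually, and a visual feedback path trims the residual image slip. The class name, gains, and proportional form are illustrative assumptions, not the authors' controller:

```python
class GazeStabilizer:
    """Sketch of gaze stabilization with nonvisual cues: feedforward
    cancellation of self-rotation (from an inertial sensor or motor
    encoders) plus visual feedback on residual retinal slip."""
    def __init__(self, vor_gain=1.0, slip_gain=0.5):
        self.vor_gain = vor_gain
        self.slip_gain = slip_gain

    def eye_velocity_command(self, head_rate, retinal_slip):
        feedforward = -self.vor_gain * head_rate   # cancel sensed self-motion
        feedback = -self.slip_gain * retinal_slip  # visual trim of the residue
        return feedforward + feedback
```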


Real-Time Imaging | 1996

A real-time computer vision platform for mobile robot applications

Sandor S. Szabo; David Coombs; Martin Herman; Theodore (Ted) Camus; Hongche Liu

A portable platform is described that supports real-time computer vision applications for mobile robots. This platform includes conventional processors, an image processing front-end system, and a controller for a pan/tilt/vergence head. The platform is ruggedized to withstand vibration during off-road driving. The platform has successfully supported experiments in video stabilization and detection of moving objects for outdoor surveillance, gradient-based and correlation-based image flow estimators, and indoor mobility using divergence of flow. These applications have been able to run at rates ranging from 3 to 15 Hz for image sizes from 64 × 64 to 256 × 256.


International Conference on Computer Vision | 1995

Real-time obstacle avoidance using central flow divergence and peripheral flow

David Coombs; Martin Herman; Tsai-Hong Hong; Marilyn Nashman

Collaboration


Top co-authors of David Coombs:

Martin Herman, National Institute of Standards and Technology
Tsai Hong Hong, National Institute of Standards and Technology
Marilyn Nashman, National Institute of Standards and Technology
Theodore (Ted) Camus, National Institute of Standards and Technology
Billibon Yoshimi, National Institute of Standards and Technology
Karen Roberts, National Institute of Standards and Technology
Karl Murphy, National Institute of Standards and Technology
Sandor S. Szabo, National Institute of Standards and Technology
Gin-Shu Young, National Institute of Standards and Technology