Richard Edmondson
Polaris Industries
Publications
Featured research published by Richard Edmondson.
Independent Component Analyses, Wavelets, Unsupervised Nano-Biomimetic Sensors, and Neural Networks VI | 2008
Richard Edmondson; Michael Rodgers; Michele Ruggiero Banish
Pulse Coupled Neural Networks (PCNNs) have been shown to be of value in image processing applications, especially at identifying features of small spatial extent at low signal-to-noise ratio. In our use of the PCNN, every pixel in a scene feeds a neuron in a fully connected lateral neural network. Nearest-neighbor neurons contribute to the output of any given neuron using weights that link the neuron and its neighborhood in both a linear and a non-linear fashion. The network is pulsed, and the output of the network at each pulse is a binary mask of the neurons that are active. Pulsing drives the network to evaluate its state. The multi-dimensionality and the non-linear nature of the network make selecting weights by trial and error a non-trivial problem. It is important that the desired features of the input are identified on a predictable pulse, a problem that has yet to be sufficiently addressed by proponents of the PCNN. Our method to overcome these problems is to use a Genetic Algorithm to select the set of PCNN coefficients which will identify the pixels of interest on a predetermined pulse. This method enables PCNNs to be trained, which is a novel capability and renders the method useful in applications.
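The pulse dynamics described above can be sketched as follows. This is a minimal, illustrative single-channel PCNN; the parameter names (beta, threshold decay, threshold boost) follow the standard PCNN model, but the values and the simplified feeding term are assumptions, not the paper's GA-trained coefficients.

```python
import math

def pcnn_pulses(stimulus, n_iter=10, beta=0.2,
                alpha_theta=0.3, v_theta=20.0):
    """Run a simplified PCNN and return one binary firing mask per pulse.

    stimulus: 2D list of pixel intensities (the feeding input).
    Each neuron's internal activity is its feeding input modulated by
    the pulses of its 8-neighbourhood from the previous iteration; it
    fires when that activity exceeds a decaying dynamic threshold.
    """
    h, w = len(stimulus), len(stimulus[0])
    feed = [row[:] for row in stimulus]       # feeding term F = S (simplified)
    theta = [[1.0] * w for _ in range(h)]     # dynamic threshold per neuron
    fired = [[0] * w for _ in range(h)]       # previous pulse mask Y
    masks = []
    for _ in range(n_iter):
        new_fired = [[0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                # linking term: how many 8-neighbours pulsed last iteration
                link = sum(fired[i + di][j + dj]
                           for di in (-1, 0, 1) for dj in (-1, 0, 1)
                           if (di or dj)
                           and 0 <= i + di < h and 0 <= j + dj < w)
                u = feed[i][j] * (1.0 + beta * link)  # internal activity U
                new_fired[i][j] = 1 if u > theta[i][j] else 0
        # decay every threshold, then boost it wherever a neuron just pulsed
        for i in range(h):
            for j in range(w):
                theta[i][j] = (math.exp(-alpha_theta) * theta[i][j]
                               + v_theta * new_fired[i][j])
        fired = new_fired
        masks.append([row[:] for row in fired])
    return masks
```

With these settings a bright, small feature fires on the first pulse while the background fires only on a later pulse, which is the per-pulse segmentation behavior the GA is used to control.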
Proceedings of SPIE | 2012
Richard Edmondson; Kenneth Light; Andrew Bodenhamer; Paul M. Bosscher; Loren Wilkinson
Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems comprised of a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems comprised of a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. In multiple studies done at Fort Leonard Wood, Missouri, it has been shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.
Proceedings of SPIE | 2009
J. Larry Pezzaniti; Richard Edmondson; Justin Vaden; Bryan Hyatt; David B. Chenault; David Kingston; Vanilynmae Geulen; Scott Newell; Brad Pettijohn
In this paper, we report on the development of a 3D vision system consisting of a flat panel stereoscopic display and auto-converging stereo camera and an assessment of the system's use for robotic driving, manipulation, and surveillance operations. The 3D vision system was integrated onto a Talon Robot and Operator Control Unit (OCU) such that direct comparisons of the performance of a number of test subjects using 2D and 3D vision systems were possible. A number of representative scenarios were developed to determine which tasks benefited most from the added depth perception and to understand when the 3D vision system hindered understanding of the scene. Two tests were conducted at Fort Leonard Wood, MO with noncommissioned officers ranked Staff Sergeant and Sergeant First Class. The scenarios; the test planning, approach and protocols; the data analysis; and the resulting performance assessment of the 3D vision system are reported.
Proceedings of SPIE | 2010
Michele Ruggiero Banish; Mike Rodgers; Brian Hyatt; Richard Edmondson; David B. Chenault; Jason Heym; Paul DiNardo; Brian Gruber; John E. Johnson; Kelly K. Dobson
Experimental and analytical results for uncalibrated stereo imagery are presented for path planning and navigation. An Army Research and Development Engineering Command micro-size UAV was outfitted with two commercial cameras and flown over varied landscapes. Polaris Sensor Technologies processed the data post flight with an image correspondence algorithm of their own design. Stereo disparity (depth) was computed despite a quick assembly, image blur, intensity saturation, noise, and barrel distortion. No camera calibration occurred. Disparity maps were computed at a processing rate of approximately 5 seconds per frame to improve perception. Disparity edges (treeline to ground, voids, and plateaus) were successfully observed and confirmed to be properly identified. Despite the success in localizing these disparity edges, sensitivity to saturated pixels, lens distortion, and defocus was strong enough to overwhelm more subtle features, such as the contours of the trees, which should be possible to extract using this algorithm. These factors are being addressed. The stereo data is displayed on a flat panel 3D display well suited for a human machine interface in field applications. Future work will entail extraction of intelligence from acquired data and the overlay of such data on the 3D image as displayed.
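The disparity computation mentioned above can be illustrated in miniature. This is a textbook block-matching sketch using a sum-of-absolute-differences (SAD) cost, not Polaris's proprietary correspondence algorithm; it only shows what a per-pixel disparity value means for a rectified scanline pair.

```python
def disparity_row(left, right, window=1, max_disp=4):
    """Per-pixel disparity for one rectified stereo scanline pair.

    left, right: equal-length lists of intensities along the same row.
    For each left-image pixel, search shifts d = 0..max_disp in the
    right image and keep the shift whose window minimizes the SAD cost.
    Larger disparity corresponds to a closer surface.
    """
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best_cost, best_d = float("inf"), 0
        for d in range(min(max_disp, x) + 1):
            cost = 0
            for k in range(-window, window + 1):
                xl, xr = x + k, x - d + k
                if 0 <= xl < n and 0 <= xr < n:
                    cost += abs(left[xl] - right[xr])
            if cost < best_cost:
                best_cost, best_d = cost, d
        disp[x] = best_d
    return disp
```

A bright feature that appears two pixels further left in the right image yields a disparity of 2 at that pixel; flat, textureless regions are ambiguous, which is one reason saturation and blur degrade the map.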
Proceedings of SPIE | 2010
Richard Edmondson; Justin Vaden; Brian Hyatt; James Morris; J. Larry Pezzaniti; David B. Chenault; Joe Tchon; Tracy J. Barnidge; Seth Kaufman; Brad Pettijohn
In this paper, we report on the development of a 3D vision field upgrade kit for the TALON robot consisting of a replacement flat panel stereoscopic display and multiple stereo camera systems. An assessment of the system's use for robotic driving, manipulation, and surveillance operations was conducted. The 3D vision system was integrated onto a TALON IV Robot and Operator Control Unit (OCU) such that stock components could be electrically disconnected and removed, and upgrade components coupled directly to the mounting and electrical connections. A replacement display, replacement mast camera with zoom, auto-focus, and variable convergence, and a replacement gripper camera with fixed focus and zoom comprise the upgrade kit. The stereo mast camera allows for improved driving and situational awareness as well as scene survey. The stereo gripper camera allows for improved manipulation in typical TALON missions.
Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2016 | 2016
David B. Chenault; Richard Edmondson
Polaris Sensor Technologies reports on the development of Pedestrian Automated System for Enforcement and Safety (PASES), a radar and video based system used to monitor vehicle and pedestrian traffic with the intent of improving pedestrian safety. Data is fused from a system of multiple sensors and multiple sensor modalities to identify vehicular violations of pedestrian right of way. A focus was placed on the selection of low cost COTS sensors to make the system more widely available to state and local DOTs with limited budgets. Applications include automated enforcement, adaptive traffic control, and improved intersection and crosswalk design based on high quality data available for traffic engineering. We discuss early results with high fidelity sensors, and the performance trades made in order to make the system affordable. A discussion of the system processing architecture is included which highlights the treatment of each sensor data type, and the means of combining the processed data products into state information related to traffic incidents involving vehicles and pedestrians.
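One step in fusing the radar and video modalities described above is associating radar tracks with camera detections. The PASES processing architecture itself is not reproduced here; the following is an illustrative-only greedy nearest-neighbour gating sketch in an assumed shared ground-plane frame, with an assumed gate size.

```python
import math

def associate(radar_tracks, video_dets, gate_m=1.5):
    """Greedy nearest-neighbour association of radar tracks to detections.

    radar_tracks, video_dets: lists of (x, y) positions in metres in a
    common ground-plane frame. Returns (radar_index, video_index) pairs
    whose separation is within the gate; each detection is used once.
    """
    pairs, used = [], set()
    for i, (rx, ry) in enumerate(radar_tracks):
        best_j, best_d = None, gate_m
        for j, (vx, vy) in enumerate(video_dets):
            if j in used:
                continue
            d = math.hypot(rx - vx, ry - vy)
            if d <= best_d:
                best_j, best_d = j, d
        if best_j is not None:
            pairs.append((i, best_j))
            used.add(best_j)
    return pairs
```

Associated pairs can then feed a shared state estimate (e.g. position and velocity per pedestrian or vehicle) from which right-of-way violations are judged.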
Proceedings of SPIE | 2012
Richard Edmondson; David B. Chenault
Polaris Sensor Technologies has developed numerous 3D display systems using a US Army patented approach. These displays have been developed as prototypes for handheld controllers for robotic systems and closed hatch driving, and as part of a TALON robot upgrade for 3D vision, providing depth perception for the operator for improved manipulation and hazard avoidance. In this paper we discuss the prototype rugged 3D laptop computer and its applications to defense missions. The prototype 3D laptop combines full temporal and spatial resolution display with the rugged Amrel laptop computer. The display is viewed through protective passive polarized eyewear, and allows combined 2D and 3D content. Uses include robot tele-operation with live 3D video or synthetically rendered scenery, mission planning and rehearsal, enhanced 3D data interpretation, and simulation.
Proceedings of SPIE | 2012
Richard Edmondson; Todd Aycock; David B. Chenault
Polaris Sensor Technologies has developed a Stereovision Upgrade Kit for the TALON robot to provide enhanced depth perception to the operator. This kit previously required the TALON Operator Control Unit to be equipped with the optional touchscreen interface to allow for operator control of the camera convergence angle adjustment. This adjustment allowed for optimal camera convergence independent of the distance from the camera to the object being viewed. Polaris has recently improved the performance of the stereo camera by implementing an Automatic Convergence algorithm in a field programmable gate array in the camera assembly. This algorithm uses scene content to automatically adjust the camera convergence angle, freeing the operator to focus on the task rather than adjustment of the vision system. The autoconvergence capability has been demonstrated on both visible zoom cameras and longwave infrared microbolometer stereo pairs.
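The geometry behind the convergence adjustment can be sketched as follows. Given the stereo baseline and an estimated range to the dominant scene content, each camera is toed in by half the total convergence angle so the optical axes cross at the target. How the FPGA estimates range from scene content is not public, so that step is omitted; the function below is just the fixation geometry.

```python
import math

def convergence_angle_deg(baseline_m, range_m):
    """Total convergence angle, in degrees, to fixate at range_m.

    baseline_m: separation between the two camera centres.
    Each camera is rotated inward by half this angle. As the target
    moves farther away the required angle shrinks toward zero
    (parallel optical axes).
    """
    return math.degrees(2.0 * math.atan2(baseline_m / 2.0, range_m))
```

For a 6 cm baseline, fixating at 1 m requires roughly a 3.4 degree total convergence angle, while at 10 m it drops to about a third of a degree, which is why a fixed convergence setting cannot serve both manipulation and driving distances.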
Proceedings of SPIE | 2012
Richard Edmondson; Justin Vaden; David B. Chenault
Improved situational awareness results not only from improved performance of imaging hardware, but also when the operator and human factors are considered. Situational awareness for IR imaging systems frequently depends on the contrast available. A significant improvement in effective contrast for the operator can result when depth perception is added to the display of IR scenes. Depth perception through flat panel 3D displays is now possible due to the number of 3D displays entering the consumer market. Such displays require appropriate and human-friendly stereo IR video input in order to be effective in the dynamic military environment. We report on a stereo IR camera that has been developed for integration onto an unmanned ground vehicle (UGV). The camera has auto-convergence capability that significantly reduces ill effects due to image doubling, minimizes focus-convergence mismatch, and eliminates the need for the operator to manually adjust camera properties. Discussion of the size, weight, and power requirements as well as integration onto the robot platform will be given along with a description of the stand-alone operation.
Proceedings of SPIE | 2010
Richard Edmondson; Justin Vaden; Brian Hyatt; James Morris; J. Larry Pezzaniti; David B. Chenault; Joe Tchon; Tracy J. Barnidge; Seth Kaufman; Brad Pettijohn
The use of tele-operated Unmanned Ground Vehicles (UGVs) for military purposes has grown significantly in recent years, with operations in both Iraq and Afghanistan. In both cases the safety of the Soldier or technician performing the mission is improved by the large standoff distances afforded by the use of the UGV, but the full performance capability of the robotic system is not utilized due to insufficient depth perception provided by the standard two-dimensional video system; given the uncertainty of the scene as perceived in 2D, the operator slows the mission to ensure the safety of the UGV. To address this, Polaris Sensor Technologies has developed, in a series of developments funded by the Leonard Wood Institute at Ft. Leonard Wood, MO, a prototype Stereo Vision Upgrade (SVU) Kit for the Foster-Miller TALON IV robot which provides the operator with improved depth perception and situational awareness, allowing for shorter mission times and higher success rates. Because multiple 2D cameras are being replaced by stereo camera systems in the SVU Kit, and because the needs of the camera systems for each phase of a mission vary, there are a number of tradeoffs and design choices that must be made in developing such a system for robotic tele-operation. Additionally, human factors design criteria drive optical parameters of the camera systems, which must be matched to the display system being used. The problem space for such an upgrade kit will be defined, and the choices made in the development of this particular SVU Kit will be discussed.