Morgan Davidson
Utah State University
Publications
Featured research published by Morgan Davidson.
International Conference on Robotics and Automation | 2001
Morgan Davidson; Vikas Bahl
Path tracking algorithms for wheeled mobile robots (WMRs) are frequently parametric in the sense that they are time-based. This has the potential of introducing lag-related errors and is not a direct approach. A spatial path tracking control algorithm, the ε-controller (Cε), is developed in this paper. It is based solely on static path geometry with position feedback. The Cε is applied in simulation to three different WMR steering configurations to illustrate the performance and generality of this new approach. Actual results are found to parallel the simulated results.
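The central idea, regulating the robot's normal deviation from a static path using only position feedback and no time parameterization, can be illustrated with a minimal sketch. The code below is not the authors' Cε controller; the closest-point projection, the gain k_eps, and the heading-command form are illustrative assumptions.

```python
import numpy as np

def spatial_tracking_step(position, path, k_eps=1.0):
    """One feedback step of a purely spatial path tracker.

    position : (x, y) of the robot (from position feedback)
    path     : Nx2 array of waypoints describing the static path geometry
    k_eps    : proportional gain on the normal deviation

    Returns the deviation and a heading command built only from path
    geometry and the current position; no time parameterization is used.
    """
    path = np.asarray(path, dtype=float)
    position = np.asarray(position, dtype=float)

    # Closest path vertex to the robot (a crude projection, for illustration).
    d = np.linalg.norm(path - position, axis=1)
    i = int(np.argmin(d))
    i_next = min(i + 1, len(path) - 1)

    # Unit tangent of the path at the closest point.
    tangent = path[i_next] - path[i]
    tangent = tangent / (np.linalg.norm(tangent) + 1e-9)

    # Normal deviation eps: signed distance from the robot to the path.
    normal = np.array([-tangent[1], tangent[0]])
    eps = float(np.dot(position - path[i], normal))

    # Steer along the tangent, corrected back toward the path by k_eps * eps.
    heading_cmd = np.arctan2(tangent[1], tangent[0]) - np.arctan(k_eps * eps)
    return eps, heading_cmd
```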
Proceedings of SPIE | 2001
Kevin L. Moore; Nicholas S. Flann; Shayne C. Rich; Monte Frandsen; You Chung Chung; Jason Martin; Morgan Davidson; Russell Maxfield; Carl G. Wood
Previous research has produced the T-series of omni-directional (ODV) robots, which are characterized by their use of smart wheel technology. In this paper we describe the design, implementation, and performance of the first use of ODV technology in a complete robotic system for a practical, real-world application. The system discussed is called ODIS, short for Omni-Directional Inspection System. ODIS is a man-portable mobile robotic system that can be used for autonomous or semi-autonomous inspection under vehicles in a parking area. The ODIS system can be deployed to travel through a parking area, systematically determining when a vehicle is in a parking stall and then carrying out a sweep under the vehicle, while sending streaming video back to a control station. ODIS uses three ODV wheels designed with a belt-driven steering mechanism to facilitate the low profile needed to fit underneath most vehicles. Its vetronics capabilities include eight different processors and a sensor array that includes a range-finding laser, sonar and IR sensors, and a color video camera. The ODIS planning and control architecture is characterized by a unique coupling between the vehicle-level path-tracking control system and a novel sensor-based feedback system for intelligent behavior generation. Real-life examples of ODIS's performance show the effectiveness of the system.
International Conference on Robotics and Automation | 2002
Matthew D. Berkemeier; Morgan Davidson; Vikas Bahl; YangQuan Chen; Lili Ma
ODIS is an omni-directional mobile robot designed to autonomously or semi-autonomously inspect automobiles in a parking lot. Periodically, its position and orientation references need to be reset. This paper considers visual servoing to parking lot lines as one possible approach. Analysis and simulations demonstrate that a surprisingly simple proportional controller in the image coordinates can accomplish position and orientation alignment with parking lot lines. Unlike previous work, no image Jacobian matrix is necessary. Knowledge of the camera focal length is not required, but the camera and vehicle axes are assumed to be aligned, and the vehicle is assumed to rotate about the camera frame's y-axis.
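Because the controller acts proportionally on image-space quantities, a sketch of the idea is short. The following is a hedged illustration of a proportional alignment law driven by a detected line's image offset and angle; the (rho, theta) parameterization, the gains, and the signs are assumptions and are not taken from the paper.

```python
def line_alignment_control(rho, theta, k_rho=0.5, k_theta=1.0):
    """Proportional visual-servo law for aligning with a parking-lot line.

    rho   : lateral offset of the detected line in the image (pixels)
    theta : angle of the detected line relative to the image axis (radians)

    Returns (lateral velocity, yaw rate) commands. The gains act directly
    on image-space errors; no image Jacobian or camera focal length is
    involved, mirroring the paper's premise.
    """
    v_lateral = -k_rho * rho      # drive the image offset to zero
    omega = -k_theta * theta      # rotate until the line is parallel
    return v_lateral, omega
```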
Proceedings of SPIE | 1999
Carl G. Wood; Morgan Davidson; Shayne C. Rich; Jared Keller; Russell Maxfield
In response to ultra-high maneuverability vehicle requirements, Utah State University (USU) has developed an autonomous vehicle with unique mobility and maneuverability capabilities. This paper describes the mechanical design of the USU T2 Omni-Directional Vehicle (ODV). The T2 vehicle is a second-generation ODV and weighs 1350 lb with six independently driven and steered wheel assemblies. The six-wheel independent steering system is capable of infinite rotation, presenting a unique solution to enhanced vehicle mobility requirements. The mechanical design of the wheel drive motors and the performance characteristics of the drive system are detailed. The steering and suspension system is discussed and the design issues associated with these systems are detailed. The vehicle system architecture, vetronics architecture, control system architecture, and kinematic-based control law development are described.
Unmanned Ground Vehicle Technology Conference | 2000
Morgan Davidson; Vikas Bahl; Carl G. Wood
In response to ultra-high maneuverability vehicle requirements, Utah State University (USU) has developed an autonomous vehicle with unique mobility and maneuverability capabilities. This paper describes a study of the mobility of the USU T2 Omni-Directional Vehicle (ODV). The T2 vehicle is a mid-scale (625 kg), second-generation ODV mobile robot with six independently driven and steered wheel assemblies. The six-wheel independent steering system is capable of unlimited steering rotation, presenting a unique solution to enhanced vehicle mobility requirements. This mobility study focuses on energy consumption in three basic experiments, comparing two modes of steering: Ackerman and ODV. The experiments are all performed on the same vehicle without any physical changes to the vehicle itself, providing a direct comparison of these two steering methodologies. A computer simulation of the T2 mechanical and control system dynamics is described.
American Control Conference | 2002
Morgan Davidson; Vikas Bahl; Kevin L. Moore
A nonlinear spatial path tracking control law, called the ε-controller (Cε), was developed for autonomous ground vehicles (AGVs) such as the USU T-series and ODIS (omni-directional inspection system) wheeled mobile robots (WMRs). It is essentially a SISO regulator operating on the normal spatial deviation, ε, of the robot from the desired path. As our performance expectation of Cε is entirely spatial, a logical choice of regulator should avoid any reference to time. We present a spatial integrator PI (SI-PI) regulator for Cε that is devoid of any time references while its structure includes some of the robot dynamics. It is also shown that the control action taken by this regulator represents the work done in moving the robot through an error vector field. The regulator gains are designed using a validated nonlinear Simulink™/Stateflow model. The experimental results presented show the effectiveness of the proposed SI-PI regulation scheme in the path tracking control of the ODIS WMR.
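To make the "spatial integrator" notion concrete, here is a minimal sketch of a PI regulator whose integral term accumulates the deviation ε over traveled arc length rather than over time. The class name, gains, and interface are illustrative assumptions, not the authors' implementation.

```python
class SpatialPI:
    """PI regulator that integrates over arc length instead of time.

    A hedged sketch of the spatial-integrator idea: the integral state
    grows with distance traveled along the path, so the control law
    contains no explicit time reference.
    """

    def __init__(self, kp, ki):
        self.kp = kp
        self.ki = ki
        self.integral = 0.0   # integral of eps with respect to arc length s

    def update(self, eps, ds):
        """eps: normal path deviation; ds: arc length traveled since last call."""
        self.integral += eps * ds          # spatial, not temporal, integration
        return self.kp * eps + self.ki * self.integral
```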
IFAC Proceedings Volumes | 2002
Lili Ma; Matthew D. Berkemeier; YangQuan Chen; Morgan Davidson; Vikas Bahl; Kevin L. Moore
This paper presents a simple robot localization technique using wireless visual servoing on ODIS (omni-directional inspection system), an autonomous ground vehicle for under-car inspection tasks in standard parking lot environments. Based on the current architecture of ODIS and its dedicated scripting language, an iterative visual servoing scheme is proposed to align the robot with the yellow lines of the parking lot. The iterative scheme tolerates the uncertain time delay of the wireless connection without introducing the stability problems that time-varying delay causes in real-time visual servoing. Experimental results show that, for our specific application, the wireless visual servoing technique presented in this paper is an efficient way to localize the robot.
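A hedged sketch of the stop-and-go idea behind such an iterative scheme follows: each iteration grabs an image while the vehicle is stationary, computes an alignment correction, and executes it as a discrete move, so wireless delay only lengthens an iteration rather than destabilizing a continuous loop. The robot, camera, and detect_line interfaces below are hypothetical.

```python
def iterative_visual_servo(robot, camera, detect_line, tol=0.02, max_iters=10):
    """Iterative (stop-and-go) visual servoing sketch.

    Instead of closing the loop continuously over a delayed wireless link,
    each iteration captures an image while the vehicle is stationary,
    computes an alignment correction, and executes it open-loop.
    """
    for _ in range(max_iters):
        image = camera.grab()              # wireless delay here cannot destabilize the loop
        rho, theta = detect_line(image)    # offset and angle of the detected parking-lot line
        if abs(rho) < tol and abs(theta) < tol:
            return True                    # aligned within tolerance
        robot.execute_move(lateral=-rho, rotate=-theta)  # small corrective move, then stop
    return False
```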
Proceedings of SPIE | 2010
Alan Bird; Scott A. Anderson; Dale C. Linne von Berg; Morgan Davidson; Niel Holt; Melvin R. Kruer; Michael L. Wilson
EyePod is a compact survey and inspection day/night imaging sensor suite for small unmanned aircraft systems (UAS). EyePod generates georeferenced image products in real-time from visible near infrared (VNIR) and long wave infrared (LWIR) imaging sensors and was developed under the ONR-funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program. FEATHAR is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL), and FEATHAR's goal is to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The EyePod suite consists of two VNIR/LWIR (day/night) gimbaled sensors that, combined, provide broad area survey and focused inspection capabilities. Each EyePod sensor pairs an HD visible EO sensor with a LWIR bolometric imager providing precision geo-referenced and fully digital EO/IR NITFS output imagery. The LWIR sensor is mounted to a patent-pending jitter-reduction stage to correct for the high-frequency motion typically found on small aircraft and unmanned systems. Details will be presented on both the wide-area and inspection EyePod sensor systems, their modes of operation, and results from recent flight demonstrations.
Unmanned Ground Vehicle Technology Conference | 2003
Carl G. Wood; Trent Perry; Douglas Cook; Russell Maxfield; Morgan Davidson
Through funding from the US Army Tank-Automotive and Armaments Command's (TACOM) Intelligent Mobility Program, Utah State University's (USU) Center for Self-Organizing and Intelligent Systems (CSOIS) has developed the T-series of omni-directional robots based on the USU omni-directional vehicle (ODV) technology. The ODV provides independent computer control of steering and drive in a single wheel assembly. By putting multiple omni-directional (OD) wheels on a chassis, a vehicle is capable of uncoupled translational and rotational motion. Previous robots in the series, the T1, T2, T3, ODIS, ODIS-T, and ODIS-S, have all used OD wheels based on electric motors. The T4 weighs approximately 1400 lbs and features a four-wheel drive configuration. Each wheel assembly consists of a hydraulic drive motor and a hydraulic steering motor. A gasoline engine is used to power both the hydraulic and electrical systems. The paper presents an overview of the mechanical design of the vehicle as well as potential uses of this technology in fielded systems.
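The claim that independent control of steering and drive at each wheel decouples translation from rotation follows from standard rigid-body kinematics: each wheel must realize the velocity v_body + ω × r_i, where r_i is the wheel's position on the chassis. The following is a minimal sketch of that relation, not drawn from the paper; the function name and interface are assumptions.

```python
import math

def odv_wheel_commands(vx, vy, omega, wheel_positions):
    """Steering angle and drive speed for each independently controlled
    ODV wheel, given a desired body twist (vx, vy, omega).

    Standard rigid-body kinematics: the wheel at chassis position (px, py)
    must realize v_wheel = v_body + omega x r, which lets translation and
    rotation be commanded independently.
    """
    commands = []
    for px, py in wheel_positions:
        wx = vx - omega * py          # x-component of the required wheel velocity
        wy = vy + omega * px          # y-component of the required wheel velocity
        steer = math.atan2(wy, wx)    # independent steering angle
        speed = math.hypot(wx, wy)    # independent drive speed
        commands.append((steer, speed))
    return commands
```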
Chemical, Biological, Radiological, Nuclear, and Explosives (CBRNE) Sensing XIX | 2018
Michael Wojcik; Alan Bird; Jason Wooden; James Q. Peterson; Morgan Davidson; Monte Frandsen
The system and mechanical design of a four-wavelength lidar system is described. The system is designed to be maximally adaptable to the deployment scenario in terms of both size/weight/power and detection application. The wavelengths included in the system are 266 nm, 355 nm, 1064 nm, and 1574 nm, all generated from Nd:YAG-based pump laser sources. The system is designed to have a useful range from 400 meters to 5,000 meters, depending on the wavelength and atmospheric conditions.