Allen D. Wu
Georgia Institute of Technology
Publication
Featured research published by Allen D. Wu.
Journal of Aerospace Computing, Information, and Communication | 2005
Allen D. Wu; Eric N. Johnson; Alison A. Proctor
Published in the Journal of Aerospace Computing, Information, and Communication, Vol. 2, September 2005
Journal of Field Robotics | 2013
Girish Chowdhary; Eric N. Johnson; Daniel Magree; Allen D. Wu; Andy Shein
GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision aided Inertial Navigation Systems (V-INSs) have been too computationally intensive or do not have sufficient integrity for closed-loop flight. We provide an affirmative answer to the question of whether V-INSs can be used to sustain prolonged real-world GPS-denied flight by presenting a V-INS that is validated through autonomous flight-tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft unmanned aircraft systems (UASs). The architecture efficiently combines visual feature information from a monocular camera with measurements from inertial sensors. Inertial measurements are used to predict frame-to-frame transition of online selected feature locations, and the difference between predicted and observed feature locations is used to bind in real-time the inertial measurement unit drift, estimate its bias, and account for initial misalignment errors. A novel algorithm to manage a library of features online is presented that can add or remove features based on a measure of relative confidence in each feature location. The resulting V-INS is sufficiently efficient and reliable to enable real-time implementation on resource-constrained aerial vehicles. The presented algorithms are validated on multiple platforms in real-world conditions: through a 16-min flight test, including an autonomous landing, of a 66 kg rotorcraft UAV operating in an uncontrolled outdoor environment without using GPS and through a Micro-UAV operating in a cluttered, unmapped, and gusty indoor environment.
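The core of the filter described above is predicting where each tracked feature should appear in the next frame from inertially propagated pose, then using the pixel innovation to bound drift. A minimal sketch of that prediction step is below; the yaw-only rotation and camera parameters (f, cx, cy) are illustrative assumptions, not the paper's implementation.

```python
import math

def project(feature_w, cam_pos, yaw, f=300.0, cx=160.0, cy=120.0):
    """Pinhole projection of a world-frame feature into the image,
    using a camera pose propagated from inertial measurements.
    The camera looks along +x of the yaw-rotated frame."""
    dx = feature_w[0] - cam_pos[0]
    dy = feature_w[1] - cam_pos[1]
    dz = feature_w[2] - cam_pos[2]
    # World -> camera frame (yaw-only rotation for brevity)
    xc = math.cos(yaw) * dx + math.sin(yaw) * dy    # forward (optical axis)
    yc = -math.sin(yaw) * dx + math.cos(yaw) * dy   # right
    zc = dz                                         # down
    return (cx + f * yc / xc, cy + f * zc / xc)

# Innovation: observed minus IMU-predicted pixel location. The filter
# uses this residual to bound inertial drift and estimate sensor bias.
predicted = project((10.0, 1.0, 0.0), (0.0, 0.0, 0.0), 0.0)
observed = (192.0, 118.5)
innovation = (observed[0] - predicted[0], observed[1] - predicted[1])
```

A feature straight ahead on the optical axis projects to the principal point (cx, cy), which is a quick sanity check for the model.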
AIAA Guidance, Navigation, and Control Conference and Exhibit | 2006
Eric N. Johnson; Michael A. Turbe; Allen D. Wu; Suresh K. Kannan; James C. Neidhoefer
Fixed-wing unmanned aerial vehicles (UAVs) with the ability to hover have significant potential for applications in urban or other constrained environments where the combination of fast speed, endurance, and stable hovering flight can provide strategic advantages. This paper discusses the use of dynamic inversion with neural network adaptation to provide an adaptive controller capable of transitioning a fixed-wing UAV to and from hovering flight in a nearly stationary position. This approach allows utilization of the entire low-speed flight envelope, even beyond stall conditions. The method is applied to the GTEdge, an 8.75-foot-wingspan fixed-wing aerobatic UAV that has been fully instrumented for autonomous flight. Results are presented from actual flight-test experiments in which the airplane transitions from high-speed steady flight into a stationary hover and then back.
Journal of Guidance, Control, and Dynamics | 2008
Eric N. Johnson; Allen D. Wu; James C. Neidhoefer; Suresh K. Kannan; Michael A. Turbe
Linear systems can be used to adequately model and control an aircraft in either ideal steady-level flight or in ideal hovering flight. However, constructing a single unified system capable of adequately modeling or controlling an airplane in steady-level flight and in hovering flight, as well as during the highly nonlinear transitions between the two, requires the use of more complex systems, such as scheduled-linear, nonlinear, or stable adaptive systems. This paper discusses the use of dynamic inversion with real-time neural network adaptation as a means to provide a single adaptive controller capable of controlling a fixed-wing unmanned aircraft system in all three flight phases: steady-level flight, hovering flight, and the transitions between them. Having a single controller that can achieve and transition between steady-level and hovering flight allows utilization of the entire low-speed flight envelope, even beyond stall conditions. This method is applied to the GTEdge, an eight-foot wingspan, fixed-wing unmanned aircraft system that has been fully instrumented for autonomous flight. This paper presents data from actual flight-test experiments in which the airplane transitions from high-speed, steady-level flight into a hovering condition and then back again.
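The dynamic-inversion-with-adaptation structure described above can be sketched for a scalar plant: a PD pseudo-control is computed, the adaptive (neural network) correction is subtracted to cancel model error, and the result is passed through the approximate inverse model. All gains and model parameters below are illustrative assumptions, and the adaptive term is taken as an input rather than a trained network.

```python
def dynamic_inversion(x, xdot, x_cmd, nu_ad,
                      a_hat=-1.0, b_hat=2.0, kp=4.0, kd=3.0):
    """Adaptive dynamic inversion for the approximate scalar model
    xddot = a_hat * x + b_hat * u.

    nu_ad is the adaptive-element output (the neural network correction)
    that cancels the error between the inversion model and the true plant.
    """
    nu_pd = kp * (x_cmd - x) - kd * xdot   # PD pseudo-control
    nu = nu_pd - nu_ad                     # subtract adaptive correction
    u = (nu - a_hat * x) / b_hat           # invert the approximate model
    return u, nu
```

By construction, feeding u back through the model `a_hat*x + b_hat*u` returns exactly the commanded pseudo-control nu, which is the defining property of the inversion.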
Journal of Guidance, Control, and Dynamics | 2012
Girish Chowdhary; D. Michael Sobers; Chintasid Pravitra; Claus Christmann; Allen D. Wu; Hiroyuki Hashimoto; Chester Ong; Roshan Kalghatgi; Eric N. Johnson
This paper describes the design and flight test of a completely self-contained autonomous indoor miniature unmanned aerial system (M-UAS). Guidance, navigation, and control algorithms are presented, enabling the M-UAS to autonomously explore cluttered indoor areas without relying on any off-board computation or external navigation aids such as the Global Positioning System (GPS). The system uses a scanning laser rangefinder and a streamlined simultaneous localization and mapping (SLAM) algorithm to provide a position and heading estimate, which is combined with other sensor data to form a six-degree-of-freedom inertial navigation solution. This enables an accurate estimate of the vehicle attitude, relative position, and velocity. The state information, with a self-generated map, is used to implement a frontier-based exhaustive search of an indoor environment. Improvements to existing guidance algorithms balance exploration with the need to remain within sensor range of indoor structures such that the SLAM algorithm has sufficient information to form a reliable position estimate. A dilution of precision metric is developed to quantify the effect of environment geometry on the SLAM pose covariance, which is then used to update the two-dimensional position and heading in the navigation filter. Simulation and flight-test results validate the presented algorithms.
AIAA Guidance, Navigation and Control Conference and Exhibit | 2008
Allen D. Wu; Eric N. Johnson
The problems of vision-based localization and mapping are currently highly active areas of research for aerial systems. With a wealth of information available in each image, vision sensors allow vehicles to gather data about their surrounding environment in addition to inferring own-ship information. However, algorithms for processing camera images are often cumbersome for the limited computational power available onboard many unmanned aerial systems. This paper therefore investigates a method for incorporating an inertial measurement unit together with a monocular vision sensor to aid in the extraction of information from camera images, and hence reduce the computational burden for this class of platforms. Feature points are detected in each image using a Harris corner detector, and these feature measurements are statistically corresponded across each captured image using knowledge of the vehicle’s pose. The investigated methods employ an Extended Kalman Filter framework for estimation. Real-time hardware results are presented using a baseline configuration in which a manufactured target is used for generating salient feature points, and vehicle pose information is provided by a high precision motion capture system for comparison purposes.
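The "statistical correspondence" step described above, matching observed feature points to EKF-predicted ones, is commonly done with Mahalanobis gating against a chi-square threshold. A minimal 2-D sketch follows; the gate value (95% for 2 degrees of freedom) and the hand-inverted 2x2 covariance are illustrative, not the paper's exact formulation.

```python
def mahalanobis_gate(predicted, measured, S, gate=5.99):
    """Decide whether a measured pixel corresponds to a predicted feature.

    S is the 2x2 innovation covariance; gate=5.99 is the 95% chi-square
    threshold for 2 degrees of freedom. Returns (accepted, distance^2).
    """
    vx = measured[0] - predicted[0]
    vy = measured[1] - predicted[1]
    # Invert the 2x2 covariance directly
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    inv = [[ S[1][1] / det, -S[0][1] / det],
           [-S[1][0] / det,  S[0][0] / det]]
    d2 = (vx * (inv[0][0] * vx + inv[0][1] * vy)
          + vy * (inv[1][0] * vx + inv[1][1] * vy))
    return d2 <= gate, d2
```

A nearby measurement passes the gate while a distant one is rejected, which is what keeps spurious corners from corrupting the filter.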
AIAA Guidance, Navigation, and Control Conference | 2011
Girish Chowdhary; D. Michael Sobers; Chintasid Pravitra; Hans Claus Christmann; Allen D. Wu; Hiroyuki Hashimoto; Chester Ong; Roshan Kalghatgi; Eric N. Johnson
This paper describes the details of a Quadrotor miniature unmanned aerial system capable of autonomously exploring cluttered indoor areas without relying on any external navigational aids such as GPS. A streamlined Simultaneous Localization and Mapping (SLAM) algorithm is implemented onboard the vehicle to fuse information from a scanning laser range sensor, an inertial measurement unit, and an altitude sonar to provide relative position, velocity, and attitude information. This state information, with a self-generated map, is used to implement a frontier-based exhaustive search of an indoor environment. To ensure the SLAM algorithm has sufficient information to form a reliable solution, the guidance algorithm ensures the vehicle approaches frontier waypoints through a path that remains within sensor range of indoor structures. Along with a detailed description of the system, simulation and hardware testing results are presented.
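Frontier-based exploration, used in both indoor-UAS papers above, selects as waypoints the free cells of the occupancy grid that border unknown space. A minimal sketch on a 2-D grid (cell encoding and neighborhood are illustrative assumptions):

```python
FREE, OCC, UNK = 0, 1, -1

def frontiers(grid):
    """Return free cells adjacent (4-connected) to unknown space in a
    2-D occupancy grid; these are the candidate exploration waypoints."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNK:
                    out.append((r, c))
                    break
    return out
```

The guidance layer described in the abstract would then rank these frontier cells, preferring approach paths that keep indoor structures within laser range so SLAM stays well conditioned.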
Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering | 2014
Syed Irtiza Ali Shah; Allen D. Wu; Eric N. Johnson
In this work, a real-time vision-based algorithm has been developed and implemented on a flying robot, in order to detect and identify a light beacon in the presence of excessive colored noise and interference. Starting from very basic and simple image analysis techniques including color histograms, filtering techniques, and color space analyses, typical pixel-based characteristics or a model of the light beacon has been progressively established. It has been found that not only are various color space-based characteristics significant, but also the relationships between various channels across different color spaces are of great consequence in a beacon detection problem, specifically one involving a blue light-emitting diode. A block-based search algorithm comprising multiple thresholds and a linear confidence level calculation has been implemented to search for the established model characteristics in real-time video image data. When excessive noise was encountered during flight tests, a simple and low-cost noise and interference filter was developed; this filter very effectively handled all noise encountered in real time. The proposed work was successfully implemented and utilized on Georgia Tech's participating aircraft for the International Aerial Robotics Competition, organized by the Association for Unmanned Vehicle Systems International, for detection of a blue light-emitting diode. Major contributions of this work include establishing a multiple-threshold search and detection algorithm based not only on various color channels but also on their relationships, and handling as much as 40% noisy or interfered video data, with successful practical implementation and demonstration of the proposed approach.
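The abstract's key idea, thresholding not just individual channels but the relationships between them, can be sketched as follows for a blue LED. The specific threshold and margin values, and the fraction-of-hits confidence measure, are illustrative assumptions standing in for the paper's tuned multiple-threshold model.

```python
def is_beacon_pixel(r, g, b, min_b=180, margin=60):
    """Per-pixel test combining an absolute threshold (blue must be
    bright) with cross-channel thresholds (blue must dominate red and
    green by a margin), mirroring the channel-relationship idea."""
    return b >= min_b and (b - r) >= margin and (b - g) >= margin

def block_confidence(block):
    """Linear confidence for a block of (r, g, b) pixels: the fraction
    of pixels passing the per-pixel beacon test."""
    hits = sum(1 for (r, g, b) in block if is_beacon_pixel(r, g, b))
    return hits / len(block)
```

A block-based search would slide this confidence over the frame and report blocks whose confidence exceeds a detection threshold, which is what makes the scheme robust to scattered colored noise.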
Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering | 2014
Syed Irtiza Ali Shah; Eric N. Johnson; Allen D. Wu; Yoko Watanabe
Finding the location of feature points in 3D space from 2D vision data in structured environments has been done successfully for years and has been applied effectively on industrial robots. Miniature flying robots operating in unknown environments have stringent weight, space, and security constraints. For such vehicles, this work attempts to reduce the number of vision sensors to a single camera. At first, feature points are detected in the image using a Harris corner detector, and their measurements are then statistically corresponded across various images using knowledge of the vehicle's pose from the onboard inertial measurement unit. The first approach attempted is ego-motion perpendicular to the camera axis, for which acceptable results for 3D feature point locations have been achieved. Next, forward translations along the camera axis have also been attempted with acceptable results, except for a small region around the focus of expansion, which is an improvement over previous relevant work. The 3D location map of feature points thus obtained can be used for trajectory planning while ensuring collision avoidance through 3D space. Reducing the vision sensors to a single camera while utilizing minimum ego-motion space for 3D feature point location is a significant contribution of this work.
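The first case above, ego-motion perpendicular to the camera axis, reduces to stereo-style triangulation: the known translation acts as a baseline and the pixel disparity between the two views gives depth. A minimal sketch, with illustrative focal length and principal point, not the paper's calibration:

```python
def triangulate_lateral(u1, u2, baseline, f=300.0, cx=160.0):
    """Recover a feature's depth and lateral offset from two views taken
    before and after a known translation (baseline) perpendicular to the
    camera axis, using the horizontal pixel disparity u1 - u2."""
    disparity = u1 - u2
    z = f * baseline / disparity    # depth along the camera axis
    x = (u1 - cx) * z / f           # lateral offset in the first view
    return x, z
```

For a feature at lateral offset 1 m and depth 5 m, a 0.5 m sideways motion produces a 30-pixel disparity under these parameters, and the inversion recovers the original coordinates. Near the focus of expansion the disparity vanishes, which is exactly the degenerate region the abstract notes for axial motion.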
AIAA Infotech at Aerospace Conference and Exhibit 2011 | 2011
Hiroyuki Hashimoto; Allen D. Wu; Girish Chowdhary; Eric N. Johnson
This paper describes the design and development of the Georgia Tech Quadrotor (GTQ) Unmanned Aerial System (UAS). The GTQ is an autonomous quadrotor helicopter capable of exploring cluttered indoor areas without relying on external navigational aids such as GPS. It weighs around 1600 grams and has a width of about 60 cm. The GTQ uses an off-the-shelf quadrotor platform and is equipped with off-the-shelf avionics and sensor packages using custom software and interface electronics. Similar platforms have previously used an off-board computer to achieve laser-aided inertial navigation due to the limited onboard computational power. The GTQ, on the other hand, is capable of exploring indoor areas fully autonomously using only the processing power onboard the aircraft. The GTQ achieves this by using an elaborate navigation algorithm that fuses information from a laser range sensor, an inertial measurement unit, and a sonar altitude sensor to form accurate estimates of the vehicle attitude, velocity, and position relative to indoor structures. The overall architecture and hardware that make this possible are discussed in detail.
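The fusion step described above, folding a sonar altitude reading into an inertially propagated estimate, can be illustrated with a scalar Kalman measurement update. This is a generic textbook update, not the GTQ's actual filter, and the numbers below are purely illustrative.

```python
def kalman_update_1d(x, P, z, R):
    """Scalar Kalman measurement update: fuse a measurement z with
    variance R (e.g. a sonar altitude reading) into the current state
    estimate x with variance P (e.g. inertially propagated altitude)."""
    K = P / (P + R)             # Kalman gain: trust ratio of the two sources
    x_new = x + K * (z - x)     # move the estimate toward the measurement
    P_new = (1 - K) * P         # uncertainty shrinks after the update
    return x_new, P_new
```

With equal variances the update splits the difference between prediction and measurement and halves the uncertainty, which is the intuition behind fusing the sonar with the inertial solution.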