
Publications


Featured research published by Alison A. Proctor.


Journal of Aerospace Computing, Information, and Communication | 2005

Vision-Aided Inertial Navigation for Flight Control

Allen D. Wu; Eric N. Johnson; Alison A. Proctor

Published in the Journal of Aerospace Computing, Information, and Communication, Vol. 2, September 2005


AIAA Guidance, Navigation, and Control Conference and Exhibit | 2004

Vision-Only Aircraft Flight Control Methods and Test Results

Alison A. Proctor; Eric N. Johnson

An unmanned aerial vehicle usually carries an array of sensors whose output is used to estimate the vehicle's attitude, velocity, and position. This paper details the development of control strategies for a glider that is capable of flying from a starting point to an ending location using only a single vision sensor. Using vision to control an aircraft presents a few unique challenges. Firstly, absolute state measurements are not available from an image. Secondly, in order to maintain adequate control of the aircraft, the images must be processed at a fast rate. The image processor utilizes an integral image representation and a rejective cascade filter to find and classify simple features in the images, reducing the image to the most probable pixel location of the objective. The navigation algorithms use an extended Kalman filter to generate state estimates based on measurements obtained from the imagery. The algorithms are tested through the flight testing of a glider instrumented only with a single camera.
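The integral image mentioned in this abstract is what makes fast per-frame feature tests feasible: once the cumulative sums are built, the sum of any rectangular region costs four array lookups regardless of the region's size. A minimal sketch of the idea (the function names are ours for illustration, not the paper's implementation):

```python
import numpy as np

def integral_image(img):
    """Cumulative sum over rows then columns: ii[y, x] holds the sum
    of all pixels above and to the left of (y, x), inclusive."""
    return img.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, top, left, bottom, right):
    """Sum of pixels in the inclusive box, using at most four lookups
    into the integral image, independent of box size."""
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total
```

Because each rectangle sum is constant-time, evaluating many candidate feature windows per frame stays cheap enough for the high image rates the control loop demands.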


Journal of Field Robotics | 2006

Vision-only Control and Guidance for Aircraft

Alison A. Proctor; Eric N. Johnson; Thomas B. Apker

An unmanned aerial vehicle usually carries an array of sensors whose output is used to estimate vehicle attitude, velocity, and position. This paper details the development of guidance, navigation, and control strategies for a glider, which is capable of flying a terminal trajectory to a known fixed object using only a single vision sensor. Controlling an aircraft using only vision presents two unique challenges: First, absolute state measurements are not available from a single image; and second, the images must be collected and processed at a high rate to achieve the desired controller performance. The image processor utilizes an integral image representation and a rejective cascade filter to find and classify simple features in the images, reducing the image to the most probable pixel location of the destination object. Then, an extended Kalman filter uses measurements obtained from a single image to estimate the states that would otherwise be unobservable in a single image. In this research, the flights are constrained to keep the destination object in view. The approach is validated through simulation. Finally, experimental data from autonomous flights of a glider, instrumented only with a single nose-mounted camera, intercepting a target window during short low-level flights, are presented.
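The extended Kalman filter these papers rely on follows the standard predict/update cycle: propagate the state through the dynamics, then correct it with measurements extracted from the image. A generic sketch of that cycle (a minimal illustration with assumed function names, not the authors' filter, whose state and measurement models are specific to the vision problem):

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    # Propagate the state through the (possibly nonlinear) dynamics f
    # and the covariance through its Jacobian F, adding process noise Q.
    return f(x), F @ P @ F.T + Q

def ekf_update(x, P, z, h, H, R):
    # Correct the prediction with a measurement z; h maps the state to
    # the expected measurement, and H is its Jacobian.
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

In the vision-only setting, z would be the pixel coordinates of the tracked object, and repeated updates from a sequence of images are what make states unobservable from any single image recoverable over time.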


AIAA Guidance, Navigation, and Control Conference and Exhibit | 2005

Vision-only Approach and Landing

Alison A. Proctor; Eric N. Johnson

Landing an aircraft autonomously in a remote landing zone presents several challenges. Firstly, the exact location, orientation, and elevation of the landing zone are not always known; secondly, the accuracy of the aircraft's navigation solution is not always sufficient for this type of precision maneuver. This paper explores a method for estimating the relative position and attitude of the aircraft with respect to a marked landing area using only the images from a single camera. The corners of the landing zone are marked with red beacons. The positions of these beacons in the camera image are extracted using simple and efficient image processing techniques. Then the locations of the corners are used as the measurements for an extended Kalman filter, which is designed to estimate the relative position, velocity, attitude, and angular rates of the aircraft. The performance of this navigation algorithm is demonstrated using simulation.


1st UAV Conference | 2002

Development of an Autonomous Aerial Reconnaissance System at Georgia Tech

Alison A. Proctor; Suresh K. Kannan; Chris Raabe; Henrik B. Christophersen; Eric N. Johnson

Presented at the Association for Unmanned Vehicle Systems International Unmanned Systems Symposium and Exhibition, Baltimore, Maryland, July 2003.


Document Analysis Systems | 2003

Vision-only aircraft flight control

C. De Wagter; Alison A. Proctor; Eric N. Johnson

Building aircraft with navigation and control systems that can complete flight tasks is complex, and often involves integrating information from multiple sensors to estimate the state of the vehicle. This paper describes a method in which a glider can fly precisely from a starting point to a predetermined end location (target) using vision only. Using vision to control an aircraft presents unique challenges. First, a high rate of images is required in order to maintain tracking and to keep the glider on target in a moving air mass. Second, absolute distance and angle measurements to the target are not readily available when the glider does not have independent measurements of its own position. The method presented here uses an integral image representation of the video input for the analysis. The integral image, which is obtained by integrating the pixel intensities across the image, is reduced to a probable target location by performing a cascade of feature-matching functions. The cascade is designed to eliminate the majority of the potential targets in a first pruning using a computationally inexpensive process. Then, more exact and computationally expensive processes are used on the few remaining candidates, thereby dramatically decreasing the processing required per image. The navigation algorithms presented in this paper use a Kalman filter to estimate the attitude and glideslope required, based on measurements of the target in the image. The effectiveness of the algorithms is demonstrated through simulation of a small glider instrumented with only a simulated camera.
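The rejective cascade described in this abstract can be sketched as a chain of increasingly expensive tests, each applied only to the candidates that survived the previous stage (a generic sketch; the stage predicates below are placeholders, not the paper's classifiers):

```python
def rejective_cascade(candidates, stages):
    """Filter candidates through an ordered chain of predicates.
    Cheap early stages prune most candidates, so the expensive
    later stages only ever see a handful of survivors."""
    survivors = list(candidates)
    for test in stages:
        survivors = [c for c in survivors if test(c)]
        if not survivors:
            break  # nothing left to classify
    return survivors

# Example with stand-in tests: a cheap parity check first, then a
# stricter check playing the role of an expensive classifier.
stages = [lambda c: c % 2 == 0, lambda c: c > 90]
```

Ordering the stages from cheapest to most expensive is what keeps the average per-image cost low: most candidate pixels are rejected before any costly computation runs.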


IEEE Transactions on Aerospace and Electronic Systems | 2005

Visual search automation for unmanned aerial vehicles

Eric N. Johnson; Alison A. Proctor; Jincheol Ha; Allen R. Tannenbaum

The design, development, and testing of an unmanned aerial vehicle (UAV) with automated capabilities is described: searching a prescribed area, identifying a specific building within that area based on a small sign located on one wall, and then identifying an opening into that building. This includes a description of the automated search system along with simulation and flight test results. Results include successful evaluation at the McKenna Military Operations in Urban Terrain flight test site.


Journal of Aerospace Computing, Information, and Communication | 2004

Development and Test of Highly Autonomous Unmanned Aerial Vehicles

Eric N. Johnson; Alison A. Proctor; Jincheol Ha; Allen R. Tannenbaum

Published in Journal of Aerospace Computing, Information, and Communication, Vol. 1, Issue 12, December 2004.


American Control Conference | 2005

Recent flight test results of active-vision control systems

Eric N. Johnson; Alison A. Proctor; Jincheol Ha; Yoko Watanabe

This tutorial session covers recent results using methods that utilize 2D and 3D imagery (e.g., from LADAR, visual, FLIR, acoustic-location) to enable aerial vehicles to autonomously detect and prosecute targets in uncertain 3D environments. This includes segmentation approaches, active contours, adaptive control, estimation theory, and optical flow. Recent flight test results utilizing a small glider and a small helicopter as well as a high-fidelity simulation of multiple airplanes are discussed.


AIAA Guidance, Navigation, and Control Conference and Exhibit | 2003

Latency Compensation in an Adaptive Flight Controller

Alison A. Proctor; Eric N. Johnson

The effectiveness of both methods in handling system latency is demonstrated and discussed through simulation and the flight testing of a small autonomous helicopter, with the results showing an increase in available bandwidth over the equivalent controller without latency compensation.

Collaboration


Dive into Alison A. Proctor's collaboration.

Top Co-Authors

Eric N. Johnson (Georgia Institute of Technology)
Jincheol Ha (Georgia Institute of Technology)
Henrik B. Christophersen (Georgia Institute of Technology)
Allen D. Wu (Georgia Institute of Technology)
Brendan Andrus (Georgia Institute of Technology)
C. De Wagter (Georgia Institute of Technology)
D. Mike (Georgia Institute of Technology)
Hal Gates (Georgia Institute of Technology)