Adam MacDonald
Air Force Institute of Technology
Publications
Featured research published by Adam MacDonald.
Optical Engineering | 2006
Adam MacDonald; Stephen C. Cain; Ernest E. Armstrong
Recent developments in staring focal plane technology have spawned significant interest in the application of gated laser radar systems to battlefield remote sensing. Such environments are characterized by rapidly changing atmospheric seeing conditions and significant image distortion caused by long slant-range paths through the densest regions of the atmosphere. Limited weight, space, and computational resources tend to prohibit the application of adaptive optics systems to mitigate atmospheric image distortion. We demonstrate and validate the use of a fast, iterative, maximum a posteriori (MAP) estimator to estimate both the original target scene and the ensemble-averaged atmospheric optical transfer function parameterized by Fried's seeing parameter. Wide-field-of-view sensor data are simulated to emulate images collected on an experimental test range. Simulated and experimental multiframe motion-compensated average images are deconvolved by the MAP estimator to produce most-likely estimates of the truth image as well as the atmospheric seeing condition. For comparison, Fried's seeing parameter is estimated from experimentally collected images using a knife-edge response technique. The MAP estimator is found to yield seeing condition estimates within approximately 6% using simulated speckle images, and within approximately 8% of knife-edge-derived truth for a limited set of experimentally collected image data.
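The MAP estimator above parameterizes the ensemble-averaged atmospheric OTF by Fried's seeing parameter. As a hedged illustration only (my sketch, not the authors' code), the function below evaluates Fried's classic parametric long- and short-exposure atmospheric OTF models; the function name and default arguments are mine.

```python
import numpy as np

def atmospheric_otf(nu, r0, D, short_exposure=True):
    """Ensemble-averaged atmospheric OTF parameterized by Fried's r0.

    nu : normalized spatial frequency in [0, 1] (1.0 = diffraction cutoff D/lambda)
    r0 : Fried's seeing parameter [m]
    D  : aperture diameter [m]
    The short-exposure form includes Fried's tilt-removal factor (1 - nu^(1/3)),
    reflecting the frame-by-frame motion compensation applied before averaging.
    """
    nu = np.clip(np.asarray(nu, dtype=float), 0.0, 1.0)
    tilt = (1.0 - nu ** (1.0 / 3.0)) if short_exposure else 1.0
    return np.exp(-3.44 * (D * nu / r0) ** (5.0 / 3.0) * tilt)
```

Smaller r0 (worse seeing) drives the OTF toward zero at mid frequencies; a MAP estimator of this kind searches over r0 so that the parametric OTF, applied to the scene estimate, best explains the motion-compensated average image.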
Optical Engineering | 2006
Adam MacDonald; Stephen C. Cain
Gated laser radar imaging systems hold unique promise for long-distance remote sensing applications. Short-exposure speckle imagery from wide field-of-view (FOV) systems may be used to jointly compute maximum likelihood estimates of the remote scene and of the atmospheric seeing condition parameterized by Fried's parameter. Previous research indicated that employing the short-exposure optical transfer function (OTF) within the deconvolution algorithm yields slightly pessimistic estimates of Fried's parameter. It was postulated that the short-exposure OTF retained excessive high-spatial-frequency components when applied to wide FOV systems, yielding seeing condition estimates that were slightly lower than if the atmospheric conditions admitted an anisoplanatic system imaging model. To better estimate Fried's parameter, an anisoplanatic OTF (AOTF) was developed using a tilt-only phase correlation approximation. This AOTF was used together with the short-exposure OTF within the MAP algorithm, and the estimated seeing conditions were compared for both simulated and experimental wide FOV scenarios. It was found that the additional anisoplanatic blur components modeled by the AOTF improved the accuracy of the estimate of Fried's parameter from 5% to within 2% using simulated imagery, and from 8.6% to within 2.9% using experimentally collected image data.
Optical Science and Technology, the SPIE 49th Annual Meeting | 2004
Adam MacDonald; Stephen C. Cain; Ernest E. Armstrong
A new image reconstruction algorithm is used to remove the effect of atmospheric turbulence on motion-compensated frame-averaged data collected by a laser-illuminated 2-D imaging system. The algorithm simultaneously computes a high-resolution image and Fried's seeing parameter via a MAP estimation technique. This blind deconvolution algorithm differs from other techniques in that it parameterizes the unknown component of the impulse response as an average short-exposure point spread function. The utility of the approach lies in its application to laser-illuminated imaging, where laser speckle and turbulence effects dominate other sources of error and the field of view of the sensor greatly exceeds the isoplanatic angle.
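The joint scene/seeing estimator is specified in the paper itself; as a simplified stand-in (my own sketch, not the published algorithm), the code below alternates Richardson-Lucy deconvolution — the maximum-likelihood fixed point for Poisson noise — with a coarse grid search over candidate Fried parameters, scoring each candidate by Poisson log-likelihood. All function names, the long-exposure OTF choice, and the default aperture diameter are my assumptions.

```python
import numpy as np

def fried_psf(shape, r0, D=0.2):
    """PSF from a parametric long-exposure Fried OTF, peak at index (0, 0)."""
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    nu = np.clip(np.hypot(fy[:, None], fx[None, :]) / 0.5, 0.0, 1.0)
    otf = np.exp(-3.44 * (D * nu / r0) ** (5.0 / 3.0))
    psf = np.maximum(np.real(np.fft.ifft2(otf)), 0.0)
    return psf / psf.sum()

def rl_deconvolve(img, psf, n_iter=30):
    """Richardson-Lucy: est <- est * corr(img / (est (*) psf), psf)."""
    otf = np.fft.fft2(psf)
    est = np.full(img.shape, img.mean())
    for _ in range(n_iter):
        blurred = np.real(np.fft.ifft2(np.fft.fft2(est) * otf))
        ratio = img / np.maximum(blurred, 1e-12)
        est = est * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
    return est

def estimate_r0(img, candidates):
    """Return the candidate r0 whose deconvolved-then-reblurred model
    best explains img under a Poisson log-likelihood."""
    def loglike(r0):
        psf = fried_psf(img.shape, r0)
        est = rl_deconvolve(img, psf)
        model = np.maximum(
            np.real(np.fft.ifft2(np.fft.fft2(est) * np.fft.fft2(psf))), 1e-12)
        return np.sum(img * np.log(model) - model)
    return max(candidates, key=loglike)
```

This is only the skeleton of a joint estimator: the paper's MAP formulation additionally uses a prior on the scene and the short-exposure OTF form rather than the long-exposure one used here for brevity.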
Optical Science and Technology, the SPIE 49th Annual Meeting | 2004
Adam MacDonald; Ernest E. Armstrong; Stephen C. Cain
Registration of individual images remains a significant problem in the generation of accurate images collected using coherent imaging systems. An investigation of the performance of eight distinct image registration algorithms was conducted using data collected from a coherent optical imaging system developed by the Air Force Research Laboratory, Sensors Directorate (AFRL/SNJT). A total of 400 images of three distinct scenes were collected by AFRL/SNJT and made available to the Air Force Institute of Technology (AFIT) for this study. Imagery of wheeled vehicles supporting resolution and uniform target boards was collected at ranges of 3 and 10 kilometers. The algorithms under study were developed by scientists and engineers at AFRL and had varying levels of performance in terms of image misregistration and execution time. These eight algorithms were implemented on a general-purpose computer running the MATLAB simulation environment. The algorithms compared included block-match, cross-correlation, cross-search, directional-search, gradient-based, hierarchical-block, three-step, and vector-block methods. It was found that the cross-correlation, gradient-based, and vector-block search techniques typically had the lowest error metric. The vector-block and cross-correlation methods proved to have the fastest execution times while not suffering significant error degradation when estimating the registration shift of the test images.
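Of the eight algorithms compared, cross-correlation registration is the easiest to sketch. The snippet below (my illustration, not AFRL's MATLAB implementation) estimates an integer translational shift from the peak of the FFT-based circular cross-correlation.

```python
import numpy as np

def xcorr_shift(ref, img):
    """Estimate the integer (row, col) shift mapping ref onto img,
    i.e. img ~= np.roll(ref, shift, axis=(0, 1)), via FFT cross-correlation."""
    xc = np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))))
    peak = np.unravel_index(np.argmax(xc), xc.shape)
    # Unwrap circular peak indices to signed shifts.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, xc.shape))
```

For example, a frame rolled by (5, -3) pixels relative to a reference yields an estimated shift of (5, -3), which the averaging stage can then undo before accumulating the frame.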
Proceedings of SPIE | 2005
Adam MacDonald; Stephen C. Cain
Considerable tactical utility is anticipated for systems that coherently illuminate remote target scenes to form detailed images over long, turbulent optical paths through wide-FOV optical components. Typical viewing conditions greatly exceed the isoplanatic angle, and isoplanatic patch sizes approach the area of individual pixels on the imaging array. Although adaptive optical systems have met with limited success in the restoration of anisoplanatically formed images, such hardware is unsuitable for tactical applications and requires multiple point-source imagery to adapt the optical system to the turbulence. Our previous work demonstrated a fast, information-theoretic postprocessing algorithm that seeks to jointly maximize the likelihood of the image given a remote scene, as well as an estimate of Fried's seeing parameter to describe current atmospheric conditions. That research employed a short-exposure OTF to model the anisoplanatic system response for a series of motion-compensated images. Although results from the algorithm were encouraging, it was understood that the short-exposure OTF provided an optimistic model of the overall anisoplanatic blur function caused by turbulence. A more accurate OTF accounts not only for the global shift of each image collected in the ensemble, but also for the blur induced by random and uncorrelated shifts of each of the many isoplanatic patches collected at the imaging device. This research complements the blind deconvolution algorithm by deriving an anisoplanatic OTF (AOTF) that better models the blur function of a motion-compensated ensemble of images. Results are presented that compare the recovered images obtained using both the short-exposure OTF and the AOTF.
IEEE Aerospace Conference | 2007
Stephen C. Cain; Adam MacDonald
Current efforts aimed at detecting and identifying near-Earth objects (NEOs) that pose potential risks to Earth use moderately sized telescopes combined with image processing algorithms to detect asteroid motion. These algorithms detect objects via assumptions about the point-like nature of the target. This assumption breaks down in poor seeing conditions, when the object no longer resembles a point source. This paper documents an alternative approach involving the use of many smaller apertures whose images are fused using Bayesian decision techniques that assume nothing about the shape of the target in order to determine the presence of a NEO. The technique is shown to be robust in the presence of atmospheric turbulence. Simulation studies are conducted showing the feasibility of the proposed technique.
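The paper's fusion rule deliberately assumes nothing about target shape; as a simpler illustrative stand-in (my sketch, and one that does assume a known template), the code below shows the fusion step itself: under independent Gaussian noise, per-aperture log-likelihood ratios for "target present" versus "noise only" simply add across apertures before thresholding.

```python
import numpy as np

def fused_llr(images, template, sigma):
    """Sum of per-aperture Gaussian log-likelihood ratios for
    H1: image = template + noise  versus  H0: image = noise.
    Independent apertures multiply likelihoods, so their LLRs add."""
    energy = np.sum(template ** 2)
    return sum((np.sum(im * template) - 0.5 * energy) / sigma ** 2
               for im in images)

def detect(images, template, sigma, threshold=0.0):
    """Declare a detection when the fused evidence exceeds the threshold."""
    return fused_llr(images, template, sigma) > threshold
```

The appeal of fusing many small apertures is visible in the sum: each aperture contributes evidence, so weak per-aperture signals can still produce a confident joint decision.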
IEEE Transactions on Image Processing | 2007
Adam MacDonald; Stephen C. Cain; Mark E. Oxley
Recent interest in the collection of remote laser radar imagery has motivated novel systems that process temporally contiguous frames of collected imagery to produce an average image that reduces laser speckle, increases image SNR, decreases the deleterious effects of atmospheric distortion, and enhances image detail. This research seeks an algorithm based on Bayesian estimation theory to select those frames from an ensemble that increase spatial resolution compared to simple unweighted averaging of all frames. The resulting binary-weighted motion-compensated frame average is compared to the unweighted average using simulated and experimental data collected from a fielded laser vision system. Image resolution is significantly enhanced, as quantified by the estimated atmospheric seeing parameter through which the average image was formed.
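The paper selects frames by a Bayesian criterion; as a rough stand-in (my sketch, not the published estimator), the greedy rule below admits a frame into the binary-weighted average only if it does not reduce a classical image-sharpness metric of the running average.

```python
import numpy as np

def sharpness(im):
    """Sum of squared normalized intensities, a classic
    speckle-imaging sharpness measure (higher = sharper)."""
    p = im / im.sum()
    return np.sum(p ** 2)

def binary_weights(frames):
    """Greedy binary frame weighting: keep frame i only if adding it
    to the running average does not reduce the sharpness metric."""
    w = np.zeros(len(frames), dtype=int)
    w[0] = 1
    for i in range(1, len(frames)):
        kept = [f for f, wi in zip(frames, w) if wi]
        cur = sharpness(np.mean(kept, axis=0))
        trial = sharpness(np.mean(kept + [frames[i]], axis=0))
        if trial >= cur:
            w[i] = 1
    return w
```

A badly misregistered frame smears energy in the average, lowering the sharpness of the trial average, so it receives weight zero and is excluded, which mirrors the binary weighting described in the abstract.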
IEEE Aerospace Conference | 2006
Adam MacDonald
Recent interest in the collection of laser radar imagery has motivated the development of automatic, accurate image registration techniques to reduce laser speckle, increase image signal-to-noise ratio, decrease the deleterious effects of atmospheric tip/tilt, and enhance image detail. This research seeks a new method to assign weights to each of the frames that have been previously registered using an arbitrary translational registration algorithm. Frames with poor registration are assigned low weights according to a maximum likelihood cost function, allowing such frames to be either discarded from the ensemble or re-registered using an alternate algorithm. The estimator is based on the underlying statistics of the intensity distribution of partially coherent illumination, which has been found to be well modeled by the negative binomial probability mass function. Simulated and experimentally collected data are presented to support the development and performance of the misregistered-frame detector.
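The detector rests on the negative binomial statistics of partially coherent speckle. The helpers below (function names mine; not the paper's exact cost function) evaluate that log-PMF for photocount data, so a frame whose counts score a markedly lower likelihood than the rest of the ensemble can be flagged as a candidate misregistration.

```python
import math

def nb_logpmf(n, nbar, M):
    """Log of the negative binomial PMF for partially coherent light.

    n    : integer photocount
    nbar : mean photocount
    M    : number of speckle modes (M -> infinity recovers Poisson)
    """
    return (math.lgamma(n + M) - math.lgamma(n + 1) - math.lgamma(M)
            - n * math.log(1.0 + M / nbar)
            - M * math.log(1.0 + nbar / M))

def frame_loglike(counts, nbar, M):
    """Log-likelihood of a frame's photocounts under the fitted ensemble
    statistics; unusually low values flag candidate misregistered frames
    for rejection or re-registration."""
    return sum(nb_logpmf(int(n), nbar, M) for n in counts)
```

In a weighting scheme of the kind the abstract describes, frame_loglike would feed the per-frame weights: frames scoring well keep full weight, while outliers are down-weighted or sent back through an alternate registration algorithm.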
IEEE Aerospace Conference | 2009
Adam MacDonald
Laser radar has enjoyed significant advances over the past decade. Novel sensor topologies, compact laser illuminators, and advanced signal processing have enabled the construction of low power, portable 2-D and 3-D laser vision systems. The applications of such systems range from surveillance, targeting, weapons guidance, and remote scene measurement, to target identification and atmospheric characterization. This paper serves to assemble some recent significant examples of laser radar in the context of emergent tactical applications. Strengths and limitations of competing topologies are also examined.
IEEE Aerospace Conference | 2008
Adam MacDonald; Michael J. Shepherd
Considerable time and money are spent on the modification of fleet-support test aircraft in order to enable carriage of novel research experiments that require in-flight test and demonstration. In many cases, the cost of aircraft modification exceeds the cost of the flight test hardware and flight hours. The USAF Test Pilot School (TPS) has worked closely with the Air Force Institute of Technology (AFIT) to develop and flight test the Reconfigurable Airborne Sensor, Communication and Laser (RASCAL) pod. The pod concept will revolutionize the way that USAF TPS conducts Test Management Project training, and will enable rapid transition of cutting-edge technology under development at AFIT and national laboratories to the demanding airborne flight environment. The development and flight test of RASCAL are discussed, as are future concepts of operation expected to be conducted at the USAF TPS.