Frank W. Moore
University of Alaska Anchorage
Publications
Featured research published by Frank W. Moore.
congress on evolutionary computation | 2005
Frank W. Moore; Patrick Marshall; Eric J. Balster
This investigation uses a genetic algorithm to optimize coefficient sets describing inverse transforms that significantly reduce the mean squared error of reconstructed images. Quantization error introduced during image compression and reconstruction is one of the worst noise sources, because the information discarded during quantization is permanently lost. Our approach establishes an adaptive filtering methodology for evolving transforms that outperform discrete wavelet inverse transforms for the reconstruction of images subjected to quantization error. Inverse transforms evolved against a single training image consistently generalize to exhibit superior performance against other images from the test set.
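A minimal sketch of the noise source this abstract targets: uniform scalar quantization of transform coefficients followed by reconstruction. The step size and signal below are illustrative, not taken from the paper.

```python
import numpy as np

def quantize_dequantize(coeffs, step):
    """Uniform scalar quantization followed by reconstruction;
    the rounding permanently discards information."""
    return np.round(coeffs / step) * step

def mse(a, b):
    """Mean squared error between original and reconstructed values."""
    return float(np.mean((a - b) ** 2))

rng = np.random.default_rng(0)
signal = rng.normal(size=256)
recon = quantize_dequantize(signal, step=0.5)
err = mse(signal, recon)   # nonzero: quantization loss is irreversible
```

This residual error is what the evolved inverse transforms compensate for; coarser step sizes (higher compression) produce larger `err`.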
congress on evolutionary computation | 2005
Frank W. Moore
This paper describes a genetic algorithm that evolves optimized sets of coefficients for one-dimensional signal reconstruction under lossy conditions due to quantization. Beginning with a population of mutated copies of the set of coefficients describing a standard wavelet-based inverse transform, the genetic algorithm systematically evolves a new set of coefficients that significantly reduces mean squared error (relative to the performance of the selected wavelet) for various classes of one-dimensional signals. The evolved transforms also outperform wavelets when subsequently tested against random signals from the same class.
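The seeding strategy the abstract describes — starting from mutated copies of a standard coefficient set rather than random vectors — can be sketched as follows. The base coefficients, fitness function, and GA parameters here are toy stand-ins, not the paper's actual transform or settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for a standard wavelet's coefficients; any
# fixed real-valued filter would do for this sketch.
BASE_COEFFS = np.array([0.5, 0.5, 0.5, -0.5])

def fitness(coeffs, signal):
    """Toy fitness: MSE between the signal and a crude reconstruction
    through the candidate filter (illustrative only; the paper
    reconstructs through a full inverse transform)."""
    recon = np.convolve(signal, coeffs, mode="same")
    return float(np.mean((signal - recon) ** 2))

def evolve(signal, pop_size=20, generations=30, sigma=0.05):
    # Seed the population with mutated copies of the standard coefficients.
    pop = [BASE_COEFFS + sigma * rng.normal(size=BASE_COEFFS.size)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, signal))   # lower MSE is better
        parents = pop[: pop_size // 2]               # truncation selection
        pop = parents + [p + sigma * rng.normal(size=p.size)
                         for p in parents]           # mutate survivors
    return min(pop, key=lambda c: fitness(c, signal))
```

Seeding near a known-good solution keeps early generations in a region of coefficient space that already reconstructs signals reasonably well, so selection refines rather than rediscovers the transform.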
systems, man and cybernetics | 2007
Brendan Babb; Frank W. Moore
Modern fingerprint compression and reconstruction standards, such as those used by the US Federal Bureau of Investigation (FBI), are based upon the 9/7 discrete wavelet transform. This paper describes how a genetic algorithm was used to evolve wavelet and scaling numbers for each level of a multiresolution analysis (MRA) transform that consistently outperforms the 9/7 wavelet for fingerprint compression and reconstruction tasks. Our evolved transforms also improve upon wavelets optimized by a genetic algorithm via the lifting scheme, and thus establish a new state-of-the-art in this important application area.
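A sketch of the per-level degrees of freedom the abstract mentions: a multiresolution analysis in which each level carries its own (scaling, wavelet) coefficient pair. The two-tap Haar-like filters below are an illustrative starting point, far simpler than the 9/7 filters the paper evolves; the point is only that the levels are independently tunable.

```python
import numpy as np

def analyze(signal, filters):
    """One-dimensional MRA in which each level has its own
    (scaling, wavelet) coefficient pair -- the quantities a GA
    could tune independently per level."""
    approx, details = signal, []
    for h, g in filters:                         # one filter pair per level
        even, odd = approx[0::2], approx[1::2]
        details.append(g[0] * even + g[1] * odd)  # wavelet (detail) band
        approx = h[0] * even + h[1] * odd         # scaling (approx) band
    return approx, details

# Haar-like starting point; a GA would perturb each level separately.
s = np.sqrt(0.5)
filters = [((s, s), (s, -s))] * 3                # 3 levels, 2-tap pairs
approx, details = analyze(np.arange(8.0), filters)
```

With identical filters at every level this reduces to an ordinary MRA; letting each tuple differ is what gives the evolved transform its extra flexibility.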
midwest symposium on circuits and systems | 2005
Brendan Babb; Frank W. Moore
This research established a methodology for using a genetic algorithm to evolve coefficients for matched forward and inverse transform pairs. Beginning with an initial population of randomly mutated copies of the coefficients representing a standard wavelet, our GA consistently evolved transforms that outperformed wavelets for image compression and reconstruction applications under conditions subject to quantization error. Transforms optimized against a single representative image also outperformed wavelets when subsequently tested against other images from our test set. The new methodology has the potential to revolutionize the signal and image processing fields.
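The notion of a matched forward/inverse transform pair can be illustrated with the orthonormal Haar transform: when no quantization intervenes, the matched inverse reconstructs the signal exactly. The GA described above perturbs the coefficients of both halves jointly; the Haar coefficients here are only a familiar example of such a pair.

```python
import numpy as np

def forward(signal):
    """Forward half of a matched transform pair (orthonormal Haar)."""
    s = np.sqrt(0.5)
    even, odd = signal[0::2], signal[1::2]
    return s * (even + odd), s * (even - odd)    # approx, detail

def inverse(approx, detail):
    """Inverse half matched to forward(): exact reconstruction
    when no quantization intervenes."""
    s = np.sqrt(0.5)
    out = np.empty(approx.size * 2)
    out[0::2] = s * (approx + detail)
    out[1::2] = s * (approx - detail)
    return out

x = np.array([4.0, 2.0, 5.0, 7.0])
assert np.allclose(inverse(*forward(x)), x)      # matched pair round-trips
```

Under quantization the round-trip is no longer exact, which is why evolving the forward and inverse coefficients together (rather than inverting a fixed forward transform) can reduce the reconstruction error.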
national aerospace and electronics conference | 1997
Frank W. Moore; O.N. Garcia
Genetic programming systems typically use a fixed training population to optimize programs according to problem-specific fitness criteria. The best-of-run programs evolved by these systems frequently exhibit optimal (or near-optimal) performance in competitive survival environments explicitly represented by the training population. Unfortunately, subsequent performance of these programs is often less than optimal when situations arise that were not explicitly anticipated during program evolution. This paper describes a new methodology which promises to reduce the brittleness of best-of-run programs evolved by genetic programming systems. Instead of using a fixed set of fitness cases, the new methodology creates a new set of randomly-generated fitness cases prior to the evaluation of each generation of the evolutionary process. A genetic programming system that evolves optimized maneuvers for an extended 2D pursuer/evader problem was modified for this study. The extended 2D pursuer/evader problem is a competitive zero-sum game in which an evader attempts to escape a faster, more agile pursuer by performing specific combinations of thrusting and turning maneuvers. The pursuer uses the highly effective proportional navigation algorithm to control its trajectory towards the evader. The original genetic programming system used a fixed training set of pursuers. Each of these pursuers was uniquely identified by two parameters: the initial distance from pursuer to evader, and the angle that the velocity vector of the evader makes relative to the pursuer/evader line-of-sight at the time the pursuer is launched. The modified system implemented for this project was identical to the original system, except that it used random distances and angles to create a new set of fitness cases prior to each generation of the genetic programming run. Best-of-run programs were independently evolved using fixed and randomly-generated fitness cases. 
These programs were subsequently tested against a large, representative fixed population of pursuers to determine their relative effectiveness. This paper describes the implementation of both the original and modified systems, and summarizes the results of these tests.
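The contrast between the original and modified systems comes down to when the fitness cases are sampled. A minimal sketch, with illustrative parameter ranges and a placeholder evaluation function (the real system simulates each pursuer/evader engagement):

```python
import random

random.seed(42)

def make_fitness_cases(n):
    """Fresh pursuer scenarios: (initial distance, launch angle) pairs.
    The numeric ranges here are illustrative, not the paper's."""
    return [(random.uniform(1000.0, 9000.0),     # initial distance
             random.uniform(-180.0, 180.0))      # angle vs. line-of-sight
            for _ in range(n)]

def evaluate(program, cases):
    """Placeholder evaluation: a real system would simulate each
    engagement and score the evader's survival."""
    return sum(program(d, a) for d, a in cases)

# Original system: one fixed list, reused every generation.
fixed_cases = make_fitness_cases(20)

# Modified system: regenerate the cases before each generation,
# so no single set of pursuers can be overfit.
for generation in range(5):
    cases = make_fitness_cases(20)
    # ... evaluate and breed the population against `cases` ...
```

Resampling the cases each generation is what discourages the brittle, training-set-specific maneuvers the paper observes with a fixed population.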
genetic and evolutionary computation conference | 2009
Brendan Babb; Frank W. Moore; Michael R. Peterson
In this paper, we describe how an evolution strategy optimizes multiresolution analysis (MRA) transforms that outperform wavelets for satellite image compression and reconstruction under conditions subject to quantization error. At three multiresolution levels and 64:1 quantization, our best evolved transform reduces mean squared error (MSE) in reconstructed images by an average of 11.71% (0.54 dB) in comparison to the 9/7 Cohen-Daubechies-Feauveau (CDF) wavelet, while continuing to match the 9/7's compression capabilities. This result establishes a new state-of-the-art for quantized digital satellite images.
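The paired percentage/decibel figures quoted in these abstracts follow from the standard conversion between an MSE ratio and decibels:

```python
import math

def mse_reduction_db(reduction_fraction):
    """Convert a fractional MSE reduction into the equivalent gain in
    decibels: 10 * log10(mse_reference / mse_new)."""
    return 10.0 * math.log10(1.0 / (1.0 - reduction_fraction))

gain = mse_reduction_db(0.1171)   # 11.71% reduction -> ~0.54 dB
```

The same formula reproduces the other reported pairings, e.g. the 33.78% reduction quoted later corresponds to roughly 1.79 dB.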
electronic commerce | 2002
Frank W. Moore
The missile countermeasures optimization problem is a complex strategy optimization problem that combines aircraft maneuvers with additional countermeasures in an attempt to survive attack from a single surface-launched, anti-aircraft missile. Classic solutions require the evading aircraft to execute specific sequences of maneuvers at precise distances from the pursuing missile and do not effectively account for uncertainty about the type and/or current state of the missile. This paper defines a new methodology for solving the missile countermeasures optimization problem under conditions of uncertainty. The resulting genetic programming system evolves programs that combine maneuvers with such countermeasures as chaff, flares, and jamming to optimize aircraft survivability. This methodology may be generalized to solve strategy optimization problems for intelligent, autonomous agents operating under conditions of uncertainty.
genetic and evolutionary computation conference | 2007
Michael R. Peterson; Gary B. Lamont; Frank W. Moore; Patrick Marshall
In recent years, wavelets have been widely applied in state-of-the-art image processing algorithms, providing efficient compression while maintaining superior image quality. However, wavelet performance may not be sufficient when extreme compression ratios are required. Defense applications often require robust transforms that simultaneously minimize bandwidth requirements and image resolution loss. Image processing algorithms take advantage of quantization to provide substantial lossy compression ratios at the expense of resolution. Recent research demonstrates that genetic algorithms (GAs) evolve filters outperforming standard discrete wavelet transforms in conditions subject to high quantization error. Evolved filters must be trained using images appropriate to their intended application. We present a set of fifty satellite images used to evolve image transforms appropriate for satellite and unmanned aerial vehicle (UAV) reconnaissance applications. We identify the best training and test images. Image transforms evolved using appropriate training images reduce the mean squared error (MSE) by an average of greater than 15% across the entire image set under conditions subject to high quantization error.
IEEE Transactions on Control Systems and Technology | 2002
Frank W. Moore
Establishes a methodology for minimizing the peak and/or aggregate radar cross sections (RCSs) of autonomous precision guided munitions (APGMs) as they ingress to a selected target through a radar threat environment. This research demonstrates how route planning may be combined with the simultaneous specification of aerodynamically feasible yaw and bank angles to significantly reduce APGM observability. The approach described in the paper has the potential to considerably enhance APGM effectiveness against enemy defense systems.
Proceedings of SPIE | 2009
Brendan Babb; Frank W. Moore; Michael R. Peterson
This paper describes the automatic discovery, via an Evolution Strategy with Covariance Matrix Adaptation (CMA-ES), of vectors of real-valued coefficients representing matched forward and inverse transforms that outperform the 9/7 Cohen-Daubechies-Feauveau (CDF) discrete wavelet transform (DWT) for satellite image compression and reconstruction under conditions subject to quantization error. The best transform evolved during this study reduces the mean squared error (MSE) present in reconstructed satellite images by an average of 33.78% (1.79 dB), while maintaining the average information entropy (IE) of compressed images at 99.57% in comparison to the wavelet. In addition, this evolved transform achieves 49.88% (3.00 dB) average MSE reduction when tested on 80 images from the FBI fingerprint test set, and 42.35% (2.39 dB) average MSE reduction when tested on a set of 18 digital photographs, while achieving average IE of 104.36% and 100.08%, respectively. These results indicate that our evolved transform greatly improves the quality of reconstructed images without substantial loss of compression capability over a broad range of image classes.