Clayton V. Stewart
George Mason University
Publications
Featured research published by Clayton V. Stewart.
Proceedings of the IEEE | 1993
Clayton V. Stewart; Baback Moghaddam; Kenneth J. Hintz; Leslie M. Novak
The application of fractal random process models and their related scaling parameters as features in the analysis and segmentation of clutter in high-resolution, polarimetric synthetic aperture radar (SAR) imagery is demonstrated. Specifically, the fractal dimension of natural clutter sources, such as grass and trees, is computed and used as a texture feature for a Bayesian classifier. The SAR shadows are segmented in a separate manner using the original backscatter power as a discriminant. The proposed segmentation process yields a three-class segmentation map for the scenes considered in this study (with three clutter types: shadows, trees, and grass). The difficulty of computing texture metrics in high-speckle SAR imagery is addressed. In particular, a two-step preprocessing approach consisting of polarimetric minimum speckle filtering followed by noncoherent spatial averaging is used. The relevance of the resulting segmentation maps to constant-false-alarm-rate (CFAR) radar target detection techniques is discussed.
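To make the texture-feature idea concrete, here is a minimal box-counting estimator of fractal dimension for an image patch, offered as an illustrative sketch rather than the authors' implementation; the thresholding rule, box sizes, and patch size are assumptions.

```python
import numpy as np

def box_counting_dimension(patch, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a binarized image patch by
    box counting: count occupied boxes at several scales and fit
    log(count) against log(1/size)."""
    binary = patch > patch.mean()          # simple occupancy threshold (assumed)
    counts = []
    for s in sizes:
        h, w = binary.shape
        occupied = 0
        # Partition the patch into s x s boxes and count non-empty ones.
        for i in range(0, h - s + 1, s):
            for j in range(0, w - s + 1, s):
                if binary[i:i + s, j:j + s].any():
                    occupied += 1
        counts.append(max(occupied, 1))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope  # the fitted slope approximates the fractal dimension

# Example: a rough random texture gives a dimension near 2;
# smoother textures give lower values.
rng = np.random.default_rng(0)
patch = rng.random((64, 64))
print(box_counting_dimension(patch))
```

In the segmentation scheme described above, such a local estimate would be computed over a sliding window and fed to the Bayesian classifier as a texture feature.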
Pattern Recognition | 1994
Clayton V. Stewart; Yi-Chuan Lu; Victor J. Larson
Neural learning techniques, the Self-Organizing Feature Map and Learning Vector Quantization, have been applied to the automatic target recognition (ATR) problem using high range resolution radar target signatures. The database was collected by placing the targets on a rotary turntable and slowly rotating them through a complete 360° of azimuth while the radar signatures were recorded. Our pattern recognition system is composed of a feature identifier and a classifier. A simple Euclidean distance classifier using the identified features provides a baseline of 97% mean probability of correct classification.
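A hedged sketch of the baseline classifier mentioned in the abstract: a Euclidean nearest-prototype rule over a codebook of stored signatures, such as one produced by SOFM/LVQ training. The prototype values and labels below are invented for illustration.

```python
import numpy as np

def nearest_prototype_classify(signature, prototypes, labels):
    """Baseline Euclidean classifier: assign the range profile to the
    class of the nearest stored prototype (e.g., one prototype per
    azimuth bin per target in a trained codebook)."""
    dists = np.linalg.norm(prototypes - signature, axis=1)
    return labels[np.argmin(dists)]

# Illustrative use with made-up prototypes (not the paper's data):
prototypes = np.array([[1.0, 0.2, 0.1], [0.1, 0.9, 0.8]])
labels = ["tank", "truck"]
print(nearest_prototype_classify(np.array([0.9, 0.3, 0.2]), prototypes, labels))
```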
Characterization, Propagation, and Simulation of Sources and Backgrounds | 1991
Baback Moghaddam; Kenneth J. Hintz; Clayton V. Stewart
This paper illustrates the use of fractal geometry and fractal metrics for analysis and characterization of natural textures and clutter in IR images in the wavelength band of 2-5 micrometers. In addition to the local fractal dimension, the lacunarity of textures is also briefly investigated. The addition of lacunarity significantly improves the pattern classification performance and is an important part of a complete fractal description of natural textures. A new measurement technique, based on the statistics of a space-filling curve, is presented. Specifically, a space-filling scan of an image texture is used to estimate the fractal dimension of the corresponding intensity surface. This unique one-dimensional representation is also used for measuring local texture features such as granularity and lacunarity.
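The space-filling-curve measurement can be illustrated with a simplified stand-in: a boustrophedon (serpentine) scan in place of a true space-filling curve, followed by a variogram-style estimate of the fractal dimension of the resulting 1-D profile. The lag set and the D = 2 - H relation for a 1-D profile are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def serpentine_scan(image):
    """Flatten a 2-D image into 1-D with a boustrophedon scan
    (a simple stand-in for the space-filling curve used in the paper)."""
    rows = [row if i % 2 == 0 else row[::-1] for i, row in enumerate(image)]
    return np.concatenate(rows)

def variogram_dimension(signal, lags=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a 1-D intensity profile from
    the scaling of mean squared increments:
    E|X(t+h) - X(t)|^2 ~ h^(2H), with D = 2 - H."""
    v = [np.mean((signal[h:] - signal[:-h]) ** 2) for h in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(v), 1)
    H = slope / 2.0
    return 2.0 - H

# A noisy image yields a rough profile with dimension near 2.
rng = np.random.default_rng(1)
img = rng.random((32, 32))
print(variogram_dimension(serpentine_scan(img)))
```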
Proceedings of SPIE | 1992
Clayton V. Stewart; Victor J. Larson; James Doss Halsey
A radar scattering model was used to provide targets for a classification study of high range resolution radar signatures. The intrinsic dimensionality of these signatures was calculated using kth-nearest-neighbor information. Two classifier paradigms were implemented: a Gaussian classifier and a synthetic discriminant function (SDF) classifier. The Gaussian classifier was more robust in the presence of white Gaussian noise, while the SDF approach was more robust for large angle bin sizes.
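As an illustration of the first paradigm (not the paper's code), a Gaussian classifier scores a signature by class-conditional log-likelihood; in the paper's setting, the class means and covariances would presumably be estimated per angle bin from the modeled signatures.

```python
import numpy as np

def gaussian_classify(x, means, covs, labels):
    """Gaussian classifier: score each class by its log-likelihood
    under a multivariate normal model and pick the maximum."""
    best, best_score = None, -np.inf
    for mu, cov, lab in zip(means, covs, labels):
        diff = x - mu
        inv = np.linalg.inv(cov)
        # Log-likelihood up to a constant shared by all classes.
        score = -0.5 * (diff @ inv @ diff + np.log(np.linalg.det(cov)))
        if score > best_score:
            best, best_score = lab, score
    return best

# Illustrative two-class call with made-up statistics:
means = [np.zeros(2), np.ones(2)]
covs = [np.eye(2), np.eye(2)]
print(gaussian_classify(np.array([0.9, 0.8]), means, covs, ["clutter", "target"]))
```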
Image Understanding in the '90s: Building Systems that Work | 1991
Baback Moghaddam; Kenneth J. Hintz; Clayton V. Stewart
This paper describes a new method for building object models for the purpose of overlapped object recognition. The method relies on local fragments of the boundary to derive a set of autoregressive parameters that serve to detect similar boundary fragments. First, a rule-based algorithm which detects the occlusion of two or more objects is introduced. This algorithm makes use of a heuristic rule which takes into account the number of intersection points of the boundary with a standard invariant shape, as well as global features (area, perimeter), to confirm the presence of occlusion. The object is then decomposed into visible parts by first applying a polygonal approximation method and then using the concave vertices obtained in that step. The decomposition algorithm prepares the input data for the description of the model and the object through the autoregressive filter method.
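To illustrate the autoregressive description of boundary fragments, a least-squares AR fit such as the following could produce the matching features; the model order and the use of 1-D radius/curvature samples along the fragment are assumptions of this sketch.

```python
import numpy as np

def ar_features(fragment, order=4):
    """Fit an order-p autoregressive model to a 1-D boundary fragment
    (e.g., radius or curvature samples) by least squares; the AR
    coefficients serve as the fragment's matching feature vector."""
    X = np.column_stack([fragment[order - k - 1: len(fragment) - k - 1]
                         for k in range(order)])   # lagged samples
    y = fragment[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Stand-in boundary samples, not data from the paper:
frag = np.sin(np.linspace(0, 3 * np.pi, 60))
print(ar_features(frag))
```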
Proceedings of SPIE | 1993
Clayton V. Stewart; Yi-Chuan Lu; Victor J. Larson
Radar target classification performance depends greatly on how the classifier represents the strongly angle-dependent radar target signatures. This paper compares the performance of classifiers that represent radar target signatures using vector quantization (VQ) and learning vector quantization (LVQ). The classifier performance is evaluated with a set of high resolution millimeter-wave radar data from four ground vehicles (Camaro, van, pickup, and bulldozer). LVQ explicitly includes classification performance in its data representation criterion, whereas VQ only makes use of a distortion measure such as mean square distance. The classifier that uses LVQ to represent the radar data achieves a much higher probability of correct classification than the one that uses VQ.
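The distinction the abstract draws can be seen in the LVQ1 update rule, sketched below: unlike a plain VQ codebook update, it moves a prototype away from a sample when the labels disagree. The learning rate and the toy data are illustrative assumptions.

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr=0.05):
    """One LVQ1 update: pull the winning prototype toward the sample
    if its label matches, push it away otherwise. This is how LVQ
    bakes classification error into the codebook, whereas plain VQ
    only minimizes reconstruction distortion."""
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    sign = 1.0 if proto_labels[winner] == y else -1.0
    prototypes[winner] += sign * lr * (x - prototypes[winner])
    return prototypes

# Illustrative use with made-up prototypes:
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = ["clutter", "target"]
print(lvq1_step(protos, labels, np.array([0.9, 1.1]), "target"))
```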
Proceedings of SPIE | 1992
Clayton V. Stewart; Victor J. Larson; David Zink
This paper presents an analysis of the ability to classify fixed-wing aircraft based on their acoustic signatures. Since only a small amount of data was available, the paper focuses on feature extraction. We analyzed a data set for a single propeller aircraft and a single jet aircraft. Both spectral and cepstral analyses were performed on the data, and both nonparametric and parametric methods were used to estimate the power spectrum. For the propeller aircraft, the frequency ratio between spectral lines was found to be a useful feature for classification. The cepstra of both the propeller and jet aircraft acoustic data were found to contain features related to engine rotation rates.
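A minimal sketch of the cepstral feature: the real cepstrum of a harmonic engine signature peaks at the quefrency of the rotation period. The synthetic 50 Hz signal below stands in for the aircraft recordings, which are not reproduced here.

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum.
    Periodic engine harmonics show up as a peak at the quefrency
    equal to the rotation period."""
    spectrum = np.abs(np.fft.rfft(x))
    return np.fft.irfft(np.log(spectrum + 1e-12))  # guard against log(0)

# Synthetic example: a 50 Hz harmonic series sampled at 8 kHz.
fs = 8000
t = np.arange(fs) / fs
x = sum(np.sin(2 * np.pi * 50 * k * t) for k in range(1, 6))
cep = real_cepstrum(x)
peak = np.argmax(cep[50:fs // 2]) + 50   # skip the low-quefrency bulge
print(fs / peak)                         # ~50 Hz fundamental estimate
```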
Proceedings of SPIE | 1992
Clayton V. Stewart; Baback Moghaddam; Kenneth J. Hintz
This paper demonstrates the application of fractal random process models and their related scaling parameters as features in the analysis and segmentation of clutter in high-resolution polarimetric synthetic aperture radar (SAR) imagery. Specifically, the fractal dimension of natural clutter sources, such as grass and trees, is computed and used as a texture feature for a Bayesian classifier. The SAR shadows are segmented in a separate manner using the original backscatter power as a discriminant. The proposed segmentation process yields a three-class segmentation map for the scenes considered in this study (with three clutter types: shadows, trees and grass). The difficulty of computing texture metrics in high-speckle SAR imagery is also addressed. In particular, a two-step preprocessing approach consisting of polarimetric minimum speckle filtering followed by non-coherent spatial averaging is used. The relevance of the resulting segmentation maps to constant-false-alarm-rate (CFAR) target detection techniques is also discussed.
Substance Identification Technologies | 1994
Clayton V. Stewart; Sherman Karp; Victor J. Larson
This paper presents a new semi-coherent, quadratic, eigenimage-based technique for detecting stationary targets in SAR data. The new detector differs from previous work in that it models the SAR signal as a multi-pixel, multi-band complex random signal with unknown spatial position and orientation. The proposed detector handles the unknown orientation by modeling each target with a set of angle subclasses, and reduces complexity by using reduced-rank techniques.
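A rough sketch of the eigenimage idea under stated assumptions: estimate a correlation matrix from training chips, keep the top eigenvectors, and score a test chip by its energy in that reduced-rank subspace. In the paper's scheme one such detector would presumably be trained per angle subclass; that detail is not shown here.

```python
import numpy as np

def eigen_detector(chips, rank):
    """Build a reduced-rank quadratic detector from training chips:
    keep the top eigenvectors of the sample correlation matrix and
    score a test chip by its energy in that subspace."""
    X = np.stack([c.ravel() for c in chips])     # samples x pixels
    R = (X.conj().T @ X) / len(chips)            # sample correlation matrix
    vals, vecs = np.linalg.eigh(R)               # eigenvalues ascending
    U = vecs[:, -rank:]                          # top-rank eigenimages
    def score(chip):
        z = U.conj().T @ chip.ravel()            # project onto subspace
        return float(np.real(z.conj() @ z))      # quadratic statistic
    return score

# Illustrative use with random stand-in chips:
rng = np.random.default_rng(2)
train = [rng.standard_normal((8, 8)) for _ in range(20)]
detect = eigen_detector(train, rank=3)
print(detect(rng.standard_normal((8, 8))))
```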
Substance Identification Technologies | 1994
Kuo-Chu Chang; Clayton V. Stewart
This paper gives an overview of Bayesian network technology and a description of its potential application to data fusion. Bayesian network technology comprises representation techniques for encoding uncertain beliefs using probability theory, together with reasoning techniques for drawing inferences from such representations. The technology has been successfully applied both to tasks of assessment under uncertainty and to tasks of decision-making under uncertainty.
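A toy example of the representation-plus-reasoning pattern: a two-node network with a hidden target state and a noisy sensor report, with the posterior computed by exact enumeration. The prior and sensor probabilities are made up for illustration and do not come from the paper.

```python
# A two-node Bayesian network for sensor fusion: hidden target state T
# and a noisy sensor report S. P(T | S) follows from Bayes' rule.
p_target = 0.1                                # prior P(T = present), assumed
p_report_given_t = {True: 0.9, False: 0.2}    # sensor model P(S = alarm | T), assumed

def posterior_target():
    """Exact inference in the tiny T -> S network, given S = alarm."""
    joint_t = p_target * p_report_given_t[True]
    joint_not_t = (1 - p_target) * p_report_given_t[False]
    return joint_t / (joint_t + joint_not_t)

print(posterior_target())   # ~0.33: one alarm raises belief from 0.1
```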