Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Gregory L. Tarr is active.

Publication


Featured research published by Gregory L. Tarr.


IEEE Transactions on Neural Networks | 1999

Physiologically motivated image fusion for object detection using a pulse coupled neural network

Randy P. Broussard; Steven K. Rogers; Mark E. Oxley; Gregory L. Tarr

This paper presents the first physiologically motivated pulse coupled neural network (PCNN)-based image fusion network for object detection. Primate vision processing principles, such as expectation driven filtering, state dependent modulation, temporal synchronization, and multiple processing paths, are applied to create a physiologically motivated image fusion network. PCNNs are used to fuse the results of several object detection techniques to improve object detection accuracy. Image processing techniques (wavelets, morphological, etc.) are used to extract target features, and PCNNs are used to focus attention by segmenting and fusing the information. The object detection property of the resulting image fusion network is demonstrated on mammograms and Forward Looking Infrared (FLIR) images. The network removed 94% of the false detections without removing any true detections in the FLIR images and removed 46% of the false detections while removing only 7% of the true detections in the mammograms. The model exceeded the accuracy obtained by any individual filtering method or by logically ANDing the individual object detection technique results.
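
As a rough illustration of the pulse coupled neural network dynamics the abstract refers to (not the authors' code), the sketch below runs an Eckhorn-style feeding/linking/threshold update over a 1-D signal in C. All decay constants, gains, and the stimulus values are invented for illustration.

    /* Minimal pulse coupled neural network (PCNN) sketch in C.
       Each iteration updates feeding F, linking L, internal activity U,
       pulse output Y, and a dynamic threshold T for every element of a
       1-D stimulus. Parameter values are assumptions, not from the paper. */
    #include <stdio.h>
    #include <math.h>

    #define N 8          /* number of neurons (pixels)  */
    #define STEPS 10     /* number of PCNN iterations   */

    int main(void)
    {
        /* Illustrative stimulus (e.g. one row of a normalized image). */
        double S[N] = {0.1, 0.2, 0.9, 0.95, 0.9, 0.2, 0.1, 0.05};

        double F[N] = {0}, L[N] = {0}, U[N] = {0}, T[N], Y[N] = {0};
        double aF = 0.1, aL = 0.3, aT = 0.2;   /* decay constants (assumed)    */
        double vF = 0.5, vL = 0.5, vT = 20.0;  /* magnitude scalings (assumed) */
        double beta = 0.2;                     /* linking strength (assumed)   */

        for (int i = 0; i < N; i++) T[i] = 1.0;   /* initial thresholds */

        for (int n = 0; n < STEPS; n++) {
            double Yprev[N];
            for (int i = 0; i < N; i++) Yprev[i] = Y[i];

            for (int i = 0; i < N; i++) {
                /* Nearest-neighbour pulses provide the coupling term. */
                double nb = (i > 0 ? Yprev[i-1] : 0.0) + (i < N-1 ? Yprev[i+1] : 0.0);

                F[i] = exp(-aF) * F[i] + vF * nb + S[i];   /* feeding input  */
                L[i] = exp(-aL) * L[i] + vL * nb;          /* linking input  */
                U[i] = F[i] * (1.0 + beta * L[i]);         /* internal state */
                Y[i] = (U[i] > T[i]) ? 1.0 : 0.0;          /* pulse output   */
                T[i] = exp(-aT) * T[i] + vT * Y[i];        /* threshold rise */
            }

            printf("step %2d pulses:", n);
            for (int i = 0; i < N; i++) printf(" %.0f", Y[i]);
            printf("\n");
        }
        return 0;
    }

In a fusion setting like the one described, pulse trains from several detector-driven PCNNs would be combined, so that temporally synchronized pulses reinforce detections supported by more than one technique.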


Neurocomputing | 1993

Bayesian selection of important features for feedforward neural networks

Kevin L. Priddy; Steven K. Rogers; Dennis W. Ruck; Gregory L. Tarr; Matthew Kabrisky

This paper presents a probability of error based method of determining the saliency (usefulness) of input features and hidden nodes. We show that the partial derivative of the output nodes with respect to a given input feature yields a sensitivity measure for the probability of error. This partial derivative provides a saliency metric for determining the sensitivity of a feedforward network, trained with a mean squared error learning procedure, to a given input feature.
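
A minimal sketch of the derivative-based saliency idea (not the authors' implementation) is shown below in C: the partial derivative of the output with respect to each input feature of a tiny fixed network is estimated by central finite differences and its magnitude is averaged over a few patterns. The network weights and the data are invented.

    /* Sketch: derivative-based feature saliency for a tiny 2-2-1 network.
       Saliency of feature j is the average |d output / d x_j| over the data,
       estimated here by central finite differences. Weights/data are invented. */
    #include <stdio.h>
    #include <math.h>

    static double sigmoid(double a) { return 1.0 / (1.0 + exp(-a)); }

    /* Forward pass of a fixed 2-input, 2-hidden, 1-output network. */
    static double net(const double x[2])
    {
        static const double W1[2][2] = {{ 1.5, -0.7}, {-0.3,  2.1}}; /* hidden weights */
        static const double b1[2]    = { 0.1, -0.2};
        static const double W2[2]    = { 1.2, -1.8};                 /* output weights */
        static const double b2       = 0.05;

        double h[2], o = b2;
        for (int i = 0; i < 2; i++) {
            h[i] = sigmoid(W1[i][0]*x[0] + W1[i][1]*x[1] + b1[i]);
            o += W2[i] * h[i];
        }
        return sigmoid(o);
    }

    int main(void)
    {
        const double data[3][2] = {{0.2, 0.8}, {0.5, 0.5}, {0.9, 0.1}};
        const double eps = 1e-4;
        double saliency[2] = {0.0, 0.0};

        for (int p = 0; p < 3; p++) {
            for (int j = 0; j < 2; j++) {
                double xp[2] = {data[p][0], data[p][1]};
                double xm[2] = {data[p][0], data[p][1]};
                xp[j] += eps;
                xm[j] -= eps;
                /* Central-difference estimate of d output / d x_j. */
                saliency[j] += fabs((net(xp) - net(xm)) / (2.0 * eps));
            }
        }
        for (int j = 0; j < 2; j++)
            printf("feature %d saliency = %.4f\n", j, saliency[j] / 3.0);
        return 0;
    }

Features whose averaged derivative magnitude is near zero contribute little to the output and are candidates for removal, which is the pruning use the abstract describes.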


Advanced Optical Technologies | 1990

Artificial neural networks for automatic target recognition

Steven K. Rogers; Dennis W. Ruck; Matthew Kabrisky; Gregory L. Tarr

This paper will review recent advances in the applications of artificial neural network technology to problems in automatic target recognition. The application of feedforward networks for segmentation, feature extraction, and classification of targets in Forward Looking Infrared (FLIR) and laser radar range scenes will be presented. Biologically inspired Gabor functions will be shown to be a viable alternative to heuristic image processing techniques for segmentation. The use of local transforms such as the Gabor transform fed into a feedforward network is proposed as an architecture for neural based segmentation. Techniques for classification of segmented blobs will be reviewed along with neural network procedures for determining relevant features. A brief review of previous work on comparing neural network based classifiers to conventional Bayesian and K-nearest neighbor techniques will be presented. Results from testing several alternative learning algorithms for these neural network classifiers are presented. A technique for fusing information from multiple sensors using neural networks is presented and conclusions are made.
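
A 2-D Gabor function of the kind mentioned above is a Gaussian-windowed sinusoid. The fragment below fills one real-valued Gabor kernel in C; the kernel size, envelope width, wavelength, and orientation are arbitrary values chosen only to show the computation.

    /* Fill a real 2-D Gabor kernel: a cosine grating under a Gaussian window.
       Parameters (size, sigma, wavelength, orientation) are illustrative only. */
    #include <stdio.h>
    #include <math.h>

    #define K 9   /* kernel is K x K, K odd */

    int main(void)
    {
        const double PI = 3.14159265358979323846;
        double kernel[K][K];
        double sigma  = 2.0;         /* Gaussian envelope width    */
        double lambda = 4.0;         /* wavelength of the sinusoid */
        double theta  = PI / 4.0;    /* orientation (45 degrees)   */
        int    r      = K / 2;

        for (int y = -r; y <= r; y++) {
            for (int x = -r; x <= r; x++) {
                /* Rotate coordinates into the filter's orientation. */
                double xr =  x * cos(theta) + y * sin(theta);
                double yr = -x * sin(theta) + y * cos(theta);
                double envelope = exp(-(xr*xr + yr*yr) / (2.0 * sigma * sigma));
                double carrier  = cos(2.0 * PI * xr / lambda);
                kernel[y + r][x + r] = envelope * carrier;
            }
        }

        for (int i = 0; i < K; i++) {
            for (int j = 0; j < K; j++) printf("%6.2f ", kernel[i][j]);
            printf("\n");
        }
        return 0;
    }

In an architecture like the one proposed, an image would be convolved with a bank of such kernels at several orientations and scales, and the resulting feature planes fed to a feedforward network for segmentation.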


International Conference on Artificial Neural Networks | 1991

EFFECTIVE NEURAL NETWORK MODELING IN C

Gregory L. Tarr; Steven K. Rogers; Matthew Kabrisky; Mark E. Oxley; Kevin L. Priddy

A method for constructing neural network models is presented. Coding solutions to common neural network modeling problems are discussed which allow for expandable, portable and readable implementations. Use of dynamic allocation and table dispatch is demonstrated in object oriented, top-down design methods.
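
The abstract mentions dynamic allocation and table dispatch; a minimal sketch of what that might look like is given below, with an invented node type whose activation function is selected through a function-pointer table. The type and function names are hypothetical and not taken from the paper.

    /* Sketch of table dispatch for neural network nodes in C: the activation
       function is chosen by indexing a table of function pointers, and node
       storage is allocated dynamically. Names are invented for illustration. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    typedef double (*activation_fn)(double);

    static double act_linear(double a)  { return a; }
    static double act_sigmoid(double a) { return 1.0 / (1.0 + exp(-a)); }
    static double act_tanh(double a)    { return tanh(a); }

    /* Dispatch table: one entry per activation type. */
    static const activation_fn ACTIVATIONS[] = { act_linear, act_sigmoid, act_tanh };
    enum { ACT_LINEAR, ACT_SIGMOID, ACT_TANH };

    typedef struct {
        int     n_inputs;
        double *weights;   /* dynamically allocated, n_inputs entries */
        double  bias;
        int     act;       /* index into ACTIVATIONS                  */
    } node_t;

    static node_t *node_create(int n_inputs, int act)
    {
        node_t *n = malloc(sizeof *n);
        n->n_inputs = n_inputs;
        n->weights  = calloc((size_t)n_inputs, sizeof *n->weights);
        n->bias     = 0.0;
        n->act      = act;
        return n;
    }

    static double node_output(const node_t *n, const double *x)
    {
        double a = n->bias;
        for (int i = 0; i < n->n_inputs; i++) a += n->weights[i] * x[i];
        return ACTIVATIONS[n->act](a);       /* table dispatch */
    }

    int main(void)
    {
        double x[2] = {1.0, -2.0};
        node_t *n = node_create(2, ACT_SIGMOID);
        n->weights[0] = 0.8;
        n->weights[1] = -0.4;
        printf("node output = %f\n", node_output(n, x));
        free(n->weights);
        free(n);
        return 0;
    }

The dispatch table lets new node types be added without touching the propagation loop, which is one way to get the expandable and readable implementations the abstract aims for.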


IEEE Aerospace and Electronic Systems Magazine | 1990

AFIT neural network research

Steven K. Rogers; Dennis W. Ruck; Matthew Kabrisky; Gregory L. Tarr

A brief summary of research done at the Air Force Institute of Technology (AFIT) in the area of neural networks is provided. It has been shown that backpropagation, used for feedforward artificial neural networks, is just a degenerate version of an extended Kalman filter, and that networks can do about as well as the optimum statistical classification technique. A method of finding the importance of features for use by a neural network classifier has been determined. Techniques for using neural networks for image segmentation have been developed. In optical pattern recognition, techniques that allow the processing of real FLIR (forward-looking infrared) images with existing binary spatial light modulators have been devised. An optical direction of arrival detector applicable to laser illumination direction determination has been designed and tested; the design is similar to a fly's eye. Coated mirrors for the optical confocal Fabry-Perot interferometer have been designed, specified, fabricated, and installed. Significant progress has been made in the use of neural networks for processing multiple-feature sets for speech recognition.


International Conference on Artificial Neural Networks | 1991

ACOUSTIC ILLUSIONS: EXPECTATION DIRECTED FILTERING IN THE HUMAN AUDITORY SYSTEM

Gregory L. Tarr; Matthew Kabrisky; Steven K. Rogers; Mark E. Oxley; Kevin L. Priddy

An experiment is conducted to test expectation spectral filtering as an element in the human auditory system. The neural pathways in the auditory system are shown to include both afferent and efferent nerve bundles, of which the efferent paths conduct inhibitory signals to alter the spectral response of the sensor mechanisms in the organ of Corti, based on the expected signal. As expectation models of signal filtering are beyond the current approaches, speech recognition based on biological models may be significantly more difficult than present models would indicate.
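
Purely as an engineering analogy to the efferent inhibition described above (not a model from the paper), the sketch below attenuates bins of an observed magnitude spectrum wherever they exceed an "expected" spectrum. The spectra and the inhibition gain are invented.

    /* Toy analogy of expectation-directed spectral filtering: spectral bins
       whose observed energy exceeds the expected energy are inhibited.
       All values here are invented for illustration. */
    #include <stdio.h>

    #define BINS 8

    int main(void)
    {
        /* Hypothetical expected and observed magnitude spectra. */
        double expected[BINS] = {0.1, 0.8, 0.9, 0.3, 0.1, 0.1, 0.1, 0.1};
        double observed[BINS] = {0.2, 0.7, 0.8, 0.4, 0.6, 0.5, 0.1, 0.2};
        double filtered[BINS];
        double inhibition = 0.25;   /* gain applied to unexpected energy */

        for (int k = 0; k < BINS; k++) {
            double excess = observed[k] - expected[k];
            /* Pass expected energy, strongly attenuate unexpected energy. */
            filtered[k] = (excess > 0.0)
                        ? expected[k] + inhibition * excess
                        : observed[k];
            printf("bin %d: observed %.2f -> filtered %.2f\n",
                   k, observed[k], filtered[k]);
        }
        return 0;
    }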


Applications of Artificial Neural Networks II | 1991

Generalized neural networks for tactical target image segmentation

Gregory L. Tarr; Steven K. Rogers; Matthew Kabrisky; Mark E. Oxley; Kevin L. Priddy

A generalized formalism for feedforward neural networks is presented. This generalized architecture is shown to be capable of mapping many common neural network paradigms into a single architecture. Using an intrinsically iterable element, neural networks can be used to compute common preprocessing techniques including Karhunen-Loeve reduction, Fourier and Gabor spectral decomposition, and some wavelet techniques. The generalized architecture is applied to a problem in tactical target image segmentation.

1. A Generalized Neural Network Algorithm

A generalized formulation of a feedforward network node can be represented by the equation:

    z_k = f_h(x^T A x + W^T x + θ_k)    (1)

where A, W, and θ_k are propagation constants or weights, and z_k is the output of a particular node. This formulation, by not specifically addressing training, allows for simple implementation of many common neural network paradigms, as well as many common preprocessing techniques, including Karhunen-Loeve reduction and Fourier and Gabor spectral decomposition, in a connectionist network architecture. Cybenko showed that one hidden layer is sufficient for any multivariate function approximation. Oxley et al. proved that the output of the hidden layers can be not only a sigmoid non-linearity but also a negative exponential. While Cybenko and others ...
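
A direct transcription of equation (1) into C might look like the following; the matrix A, the vector W, the offset θ, and the input x are filled with arbitrary numbers purely to show the computation, and a sigmoid is assumed for f_h.

    /* Sketch of the generalized node of equation (1):
       z_k = f_h(x'Ax + W'x + theta_k), using a sigmoid for f_h.
       A, W, theta and x are arbitrary illustrative values. */
    #include <stdio.h>
    #include <math.h>

    #define D 2   /* input dimension */

    static double f_h(double a) { return 1.0 / (1.0 + exp(-a)); }

    static double generalized_node(const double x[D], const double A[D][D],
                                   const double W[D], double theta)
    {
        double quad = 0.0, lin = 0.0;
        for (int i = 0; i < D; i++) {
            lin += W[i] * x[i];                       /* W'x  */
            for (int j = 0; j < D; j++)
                quad += x[i] * A[i][j] * x[j];        /* x'Ax */
        }
        return f_h(quad + lin + theta);
    }

    int main(void)
    {
        double x[D]    = {0.5, -1.0};
        double A[D][D] = {{0.2, 0.0}, {0.0, -0.1}};
        double W[D]    = {1.0, 0.5};
        double theta   = -0.3;
        printf("z_k = %f\n", generalized_node(x, A, W, theta));
        return 0;
    }

Setting A to zero recovers the ordinary perceptron node, while non-zero A and other choices of f_h give the quadratic and preprocessing forms the formalism is meant to cover.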


Visual Communications and Image Processing | 1990

Artificial Neural Networks for Pattern Recognition

Steven K. Rogers; Dennis W. Ruck; Matthew Kabrisky; Gregory L. Tarr

A three unit artificial neural network (ANN) automatic target recognition (ATR) system is integrated within, and compared to, a conventional ATR system recently developed at AFIT. The integration of ANN within this existing framework allows the determination of where the benefits of using these biologically motivated processing techniques lie. The integration and testing of ANN within each of the three units constitutes the major contribution of this research. The emphasis of this paper is on the effects of learning alternatives on ATR. Several alternative feedforward networks were compared in the classifier unit.


Applications of Artificial Neural Networks | 1990

Machine recognition of atomic and molecular species using artificial neural networks

Arthur L. Sumner; Steven K. Rogers; Gregory L. Tarr; Matthew Kabrisky; David Norman

Spectral analysis involving the determination of atomic and molecular species present in a spectrum of multi-spectral data is a very time consuming task, especially considering the fact that there are typically thousands of spectra collected during each experiment. Due to the overwhelming amount of available spectral data and the time required to analyze these data, a robust automatic method for doing at least some preliminary spectral analysis is needed. This research focused on the development of a supervised artificial neural network with error correction learning, specifically a three-layer feed-forward backpropagation perceptron. The objective was to develop a neural network which would do the preliminary spectral analysis and save the analysts from the task of reviewing thousands of spectral frames. The input to the network is raw spectral data, with the output consisting of the classification of both atomic and molecular species in the source.
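
A minimal one-hidden-layer backpropagation update of the kind the abstract describes (raw spectral bins in, class scores out) might look like the sketch below. The layer sizes, starting weights, spectrum, target, and learning rate are all invented; this is not the authors' implementation.

    /* Sketch of one backpropagation step for a small spectral classifier:
       4 spectral bins -> 3 hidden sigmoid units -> 2 class outputs, trained
       with mean squared error. All sizes, weights and data are invented. */
    #include <stdio.h>
    #include <math.h>

    #define NI 4
    #define NH 3
    #define NO 2

    static double sig(double a) { return 1.0 / (1.0 + exp(-a)); }

    int main(void)
    {
        /* Hypothetical spectrum and its one-hot class target. */
        double x[NI] = {0.1, 0.9, 0.4, 0.0};
        double t[NO] = {1.0, 0.0};

        /* Small fixed starting weights (would normally be random). */
        double w1[NH][NI] = {{0.1,-0.2,0.3,0.0},{0.2,0.1,-0.1,0.4},{-0.3,0.2,0.1,0.1}};
        double b1[NH] = {0.0, 0.1, -0.1};
        double w2[NO][NH] = {{0.3,-0.2,0.1},{-0.1,0.2,0.3}};
        double b2[NO] = {0.0, 0.0};
        double lr = 0.5;

        /* Forward pass. */
        double h[NH], o[NO];
        for (int j = 0; j < NH; j++) {
            double a = b1[j];
            for (int i = 0; i < NI; i++) a += w1[j][i] * x[i];
            h[j] = sig(a);
        }
        for (int k = 0; k < NO; k++) {
            double a = b2[k];
            for (int j = 0; j < NH; j++) a += w2[k][j] * h[j];
            o[k] = sig(a);
        }

        /* Backward pass: output and hidden deltas for the MSE loss. */
        double dout[NO], dhid[NH];
        for (int k = 0; k < NO; k++)
            dout[k] = (o[k] - t[k]) * o[k] * (1.0 - o[k]);
        for (int j = 0; j < NH; j++) {
            double s = 0.0;
            for (int k = 0; k < NO; k++) s += dout[k] * w2[k][j];
            dhid[j] = s * h[j] * (1.0 - h[j]);
        }

        /* Weight updates (one gradient step). */
        for (int k = 0; k < NO; k++) {
            for (int j = 0; j < NH; j++) w2[k][j] -= lr * dout[k] * h[j];
            b2[k] -= lr * dout[k];
        }
        for (int j = 0; j < NH; j++) {
            for (int i = 0; i < NI; i++) w1[j][i] -= lr * dhid[j] * x[i];
            b1[j] -= lr * dhid[j];
        }

        printf("outputs after forward pass: %.3f %.3f\n", o[0], o[1]);
        return 0;
    }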


Applications of Artificial Neural Networks | 1990

AFIT neural network development tools and techniques for modeling artificial neural networks

Gregory L. Tarr; Dennis W. Ruck; Steven K. Rogers; Matthew Kabrisky

Modeling of artificial neural networks is shown to depend on the programming decisions made in constructing the algorithms in software. Derivation of a common neural network training rule is shown, including the effect of programming constraints. A method for constructing large scale neural network models is presented which allows for efficient use of memory hardware and graphics capabilities. Software engineering techniques are discussed in terms of design methodologies. Application of these techniques is considered for large scale problems including neural network segmentation of digital imagery for target identification.
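
One way to read "efficient use of memory hardware" is to keep all weights of a layer in a single contiguous block so the inner propagation loop walks memory sequentially. The fragment below sketches that layout in C; the structure and names are assumptions for illustration, not the paper's code.

    /* Sketch: all weights of a fully connected layer kept in one contiguous
       block, indexed as weights[out * n_in + in]. Names and sizes are
       illustrative only. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        int     n_in, n_out;
        double *weights;   /* n_out * n_in values in one block */
        double *bias;      /* n_out values                     */
    } layer_t;

    static layer_t layer_alloc(int n_in, int n_out)
    {
        layer_t l = { n_in, n_out, NULL, NULL };
        l.weights = calloc((size_t)n_in * n_out, sizeof *l.weights);
        l.bias    = calloc((size_t)n_out, sizeof *l.bias);
        return l;
    }

    static void layer_forward(const layer_t *l, const double *x, double *y)
    {
        for (int o = 0; o < l->n_out; o++) {
            double a = l->bias[o];
            const double *row = l->weights + (size_t)o * l->n_in; /* contiguous row */
            for (int i = 0; i < l->n_in; i++) a += row[i] * x[i];
            y[o] = a;
        }
    }

    int main(void)
    {
        layer_t l = layer_alloc(3, 2);
        l.weights[0] = 1.0;                /* input 0 -> output 0 */
        l.weights[1 * l.n_in + 2] = -1.0;  /* input 2 -> output 1 */
        double x[3] = {0.5, 0.2, 0.8}, y[2];
        layer_forward(&l, x, y);
        printf("y = %.2f %.2f\n", y[0], y[1]);
        free(l.weights);
        free(l.bias);
        return 0;
    }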

Collaboration


Dive into Gregory L. Tarr's collaboration.

Top Co-Authors

Steven K. Rogers
Air Force Research Laboratory

Matthew Kabrisky
Air Force Institute of Technology

Dennis W. Ruck
Air Force Institute of Technology

Kevin L. Priddy
Air Force Institute of Technology

Mark E. Oxley
Air Force Institute of Technology

Randy P. Broussard
United States Naval Academy