Publication


Featured research published by David A. Landgrebe.


IEEE Transactions on Systems, Man, and Cybernetics | 1991

A survey of decision tree classifier methodology

S.R. Safavian; David A. Landgrebe

A survey is presented of current methods for decision tree classifier (DTC) design and the various issues that arise. After considering the potential advantages of DTCs over single-stage classifiers, the subjects of tree structure design, feature selection at each internal node, and decision and search strategies are discussed. The relation between decision trees and neural networks (NN) is also discussed.
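
The survey covers classifier design in general rather than one specific algorithm, but a minimal concrete instance may help. The sketch below trains a generic axis-parallel decision tree on synthetic two-class data with scikit-learn; the data, band count, and depth limit are hypothetical choices, not anything from the paper.

```python
# Minimal decision tree classifier sketch (generic, not a specific design
# from the survey). Requires numpy and scikit-learn; data is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Two hypothetical spectral classes, 4 "bands" each.
X = np.vstack([rng.normal(0.0, 1.0, (200, 4)),
               rng.normal(1.5, 1.0, (200, 4))])
y = np.array([0] * 200 + [1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
# max_depth constrains the tree structure, one of the key design
# choices the survey discusses.
tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print("test accuracy:", tree.score(X_te, y_te))
```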


IEEE Signal Processing Magazine | 2002

Hyperspectral image data analysis

David A. Landgrebe

The fundamental basis for space-based remote sensing is that information is potentially available from the electromagnetic energy field arising from the Earth's surface and, in particular, from the spatial, spectral, and temporal variations in that field. Rather than focusing on the spatial variations, which imagery perhaps best conveys, why not move on to look at how the spectral variations might be used? The idea was to enlarge the size of a pixel until it includes an area that is characteristic, from a spectral response standpoint, of the surface cover to be discriminated. The article includes an example of an image space representation, using three bands to simulate a color IR photograph of an airborne hyperspectral data set over the Washington, DC, Mall.
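
As a rough illustration of the image-space representation mentioned above, the sketch below composites three bands of a (here synthetic) hyperspectral cube into a color-IR-style RGB image. The band indices are hypothetical placeholders; real choices depend on the sensor's band centers.

```python
# Sketch: render three bands of a hyperspectral cube as a color-IR-style
# composite. The cube and band indices here are hypothetical placeholders.
import numpy as np

rows, cols, bands = 128, 128, 220          # e.g., an AVIRIS-like cube
cube = np.random.rand(rows, cols, bands)   # stand-in for real data

# Hypothetical NIR/red/green band indices; real choices depend on the sensor.
nir, red, green = 90, 60, 40

def stretch(band):
    """Linearly stretch one band to [0, 1] for display."""
    lo, hi = band.min(), band.max()
    return (band - lo) / (hi - lo + 1e-12)

# Color-IR convention: NIR -> red channel, red -> green, green -> blue.
rgb = np.dstack([stretch(cube[:, :, b]) for b in (nir, red, green)])
# rgb can now be displayed, e.g. with matplotlib: plt.imshow(rgb)
```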


International Geoscience and Remote Sensing Symposium | 1994

The effect of unlabeled samples in reducing the small sample size problem and mitigating the Hughes phenomenon

Behzad M. Shahshahani; David A. Landgrebe

In this paper, we study the use of unlabeled samples in reducing the problem of small training sample size that can severely affect the recognition rate of classifiers when the dimensionality of the multispectral data is high. We show that by using additional unlabeled samples that are available at no extra cost, the performance may be improved, and therefore the Hughes phenomenon can be mitigated. Furthermore, by experiments, we show that by using additional unlabeled samples more representative estimates can be obtained. We also propose a semiparametric method for incorporating the training (i.e., labeled) and unlabeled samples simultaneously into the parameter estimation process.
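
A compact sketch of the general idea, though not the paper's exact semiparametric estimator: labeled samples enter the Gaussian parameter estimates at full weight, while unlabeled samples enter EM-style at their posterior class probabilities. All data below is synthetic.

```python
# Sketch of semi-supervised Gaussian estimation in the spirit of the paper:
# labeled samples at full weight, unlabeled samples at their posterior
# class probabilities (EM-style). Synthetic data; not the exact estimator.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
mu_true = [np.zeros(2), np.array([3.0, 3.0])]
Xl = [rng.multivariate_normal(m, np.eye(2), 10) for m in mu_true]   # labeled
Xu = np.vstack([rng.multivariate_normal(m, np.eye(2), 500) for m in mu_true])

# Initialize from the (small) labeled sets alone.
mus = [x.mean(axis=0) for x in Xl]
covs = [np.cov(x.T) + 1e-6 * np.eye(2) for x in Xl]
priors = np.array([0.5, 0.5])

for _ in range(20):
    # E-step: posterior class probabilities of the unlabeled samples.
    dens = np.column_stack([p * multivariate_normal.pdf(Xu, m, c)
                            for p, m, c in zip(priors, mus, covs)])
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: weighted statistics, labeled samples at weight 1.
    for k in range(2):
        w = np.concatenate([np.ones(len(Xl[k])), resp[:, k]])
        X = np.vstack([Xl[k], Xu])
        mus[k] = (w[:, None] * X).sum(axis=0) / w.sum()
        D = X - mus[k]
        covs[k] = (w[:, None] * D).T @ D / w.sum() + 1e-6 * np.eye(2)
    priors = np.array([len(Xl[k]) + resp[:, k].sum() for k in range(2)])
    priors = priors / priors.sum()

print("estimated means:", mus)
```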


IEEE Transactions on Systems, Man, and Cybernetics | 1998

Supervised classification in high-dimensional space: geometrical, statistical, and asymptotical properties of multivariate data

Luis O. Jimenez; David A. Landgrebe

The recent development of more sophisticated remote-sensing systems enables the measurement of radiation in many more spectral intervals than was previously possible. An example of this technology is the AVIRIS system, which collects image data in 220 bands. The increased dimensionality of such hyperspectral data greatly enhances the data's information content, but provides a challenge to the current techniques for analyzing such data. Human experience in 3D space tends to mislead our intuition of geometrical and statistical properties in high-dimensional space, properties which must guide our choices in the data analysis process. Using Euclidean and Cartesian geometry, high-dimensional space properties are investigated in this paper, and their implication for high-dimensional data and its analysis is studied in order to illuminate the differences between conventional spaces and hyperdimensional space.
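
Two quick numerical illustrations of the point, assuming nothing beyond standard geometry: the unit ball occupies a vanishing fraction of its bounding hypercube as the dimension grows, and standard Gaussian data concentrates in a thin shell of radius about sqrt(d) rather than near the mean.

```python
# Two numerical illustrations of why low-dimensional intuition fails in
# high-dimensional spaces, in the spirit of the paper's discussion.
import math
import numpy as np

# 1) The unit ball occupies a vanishing fraction of its bounding hypercube:
#    V_ball(d) = pi^(d/2) / Gamma(d/2 + 1),  V_cube(d) = 2^d.
for d in (2, 10, 50, 220):
    ratio = math.pi ** (d / 2) / math.gamma(d / 2 + 1) / 2.0 ** d
    print(f"d={d:4d}  ball/cube volume ratio = {ratio:.3e}")

# 2) Standard Gaussian data concentrates in a thin shell of radius ~sqrt(d),
#    not near the mean as 3D intuition suggests.
rng = np.random.default_rng(2)
for d in (2, 10, 50, 220):
    r = np.linalg.norm(rng.standard_normal((10000, d)), axis=1)
    print(f"d={d:4d}  mean |x| = {r.mean():6.2f}  (sqrt(d) = {math.sqrt(d):6.2f})")
```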


IEEE Transactions on Pattern Analysis and Machine Intelligence | 1993

Feature extraction based on decision boundaries

Chulhee Lee; David A. Landgrebe

A novel approach to feature extraction for classification based directly on the decision boundaries is proposed. It is shown how discriminantly redundant features and discriminantly informative features are related to decision boundaries. A procedure to extract discriminantly informative features based on a decision boundary is proposed. The proposed feature extraction algorithm has several desirable properties: (1) it predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; and (2) it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal class means or equal class covariances as some previous algorithms do. Experiments show that the performance of the proposed algorithm compares favorably with those of previous algorithms.
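
A simplified two-class Gaussian sketch of the decision-boundary idea, not the paper's full algorithm: locate points on the Bayes decision boundary between opposite-class sample pairs, collect the boundary normals there, and eigendecompose their average outer product; the dominant eigenvectors correspond to discriminantly informative directions.

```python
# Simplified sketch of decision-boundary feature extraction for two Gaussian
# classes: find boundary points between opposite-class sample pairs, collect
# the boundary normals, and eigendecompose their average outer product.
# Synthetic data; not the paper's full algorithm.
import numpy as np

rng = np.random.default_rng(3)
d = 4
mu = [np.zeros(d), np.r_[2.0, 0.0, 0.0, 0.0]]   # classes differ in dim 0 only
cov = [np.eye(d), np.eye(d)]
X = [rng.multivariate_normal(m, c, 100) for m, c in zip(mu, cov)]
icov = [np.linalg.inv(c) for c in cov]

def h(x):
    """Log-likelihood ratio; h(x) = 0 on the Bayes decision boundary."""
    q = [-0.5 * (x - m) @ ic @ (x - m) for m, ic in zip(mu, icov)]
    return q[0] - q[1]

def grad_h(x):
    """Gradient of h; normal to the decision boundary at boundary points."""
    return -icov[0] @ (x - mu[0]) + icov[1] @ (x - mu[1])

N = np.zeros((d, d))
count = 0
for a in X[0]:
    for b in X[1][:10]:
        if h(a) * h(b) >= 0:            # need opposite sides of the boundary
            continue
        lo, hi = a, b                   # bisect the segment to find h = 0
        for _ in range(40):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if h(lo) * h(mid) > 0 else (lo, mid)
        n = grad_h(0.5 * (lo + hi))
        n /= np.linalg.norm(n)
        N += np.outer(n, n)
        count += 1

vals, vecs = np.linalg.eigh(N / count)
print("eigenvalues:", np.round(vals, 3))          # one dominant eigenvalue
print("top direction:", np.round(vecs[:, -1], 3)) # ~ the discriminant axis
```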


IEEE Transactions on Geoscience and Remote Sensing | 2004

Nonparametric weighted feature extraction for classification

Bor-Chen Kuo; David A. Landgrebe

In this paper, a new nonparametric feature extraction method is proposed for high-dimensional multiclass pattern recognition problems. It is based on a nonparametric extension of scatter matrices. There are at least two advantages to using the proposed nonparametric scatter matrices. First, they are generally of full rank. This provides the ability to specify the number of extracted features desired and to reduce the effect of the singularity problem. This is in contrast to parametric discriminant analysis, which usually only can extract L-1 (number of classes minus one) features. In a real situation, this may not be enough. Second, the nonparametric nature of scatter matrices reduces the effects of outliers and works well even for nonnormal datasets. The new method provides greater weight to samples near the expected decision boundary. This tends to provide for increased classification accuracy.
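
The following is a simplified reading of the nonparametric scatter-matrix construction, on synthetic two-class data; the exact weighting in the paper differs in detail. Each sample is paired with a distance-weighted local mean, samples near the expected boundary receive larger weight, and features come from the eigenvectors of Sw^-1 Sb.

```python
# Simplified sketch in the spirit of nonparametric weighted feature
# extraction: distance-weighted local means, boundary-emphasizing weights,
# and eig(Sw^-1 Sb). Synthetic data; not a reference implementation.
import numpy as np

rng = np.random.default_rng(4)
d = 5
X = [rng.multivariate_normal(np.zeros(d), np.eye(d), 80),
     rng.multivariate_normal(np.r_[2.0, np.zeros(d - 1)], np.eye(d), 80)]

def weighted_means(A, B, same=False):
    """Distance-weighted mean of B for every sample of A."""
    dist = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    if same:
        np.fill_diagonal(dist, np.inf)  # exclude each sample from its own mean
    w = 1.0 / (dist + 1e-12)
    w /= w.sum(axis=1, keepdims=True)
    return w @ B

def scatter(A, B, same=False):
    """Nonparametric scatter of A around its weighted means in B; samples
    close to their weighted mean (near the boundary) get larger weight."""
    M = weighted_means(A, B, same)
    lam = 1.0 / (np.linalg.norm(A - M, axis=1) + 1e-12)
    lam /= lam.sum()
    D = A - M
    return (lam[:, None] * D).T @ D

Sb = scatter(X[0], X[1]) + scatter(X[1], X[0])               # between-class
Sw = scatter(X[0], X[0], True) + scatter(X[1], X[1], True)   # within-class
Sw += 1e-6 * np.eye(d)                                       # keep full rank

vals, vecs = np.linalg.eig(np.linalg.inv(Sw) @ Sb)
order = np.argsort(vals.real)[::-1]
print("leading feature direction:", np.round(vecs[:, order[0]].real, 3))
```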


IEEE Transactions on Pattern Analysis and Machine Intelligence | 1996

Covariance matrix estimation and classification with limited training data

Joseph P. Hoffbeck; David A. Landgrebe

A new covariance matrix estimator useful for designing classifiers with limited training data is developed. In experiments, this estimator achieved higher classification accuracy than the sample covariance matrix and common covariance matrix estimates. In about half of the experiments, it achieved higher accuracy than regularized discriminant analysis, but required much less computation.
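
A sketch of the selection idea, using a direct (slow) leave-one-out loop rather than the paper's estimator family and fast update formulas: mix the class sample covariance with a common covariance and keep the mixing weight that maximizes the average leave-one-out log-likelihood.

```python
# Sketch of leave-one-out covariance selection: mix the class sample
# covariance with a common covariance and keep the mixing weight maximizing
# the average leave-one-out log-likelihood. Direct (slow) LOO loop; the
# paper's estimator family is richer than this. Synthetic data.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(5)
d, n = 6, 15                            # more dims than is comfortable for n
X = rng.multivariate_normal(np.zeros(d), np.diag(np.arange(1, d + 1.0)), n)
common = np.eye(d)                      # stand-in for a pooled covariance

def loo_loglik(alpha):
    """Average leave-one-out log-likelihood under the mixed covariance."""
    total = 0.0
    for i in range(n):
        rest = np.delete(X, i, axis=0)
        mu = rest.mean(axis=0)
        S = np.cov(rest.T, bias=True)
        C = (1 - alpha) * S + alpha * common
        total += multivariate_normal.logpdf(X[i], mu, C)
    return total / n

alphas = np.linspace(0.0, 1.0, 11)
scores = [loo_loglik(a) for a in alphas]
print("selected mixing weight alpha =", alphas[int(np.argmax(scores))])
```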


IEEE Transactions on Geoscience and Remote Sensing | 1999

Hyperspectral data analysis and supervised feature reduction via projection pursuit

Luis O. Jimenez; David A. Landgrebe

As the number of spectral bands of high-spectral resolution data increases, the ability to detect more detailed classes should also increase, and the classification accuracy should increase as well. Often the number of labelled samples used for supervised classification techniques is limited, thus limiting the precision with which class characteristics can be estimated. As the number of spectral bands becomes large, the limitation on performance imposed by the limited number of training samples can become severe. A number of techniques for case-specific feature extraction have been developed to reduce dimensionality without loss of class separability. Most of these techniques require the estimation of statistics at full dimensionality in order to extract relevant features for classification. If the number of training samples is not adequately large, the estimation of parameters in high-dimensional data will not be accurate enough. As a result, the estimated features may not be as effective as they could be. This suggests the need for reducing the dimensionality via a preprocessing method that takes into consideration high-dimensional feature-space properties. Such reduction should enable the estimation of feature-extraction parameters to be more accurate. Using a technique referred to as projection pursuit (PP), such an algorithm has been developed. This technique is able to bypass many of the problems of the limitation of small numbers of training samples by making the computations in a lower-dimensional space, and optimizing a function called the projection index. A current limitation of this method is that, as the number of dimensions increases, it is likely that a local maximum of the projection index will be found that does not enable one to fully exploit hyperspectral-data capabilities.
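
A toy sketch of one projection-pursuit step on synthetic data: numerically optimize a single projection vector to maximize a projection index, here the Bhattacharyya distance between the two projected classes. The paper's algorithm works with grouped bands and a richer optimization; none of that is attempted here.

```python
# Sketch of a projection-pursuit step: optimize a projection vector to
# maximize a projection index (1D two-class Bhattacharyya distance).
# Synthetic data; not the paper's full grouped-band algorithm.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
d = 10
X = [rng.multivariate_normal(np.zeros(d), np.eye(d), 200),
     rng.multivariate_normal(np.r_[1.5, np.zeros(d - 1)], np.eye(d), 200)]

def neg_index(a):
    """Negative 1D Bhattacharyya distance between the projected classes."""
    a = a / (np.linalg.norm(a) + 1e-12)
    p = [x @ a for x in X]
    m = [q.mean() for q in p]
    v = [q.var() for q in p]
    vbar = 0.5 * (v[0] + v[1])
    b = (m[0] - m[1]) ** 2 / (8 * vbar) \
        + 0.5 * np.log(vbar / np.sqrt(v[0] * v[1]))
    return -b

res = minimize(neg_index, rng.standard_normal(d), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-10})
a_opt = res.x / np.linalg.norm(res.x)
print("best projection (should align with axis 0):", np.round(a_opt, 2))
```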


IEEE Transactions on Geoscience and Remote Sensing | 1999

Covariance estimation with limited training samples

Saldju Tadjudin; David A. Landgrebe

This paper describes a covariance estimator formulated under an empirical Bayesian setting to mitigate the problem of limited training samples in the Gaussian maximum likelihood (ML) classification for remote sensing. The most suitable covariance mixture is selected by maximizing the average leave-one-out log likelihood. Experimental results using AVIRIS data are presented.


IEEE Transactions on Geoscience and Remote Sensing | 2001

An adaptive classifier design for high-dimensional data analysis with a limited training data set

Qiong Jackson; David A. Landgrebe

This paper proposes a self-learning and self-improving adaptive classifier to mitigate the problem of small training sample size, which can severely affect the recognition accuracy of classifiers when the dimensionality of the multispectral data is high. The proposed classifier iteratively utilizes classified samples (referred to as semilabeled samples) in addition to the original training samples. To control the influence of the semilabeled samples, the method gives full weight to the training samples and reduced weight to the semilabeled samples. The authors show that by using additional semilabeled samples, which are available without extra cost, class label information may be extracted and utilized to enhance statistics estimation, improving classifier performance and thereby mitigating the Hughes phenomenon (peak phenomenon). Experimental results show that the proposed adaptive classifier can significantly improve both classification accuracy and the quality of the estimated statistics.
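
A sketch of the self-training loop described above, with synthetic data and a fixed reduced weight for semilabeled samples; the paper's weighting scheme is more refined. Statistics are fitted on the labeled set, the unlabeled pool is classified, and the statistics are re-estimated with weight 1 for labeled samples and a smaller weight for semilabeled ones.

```python
# Sketch of the adaptive self-training loop: fit Gaussian ML statistics on
# labeled data, classify the unlabeled pool into semilabeled samples, then
# re-estimate with full weight for labeled and reduced weight for
# semilabeled samples. Synthetic data; the paper's weighting is richer.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(7)
d = 4
mu_true = [np.zeros(d), np.full(d, 1.5)]
Xl = [rng.multivariate_normal(m, np.eye(d), 8) for m in mu_true]   # tiny
Xu = np.vstack([rng.multivariate_normal(m, np.eye(d), 400) for m in mu_true])
semi_w = 0.3                                    # reduced semilabel weight

mus = [x.mean(axis=0) for x in Xl]
covs = [np.cov(x.T) + 1e-3 * np.eye(d) for x in Xl]

for _ in range(10):
    # Classify the unlabeled pool with the current statistics (ML rule).
    ll = np.column_stack([multivariate_normal.logpdf(Xu, m, c)
                          for m, c in zip(mus, covs)])
    labels = ll.argmax(axis=1)
    # Re-estimate with weight 1 for labeled, semi_w for semilabeled samples.
    for k in range(2):
        S = Xu[labels == k]
        X = np.vstack([Xl[k], S])
        w = np.concatenate([np.ones(len(Xl[k])), np.full(len(S), semi_w)])
        mus[k] = (w[:, None] * X).sum(axis=0) / w.sum()
        D = X - mus[k]
        covs[k] = (w[:, None] * D).T @ D / w.sum() + 1e-3 * np.eye(d)

print("refined means:", [np.round(m, 2) for m in mus])
```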

Collaboration


Dive into David A. Landgrebe's collaborations.

Top Co-Authors

John P. Kerekes

Rochester Institute of Technology


John W. Lund

Oregon Institute of Technology


Leon Trilling

Massachusetts Institute of Technology
