Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jinn-Min Yang is active.

Publication


Featured research published by Jinn-Min Yang.


IEEE Transactions on Geoscience and Remote Sensing | 2009

Kernel Nonparametric Weighted Feature Extraction for Hyperspectral Image Classification

Bor-Chen Kuo; Cheng-Hsuan Li; Jinn-Min Yang

In recent years, many studies have shown that kernel methods are computationally efficient, robust, and stable for pattern analysis. Many kernel-based classifiers have been designed and applied to classify remotely sensed data, and some results show that kernel-based classifiers achieve satisfactory performance. Many studies on hyperspectral image classification also show that nonparametric weighted feature extraction (NWFE) is a powerful tool for extracting hyperspectral image features. However, NWFE is still based on a linear transformation. In this paper, the kernel method is applied to extend NWFE to kernel-based NWFE (KNWFE). The new KNWFE possesses the advantages of both linear and nonlinear transformations, and the experimental results show that KNWFE outperforms NWFE, decision-boundary feature extraction, independent component analysis, kernel-based principal component analysis, and generalized discriminant analysis.
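To make the linear-versus-kernel contrast above concrete, here is a minimal sketch that compares a linear feature extractor (LDA) with a kernelized one (KernelPCA) from scikit-learn on synthetic data. It is a stand-in for the NWFE/KNWFE pair, not an implementation of the paper's weighting scheme; the dataset, component counts, and RBF parameter are illustrative assumptions.

```python
# Minimal sketch (not the paper's KNWFE): contrast a linear feature
# extractor with a kernelized one before a simple k-NN classifier.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, n_features=50, n_informative=10,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Linear extraction (stand-in for NWFE's linear transformation).
lda = LinearDiscriminantAnalysis(n_components=2).fit(X_tr, y_tr)
acc_lin = KNeighborsClassifier().fit(lda.transform(X_tr), y_tr).score(
    lda.transform(X_te), y_te)

# Kernelized extraction (stand-in for the kernel-based extension).
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.02).fit(X_tr)
acc_ker = KNeighborsClassifier().fit(kpca.transform(X_tr), y_tr).score(
    kpca.transform(X_te), y_te)

print(f"linear features: {acc_lin:.3f}, kernel features: {acc_ker:.3f}")
```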


IEEE Transactions on Geoscience and Remote Sensing | 2010

A Nonparametric Feature Extraction and Its Application to Nearest Neighbor Classification for Hyperspectral Image Data

Jinn-Min Yang; Pao-Ta Yu; Bor-Chen Kuo

Feature extraction plays an essential role in hyperspectral image classification. Nonparametric feature extraction algorithms have more advantages than parametric ones: they are well suited to nonnormally distributed data and can extract more features than classic linear discriminant analysis. In this paper, a novel nonparametric feature extraction method, namely cosine-based nonparametric feature extraction (CNFE), is proposed, in which the weight function embedded in the within-class and between-class scatter matrices is based on cosine distance. Moreover, a powerful k-nearest-neighbor (KNN) classification algorithm based on the distance metric formed by CNFE features is also developed, called the CNFE-based KNN (CKNN) classifier. The effectiveness of the proposed CNFE and CKNN is evaluated on two real hyperspectral data sets. The experimental results demonstrate that both CNFE and CKNN achieve remarkable performance across various training sample sizes, including small-sample-size cases.
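The following is a loose sketch of the idea of cosine-weighted scatter matrices followed by a KNN classifier on the extracted features. It is not the paper's CNFE/CKNN formulas: the choice of weighting each sample by its cosine distance to the other classes' means is an assumption made here purely for illustration.

```python
# Hedged sketch: cosine-distance-based sample weights inside simplified
# within-/between-class scatter matrices, then KNN on projected features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def cosine_dist(a, b):
    return 1.0 - a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def cnfe_like_transform(X, y, n_components=5):
    classes = np.unique(y)
    means = {c: X[y == c].mean(axis=0) for c in classes}
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        others = [k for k in classes if k != c]
        for x in X[y == c]:
            # Weight grows as the sample gets closer (in cosine terms)
            # to another class, i.e. near the decision boundary.
            w = 1.0 / (min(cosine_dist(x, means[k]) for k in others) + 1e-3)
            dw = (x - means[c])[:, None]
            Sw += w * dw @ dw.T
            for k in others:
                db = (x - means[k])[:, None]
                Sb += w * db @ db.T
    # Generalized eigenvectors of pinv(Sw) @ Sb give the projection.
    eigval, eigvec = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigval.real)[::-1][:n_components]
    return eigvec[:, order].real

# Usage (stand-in for the CKNN step): project the data, then run KNN.
# A = cnfe_like_transform(X_train, y_train)
# clf = KNeighborsClassifier(n_neighbors=5).fit(X_train @ A, y_train)
# y_pred = clf.predict(X_test @ A)
```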


IEEE Transactions on Geoscience and Remote Sensing | 2010

A Dynamic Subspace Method for Hyperspectral Image Classification

Jinn-Min Yang; Bor-Chen Kuo; Pao-Ta Yu; Chun-Hsiang Chuang

Many studies have demonstrated that multiple classifier systems, such as the random subspace method (RSM), obtain more outstanding and robust results than a single classifier on a wide range of pattern recognition problems. In this paper, we propose a novel subspace selection mechanism, named the dynamic subspace method (DSM), to improve RSM by automatically determining the dimensionality of, and selecting the component dimensions for, diverse subspaces. Two importance distributions are imposed on the process of constructing the ensemble classifiers: one is the distribution of subspace dimensionality, and the other is the distribution of band weights. Based on these two distributions, DSM becomes an automatic, dynamic, and adaptive ensemble. Experimental results on real data show that the proposed DSM achieves better performance than RSM and produces classification maps with remarkably fewer speckles.
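As a rough illustration of the two distributions described above, the sketch below builds a random-subspace-style ensemble in which the subspace size is drawn from a dimensionality distribution and the bands are drawn with importance weights. The uniform size range, the variance-of-class-means band weighting, and the KNN base classifier are illustrative assumptions, not the paper's actual construction.

```python
# Hedged sketch of a dynamic-subspace-style ensemble (integer class
# labels 0..K-1 assumed, and a reasonably large number of bands).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def dsm_like_ensemble(X_tr, y_tr, X_te, n_estimators=25, seed=0):
    rng = np.random.default_rng(seed)
    n_bands = X_tr.shape[1]
    # Distribution of subspace dimensionality (here: uniform over a range).
    dims = rng.integers(low=max(2, n_bands // 10), high=n_bands // 2,
                        size=n_estimators)
    # Distribution of band weights (here: a separability proxy given by
    # the variance of the class means per band, normalized to sum to one).
    classes = np.unique(y_tr)
    class_means = np.array([X_tr[y_tr == c].mean(axis=0) for c in classes])
    band_w = class_means.var(axis=0) + 1e-12
    band_w /= band_w.sum()

    votes = []
    for m in dims:
        bands = rng.choice(n_bands, size=m, replace=False, p=band_w)
        clf = KNeighborsClassifier(n_neighbors=3).fit(X_tr[:, bands], y_tr)
        votes.append(clf.predict(X_te[:, bands]))
    votes = np.array(votes)  # shape: (n_estimators, n_test_samples)
    # Majority vote: most frequent predicted label per test sample.
    return np.array([np.bincount(col).argmax() for col in votes.T])
```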


International Geoscience and Remote Sensing Symposium | 2008

Kernel-Based KNN and Gaussian Classifiers for Hyperspectral Image Classification

Bor-Chen Kuo; Jinn-Min Yang; Tian-Wei Sheu; Szu-Wei Yang

In this study, two kernel-based classifiers are applied to hyperspectral image classification. One is the kernel Gaussian classifier, and the other is the kernel k-nearest-neighbor classifier. For classification in feature space, the data are mapped from the input space into a higher-dimensional feature space by a nonlinear transformation, and the k-nearest-neighbor and Gaussian classifiers are then applied to the mapped images in that space. Fortunately, instead of performing the expensive transformation of the samples explicitly, the classification can be carried out via inner products in the feature space, which the kernel function computes efficiently; this is the so-called kernel trick. The effectiveness of the proposed classifiers is evaluated on real data sets, and other classifiers are included for comparison. The experimental results show that the kernel Gaussian classifier outperforms the others.
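A minimal sketch of the kernel trick for a kernel k-NN classifier follows: squared distances in the implicit feature space are computed as k(x, x) - 2 k(x, z) + k(z, z), so the nonlinear mapping is never formed explicitly. The RBF kernel, gamma value, and k = 5 are illustrative choices, not taken from the paper.

```python
# Hedged sketch of a kernel k-NN classifier using the kernel trick.
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # k(a, b) = exp(-gamma * ||a - b||^2), computed for all pairs.
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kernel_knn_predict(X_tr, y_tr, X_te, k=5, gamma=0.1):
    # Squared feature-space distance between every test and training
    # sample: k(x,x) - 2 k(x,z) + k(z,z). (For the RBF kernel the
    # diagonal terms are all 1; the general formula is kept for clarity.)
    d2 = (np.diag(rbf_kernel(X_te, X_te, gamma))[:, None]
          - 2 * rbf_kernel(X_te, X_tr, gamma)
          + np.diag(rbf_kernel(X_tr, X_tr, gamma))[None, :])
    neighbors = np.argsort(d2, axis=1)[:, :k]
    # Majority label among the k nearest neighbors (integer labels assumed).
    return np.array([np.bincount(y_tr[row]).argmax() for row in neighbors])
```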


International Symposium on Computer, Consumer and Control | 2012

Mapping Information Flow of Independent Source to Predict Conscious Level: A Granger Causality Based Brain-Computer Interface

Chun-Hsiang Chuang; Chih-Sheng Huang; Chin-Teng Lin; Li-Wei Ko; Jyh-Yeong Chang; Jinn-Min Yang

Recent studies have shown that brain networks vary across different cognitive states. In contrast to measuring a physiological change over a single region, the information flow between brain regions, described by effective connectivity, provides informative dynamics over the whole brain. In this study, we propose a source information flow network based on the combination of Granger causality and support vector regression to predict drivers' conscious level. This work provides the first application of a brain network to developing a brain-computer interface and obtains sound performance.
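A minimal sketch of such a pipeline is given below: pairwise Granger causality among source signals is stacked into a connectivity feature vector, and a support vector regression maps it to a conscious-level score. The channel layout, lag order, F-statistic feature choice, and SVR settings are assumptions for illustration, not the paper's configuration.

```python
# Hedged sketch: Granger-causality connectivity features + SVR.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests
from sklearn.svm import SVR

def granger_features(sources, maxlag=2):
    """sources: array of shape (n_time_points, n_channels) for one epoch."""
    n_ch = sources.shape[1]
    feats = []
    for i in range(n_ch):
        for j in range(n_ch):
            if i == j:
                continue
            # Test whether channel j Granger-causes channel i; keep the
            # F statistic at the chosen lag as the connectivity strength.
            res = grangercausalitytests(sources[:, [i, j]], maxlag,
                                        verbose=False)
            feats.append(res[maxlag][0]['ssr_ftest'][0])
    return np.array(feats)

# Usage: one feature vector per EEG epoch, one conscious-level label per
# epoch (e.g. a reaction-time-derived score), then standard SVR.
# X = np.vstack([granger_features(epoch) for epoch in epochs])
# model = SVR(kernel='rbf', C=1.0).fit(X, conscious_level)
# predicted_level = model.predict(X_new)
```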


International Geoscience and Remote Sensing Symposium | 2007

A novel non-parametric weighted feature extraction method for classification of hyperspectral image with limited training samples

Jinn-Min Yang; Bor-Chen Kuo; Hsiao-Yun Huang; Pao-Ta Yu

In this paper, a novel nonparametric weighted linear feature extraction method is developed for classifying hyperspectral image data with limited training samples. Within this framework, two important vectors are found for each training sample, and the magnitude of the projection of the two vectors is calculated to weight the sample when constructing the within-class and between-class scatter matrices. The effectiveness of the proposed feature extraction scheme, compared to two other nonparametric feature extraction methods, nonparametric weighted feature extraction (NWFE) and nonparametric discriminant analysis (NDA), is demonstrated using Washington DC Mall data. The experimental results show that the proposed method is remarkably powerful and robust.


International Geoscience and Remote Sensing Symposium | 2006

A Novel Fuzzy Linear Feature Extraction for Hyperspectral Image Classification

Jinn-Min Yang; Bor-Chen Kuo; Pao-Ta Yu; T.-Y. Hsieh

In this paper, a novel fuzzy linear discriminant analysis feature extraction method is proposed for classifying hyperspectral image data. The fuzzification mechanism proposed by Keller et al. is used to determine whether samples lie close to the class boundary, and different weights are assigned to the samples when estimating the within-class and between-class scatter matrices. The effectiveness of the proposed feature extraction scheme, compared to nonparametric weighted feature extraction (NWFE) and linear discriminant analysis (LDA), is demonstrated using Washington DC Mall data, and it achieves satisfactory performance.
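As a rough illustration of the fuzzification step mentioned above, the sketch below assigns Keller-style fuzzy memberships from each training sample's nearest neighbors (the 0.51/0.49 split of Keller et al.'s fuzzy k-NN) and then uses them as sample weights in simplified scatter matrices. How the paper actually folds the memberships into its scatter matrices is not reproduced here; this weighting is an assumption.

```python
# Hedged sketch: Keller-style fuzzy memberships as scatter-matrix weights.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def keller_memberships(X, y, k=7):
    classes = np.unique(y)
    # k+1 neighbors because each point is its own nearest neighbor.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    idx = nn.kneighbors(X, return_distance=False)[:, 1:]
    U = np.zeros((len(X), len(classes)))
    for i, neigh in enumerate(idx):
        frac = np.array([(y[neigh] == c).mean() for c in classes])
        U[i] = 0.49 * frac
        U[i, np.searchsorted(classes, y[i])] += 0.51
    return U  # row i: fuzzy membership of sample i in each class

def fuzzy_weighted_scatter(X, y, U):
    # Samples near the class boundary (ambiguous memberships) contribute
    # with different weight than samples deep inside their own class.
    classes = np.unique(y)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    mean_all = X.mean(axis=0)
    for ci, c in enumerate(classes):
        mc = X[y == c].mean(axis=0)
        for x, u in zip(X[y == c], U[y == c, ci]):
            dw = (x - mc)[:, None]
            Sw += u * dw @ dw.T
        db = (mc - mean_all)[:, None]
        Sb += (y == c).sum() * db @ db.T
    return Sw, Sb
```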


Archive | 2010

Nonparametric Fuzzy Feature Extraction for Hyperspectral Image Classification

Jinn-Min Yang; Pao-Ta Yu; Bor-Chen Kuo; Ming-Hsiang Su


資訊科技國際期刊 (International Journal of Information Technology), 4(2) | 2010

A Novel K-Nearest Neighbor Classifier Based on Adaptive Metric Formed by Features Extracted by Nonparametric Feature Extraction Model

Jinn-Min Yang; Ming-Hsiang Su; Pao-Ta Yu


Information Systems | 2012

Mapping Information Flow of Independent Source to Predict Conscious Level: A Granger Causality Based Brain-Computer Interface

Chun-Hsiang Chuang; Chih-Sheng Huang; Chin-Teng Lin; Li-Wei Ko; Jyh-Yeong Chang; Jinn-Min Yang

Collaboration


Dive into Jinn-Min Yang's collaborations.

Top Co-Authors

Bor-Chen Kuo, National Taichung University of Education
Pao-Ta Yu, National Chung Cheng University
Chun-Hsiang Chuang, National Chiao Tung University
Chih-Sheng Huang, National Chiao Tung University
Jyh-Yeong Chang, National Chiao Tung University
Li-Wei Ko, National Chiao Tung University
Hsiao-Yun Huang, Fu Jen Catholic University
Szu-Wei Yang, National Taichung University of Education