Jinn-Min Yang
National Chung Cheng University
Publications
Featured research published by Jinn-Min Yang.
IEEE Transactions on Geoscience and Remote Sensing | 2009
Bor-Chen Kuo; Cheng‐Hsuan Li; Jinn-Min Yang
In recent years, many studies have shown that kernel methods are computationally efficient, robust, and stable for pattern analysis. Many kernel-based classifiers have been designed and applied to classify remotely sensed data, and the results show that kernel-based classifiers perform satisfactorily. Many studies on hyperspectral image classification have also shown that nonparametric weighted feature extraction (NWFE) is a powerful tool for extracting hyperspectral image features. However, NWFE is still based on a linear transformation. In this paper, the kernel method is applied to extend NWFE to kernel-based NWFE (KNWFE). The new KNWFE possesses the advantages of both linear and nonlinear transformations, and the experimental results show that KNWFE outperforms NWFE, decision-boundary feature extraction, independent component analysis, kernel-based principal component analysis, and generalized discriminant analysis.
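The kernelization step can be illustrated with a small sketch. NWFE's local means and scatter weights depend only on pairwise distances, so replacing Euclidean distances with kernel-induced feature-space distances is the essential move behind a kernel-based extension. This is a minimal illustration under assumed choices (an RBF kernel, the bandwidth gamma, and random toy data), not the published KNWFE algorithm.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2)."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_induced_distance(x, y, gamma=0.5):
    """Distance between phi(x) and phi(y) in feature space, computed
    purely from kernel evaluations (the kernel trick):
    ||phi(x) - phi(y)||^2 = k(x, x) - 2 k(x, y) + k(y, y)."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    k = rbf_kernel(np.vstack([x, y]), np.vstack([x, y]), gamma)
    return np.sqrt(max(k[0, 0] - 2 * k[0, 1] + k[1, 1], 0.0))

# NWFE-style inverse-distance weights, evaluated in feature space:
# closer sample pairs receive larger weights when forming local means
# and the weighted scatter matrices.
rng = np.random.default_rng(0)
Xi = rng.normal(size=(5, 10))   # toy samples of class i
Xj = rng.normal(size=(6, 10))   # toy samples of class j
d = np.array([[kernel_induced_distance(a, b) for b in Xj] for a in Xi])
w = 1.0 / (d + 1e-12)
w /= w.sum(axis=1, keepdims=True)   # each row sums to 1
```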
IEEE Transactions on Geoscience and Remote Sensing | 2010
Jinn-Min Yang; Pao-Ta Yu; Bor-Chen Kuo
Feature extraction plays an essential role in hyperspectral image classification. Nonparametric feature extraction algorithms have advantages over parametric ones: they are well suited to nonnormally distributed data and can extract more features than classic linear discriminant analysis. In this paper, a novel nonparametric feature extraction method, cosine-based nonparametric feature extraction (CNFE), is proposed, in which the weight function embedded in the within-class and between-class scatter matrices is based on cosine distance. Moreover, a K-nearest-neighbor (KNN) classification algorithm that uses the distance metric formed by CNFE features, called the CNFE-based KNN (CKNN) classifier, is also developed. The effectiveness of the proposed CNFE and CKNN is evaluated on two real hyperspectral data sets. The experimental results demonstrate that both CNFE and CKNN perform well across a range of training sample sizes, including small-sample-size cases.
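The role of cosine distance can be sketched as follows: it compares the directions of two spectra rather than their magnitudes, which makes it a natural weight source for scatter matrices and a usable metric for nearest-neighbor voting. The snippet below is a toy illustration on random data; it is not the paper's CNFE weight function or the CKNN metric, and the neighborhood size k is an assumption.

```python
import numpy as np

def cosine_distance(x, y):
    """1 - cos(angle between x and y); small when the spectra point
    in similar directions, regardless of their magnitudes."""
    return 1.0 - np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)

def knn_predict(X_train, y_train, x, k=5):
    """Plain kNN vote using cosine distance, an illustrative stand-in
    for CKNN, which operates on CNFE-extracted features."""
    d = np.array([cosine_distance(x, t) for t in X_train])
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

rng = np.random.default_rng(1)
X_train = rng.normal(size=(40, 20))      # toy "hyperspectral" samples
y_train = rng.integers(0, 3, size=40)    # three toy classes
print(knn_predict(X_train, y_train, rng.normal(size=20)))
```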
IEEE Transactions on Geoscience and Remote Sensing | 2010
Jinn-Min Yang; Bor-Chen Kuo; Pao-Ta Yu; Chun-Hsiang Chuang
Many studies have demonstrated that multiple classifier systems, such as the random subspace method (RSM), obtain more accurate and robust results than a single classifier on a wide range of pattern recognition problems. In this paper, we propose a novel subspace selection mechanism, named the dynamic subspace method (DSM), that improves RSM by automatically determining the dimensionality and selecting the component dimensions of the diverse subspaces. Two importance distributions are imposed on the process of constructing the ensemble classifiers: the distribution of subspace dimensionality and the distribution of band weights. Based on these two distributions, DSM becomes an automatic, dynamic, and adaptive ensemble. Experimental results on real data show that the proposed DSM outperforms RSM and that the resulting classification maps contain noticeably fewer speckles.
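The subspace-drawing idea can be sketched in a few lines: each base classifier receives a band subset whose size is sampled from a distribution over dimensionalities and whose members are sampled according to band weights. Both distributions below are uniform or random placeholders standing in for the importance distributions DSM actually estimates; the number of bands and ensemble size are toy assumptions.

```python
import numpy as np

def draw_subspace(n_bands, dim_probs, band_weights, rng):
    """Sample one subspace: its size from a distribution over
    dimensionalities, then its member bands (without replacement)
    according to a distribution of band weights."""
    dim = rng.choice(np.arange(1, n_bands + 1), p=dim_probs)
    bands = rng.choice(n_bands, size=dim, replace=False,
                       p=band_weights / band_weights.sum())
    return np.sort(bands)

rng = np.random.default_rng(2)
n_bands = 12
dim_probs = np.full(n_bands, 1.0 / n_bands)   # placeholder: uniform over sizes
band_weights = rng.random(n_bands)            # placeholder: e.g., per-band accuracy
ensemble = [draw_subspace(n_bands, dim_probs, band_weights, rng)
            for _ in range(5)]                # one subspace per base classifier
for bands in ensemble:
    print(bands)
```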
international geoscience and remote sensing symposium | 2008
Bor-Chen Kuo; Jinn-Min Yang; Tian-Wei Sheu; Szu-Wei Yang
In this study, two kernel-based classifiers are applied to hyperspectral image classification: the kernel Gaussian classifier and the kernel k-nearest-neighbor classifier. For classification in feature space, the data are mapped from the input space into a higher-dimensional feature space by a nonlinear transformation, and the k-nearest-neighbor and Gaussian classifiers are then applied to the mapped data in that space. Instead of explicitly computing this expensive transformation, the classification can be carried out through inner products in the feature space, which are evaluated efficiently with a kernel function; this is the so-called kernel trick. The effectiveness of the proposed classifiers is evaluated on real data sets, with other classifiers included for comparison. The experimental results show that the kernel Gaussian classifier outperforms the others.
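A minimal sketch of the kernel k-nearest-neighbor idea is given below: squared feature-space distances are assembled from Gram-matrix entries only, and a standard kNN vote follows. The RBF kernel, its bandwidth, the neighborhood size, and the toy data are assumptions, and the kernel Gaussian classifier is not shown.

```python
import numpy as np

def rbf_gram(X, Y, gamma=0.5):
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_knn(X_train, y_train, X_test, k=3, gamma=0.5):
    """kNN in the kernel-induced feature space, using
    ||phi(x) - phi(t)||^2 = k(x, x) - 2 k(x, t) + k(t, t)."""
    K_tt = np.ones(len(X_train))   # k(t, t) = 1 for an RBF kernel
    K_xx = np.ones(len(X_test))    # k(x, x) = 1 for an RBF kernel
    K_xt = rbf_gram(X_test, X_train, gamma)
    d2 = K_xx[:, None] - 2 * K_xt + K_tt[None, :]
    preds = []
    for row in d2:
        nearest = np.argsort(row)[:k]
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return np.array(preds)

rng = np.random.default_rng(3)
X_train, y_train = rng.normal(size=(60, 8)), rng.integers(0, 2, size=60)
print(kernel_knn(X_train, y_train, rng.normal(size=(4, 8))))
```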
international symposium on computer, consumer and control | 2012
Chun-Hsiang Chuang; Chih-Sheng Huang; Chin-Teng Lin; Li-Wei Ko; Jyh-Yeong Chang; Jinn-Min Yang
Recent studies have shown that the various brain networks over different cognitive states. In contrast to measure a physiological change over a single region, the information flows between brain regions described by effective connectivity provides an informative dynamic over the whole brain. In this study, we proposed a source information flow network based on the combination of Granger causality and support vector regression to predict drivers conscious level. This work provides the first application of using brain network to develop a brain-computer interface and obtain a sound result of performance.
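A rough sketch of the two ingredients is shown below: pairwise Granger-causality statistics form a connectivity feature vector per trial, and support vector regression maps those features to a continuous target. The channel count, lag order, random signals, and placeholder drowsiness scores are assumptions, and this is not the paper's source-level pipeline; statsmodels and scikit-learn are used here as generic stand-ins.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests
from sklearn.svm import SVR

def gc_features(signals, maxlag=2):
    """Pairwise Granger-causality F statistics as a connectivity
    feature vector. signals: (n_samples, n_channels) time series."""
    n_ch = signals.shape[1]
    feats = []
    for i in range(n_ch):
        for j in range(n_ch):
            if i == j:
                continue
            # Does channel j Granger-cause channel i?
            res = grangercausalitytests(signals[:, [i, j]], maxlag, verbose=False)
            feats.append(res[maxlag][0]['ssr_ftest'][0])   # F statistic
    return np.array(feats)

# Toy example: a few "trials" of 3-channel signals with made-up targets;
# real EEG source signals and rated consciousness levels would replace these.
rng = np.random.default_rng(4)
X = np.array([gc_features(rng.normal(size=(200, 3))) for _ in range(10)])
y = rng.random(10)                     # placeholder consciousness-level target
model = SVR(kernel='rbf').fit(X, y)    # regress level on connectivity features
print(model.predict(X[:2]))
```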
international geoscience and remote sensing symposium | 2007
Jinn-Min Yang; Bor-Chen Kuo; Hsiao-Yun Huang; Pao-Ta Yu
In this paper, a novel nonparametric weighted linear feature extraction method is developed for classifying hyperspectral image data with limited training samples. Within this framework, two important vectors are found for each training sample, and the magnitude of the projection between the two vectors is used to weight the sample when constructing the within-class and between-class scatter matrices. The effectiveness of the proposed feature extraction scheme, compared with two other nonparametric feature extraction methods, nonparametric weighted feature extraction (NWFE) and nonparametric discriminant analysis (NDA), is demonstrated using the Washington DC Mall data. The experimental results show that the proposed method is remarkably powerful and robust.
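The abstract does not specify how the two per-sample vectors are obtained, so the sketch below only illustrates the weighting step it describes: the magnitude of the projection of one vector onto the other serves as that sample's weight in a scatter matrix. The vectors and data here are arbitrary placeholders.

```python
import numpy as np

def projection_weight(v1, v2):
    """Magnitude of the projection of v1 onto v2, used as a per-sample
    weight; how v1 and v2 are chosen for each training sample is
    defined in the paper and not reproduced here."""
    return abs(np.dot(v1, v2)) / (np.linalg.norm(v2) + 1e-12)

def weighted_scatter(X, weights, mean):
    """Weighted scatter matrix: sum_i w_i (x_i - m)(x_i - m)^T."""
    D = X - mean
    return (weights[:, None] * D).T @ D

rng = np.random.default_rng(6)
X = rng.normal(size=(20, 5))
V1, V2 = rng.normal(size=(20, 5)), rng.normal(size=(20, 5))  # placeholder vectors
w = np.array([projection_weight(a, b) for a, b in zip(V1, V2)])
S_w = weighted_scatter(X, w, X.mean(axis=0))
```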
international geoscience and remote sensing symposium | 2006
Jinn-Min Yang; Bor-Chen Kuo; Pao-Ta Yu; T.-Y. Hsieh
In this paper, a novel fuzzy linear discriminant analysis feature extraction method is proposed for classifying hyperspectral image data. The fuzzification mechanism proposed by Keller et al. is used to identify whether a sample lies close to the class boundary, and different weights are assigned to samples when estimating the within-class and between-class scatter matrices. The effectiveness of the proposed feature extraction scheme, compared with nonparametric weighted feature extraction (NWFE) and linear discriminant analysis (LDA), is demonstrated using the Washington DC Mall data, on which it achieves satisfactory performance.
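The fuzzification step can be sketched with Keller et al.'s kNN-based membership initialization: samples whose neighborhoods mix classes (i.e., samples near the class boundary) receive memberships well away from 0 and 1, and those memberships can then weight each sample's contribution to the scatter matrices. The 0.51/0.49 constants follow common presentations of the scheme; the neighborhood size and toy data are assumptions, and the full fuzzy LDA is not shown.

```python
import numpy as np

def keller_memberships(X, y, n_classes, K=5):
    """Fuzzy class memberships for labeled samples in the style of
    Keller et al.'s fuzzy kNN initialization: each sample keeps at
    least 0.51 membership in its own class, and the remainder is
    spread according to the class mix of its K nearest neighbors."""
    n = len(X)
    U = np.zeros((n, n_classes))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # exclude the sample itself
        nn = np.argsort(d)[:K]
        counts = np.bincount(y[nn], minlength=n_classes)
        U[i] = 0.49 * counts / K
        U[i, y[i]] += 0.51
    return U

rng = np.random.default_rng(5)
X = rng.normal(size=(30, 6))
y = rng.integers(0, 2, size=30)
U = keller_memberships(X, y, n_classes=2)
# A boundary sample (mixed neighborhood) gets a lower membership in its
# own class and can be weighted differently in the scatter matrices.
print(U[:3])
```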
Archive | 2010
Jinn-Min Yang; Pao-Ta Yu; Bor-Chen Kuo; Ming-Hsiang Su
International Journal of Information Technology (資訊科技國際期刊), 4(2) | 2010
Jinn-Min Yang; Ming-Hsiang Su; Pao-Ta Yu
Information Systems | 2012
Chun-Hsiang Chuang; Chih-Sheng Huang; Chin-Teng Lin; Li-Wei Ko; Jyh-Yeong Chang; Jinn-Min Yang