
Publication


Featured research published by Cheong Hee Park.


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2004

An optimization criterion for generalized discriminant analysis on undersampled problems

Jieping Ye; Ravi Janardan; Cheong Hee Park; Haesun Park

An optimization criterion is presented for discriminant analysis. The criterion extends the optimization criteria of the classical Linear Discriminant Analysis (LDA) through the use of the pseudoinverse when the scatter matrices are singular. It is applicable regardless of the relative sizes of the data dimension and sample size, overcoming a limitation of classical LDA. The optimization problem can be solved analytically by applying the Generalized Singular Value Decomposition (GSVD) technique. The pseudoinverse has been suggested and used for undersampled problems in the past, where the data dimension exceeds the number of data points. The criterion proposed in this paper provides a theoretical justification for this procedure. An approximation algorithm for the GSVD-based approach is also presented. It reduces the computational complexity by finding subclusters of each cluster and uses their centroids to capture the structure of each cluster. This reduced problem yields much smaller matrices to which the GSVD can be applied efficiently. Experiments on text data, with up to 7,000 dimensions, show that the approximation algorithm produces results that are close to those produced by the exact algorithm.
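A minimal sketch of the pseudoinverse criterion in NumPy follows. It applies the pseudoinverse of the within-class scatter directly; the paper's GSVD-based algorithm reaches the solution without explicitly forming or inverting these d x d matrices, so treat this as an illustration of the criterion rather than the exact algorithm, and all names as illustrative.

```python
import numpy as np

def pinv_lda(X, y, k):
    """Project n x d data X (numpy array, labels y) onto k discriminant
    directions, using the pseudoinverse so singular scatter is handled."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sb = np.zeros((d, d))  # between-class scatter
    Sw = np.zeros((d, d))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    # The pseudoinverse replaces the inverse when Sw is singular (n < d).
    M = np.linalg.pinv(Sw) @ Sb
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(vals.real)[::-1]
    G = vecs[:, order[:k]].real  # d x k transformation
    return X @ G
```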


Pattern Recognition | 2008

A comparison of generalized linear discriminant analysis algorithms

Cheong Hee Park; Haesun Park

Linear discriminant analysis (LDA) is a dimension reduction method which finds an optimal linear transformation that maximizes the class separability. However, in undersampled problems where the number of data samples is smaller than the dimension of data space, it is difficult to apply LDA due to the singularity of scatter matrices caused by high dimensionality. In order to make LDA applicable, several generalizations of LDA have been proposed recently. In this paper, we present theoretical and algorithmic relationships among several generalized LDA algorithms and compare their computational complexities and performances in text classification and face recognition. Towards a practical dimension reduction method for high dimensional data, an efficient algorithm is proposed, which reduces the computational complexity greatly while achieving competitive prediction accuracies. We also present nonlinear extensions of these LDA algorithms based on kernel methods. It is shown that a generalized eigenvalue problem can be formulated in the kernel-based feature space, and generalized LDA algorithms are applied to solve the generalized eigenvalue problem, resulting in nonlinear discriminant analysis. Performances of these linear and nonlinear discriminant analysis algorithms are compared extensively.
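One standard generalization in this family is regularized LDA, which restores nonsingularity by adding a small ridge to the within-class scatter and then solving a generalized eigenvalue problem. A minimal sketch, with an illustrative ridge value (this is not necessarily the efficient algorithm the paper proposes):

```python
import numpy as np
from scipy.linalg import eigh

def scatter_matrices(X, y):
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw

def regularized_lda(X, y, k, reg=1e-3):
    Sb, Sw = scatter_matrices(X, y)
    Sw += reg * np.eye(Sw.shape[0])  # ridge makes Sw nonsingular
    # Generalized symmetric eigenproblem Sb v = lambda * Sw v;
    # eigh returns eigenvalues in ascending order.
    vals, vecs = eigh(Sb, Sw)
    return X @ vecs[:, ::-1][:, :k]  # top-k discriminant directions
```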


Pattern Recognition | 2005

Fingerprint classification using fast Fourier transform and nonlinear discriminant analysis

Cheong Hee Park; Haesun Park

In this paper, we present a new approach for fingerprint classification based on the discrete Fourier transform (DFT) and nonlinear discriminant analysis. Utilizing the DFT and directional filters, a reliable and efficient directional image is constructed from each fingerprint image; nonlinear discriminant analysis is then applied to the constructed directional images, reducing the dimension dramatically and extracting discriminant features. The proposed method exploits the ability of the DFT and directional filtering to deal with low-quality images and the effectiveness of nonlinear feature extraction for fingerprint classification. Experimental results demonstrate competitive performance compared with other published results.
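The directional-image construction can be conveyed by a frequency-domain sketch: transform the image, keep one orientation band at a time with a wedge-shaped mask, and record the strongest band per pixel. The wedge masks and the number of directions below are assumptions for illustration, not the paper's exact filters.

```python
import numpy as np

def directional_image(img, n_dir=8):
    """img: 2-D grayscale array; returns per-pixel dominant direction index."""
    h, w = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    yy, xx = np.mgrid[0:h, 0:w]
    yy, xx = yy - h // 2, xx - w // 2
    angle = np.arctan2(yy, xx) % np.pi  # frequency orientation in [0, pi)
    responses = []
    for i in range(n_dir):
        lo, hi = i * np.pi / n_dir, (i + 1) * np.pi / n_dir
        mask = (angle >= lo) & (angle < hi)  # wedge passing one orientation band
        band = np.fft.ifft2(np.fft.ifftshift(F * mask))
        responses.append(np.abs(band))
    # Dominant ridge direction per pixel: the band with the strongest response.
    return np.argmax(np.stack(responses), axis=0)
```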


SIAM Journal on Matrix Analysis and Applications | 2005

Nonlinear Discriminant Analysis Using Kernel Functions and the Generalized Singular Value Decomposition

Cheong Hee Park; Haesun Park

Linear discriminant analysis (LDA) has been widely used for linear dimension reduction. However, LDA has limitations in that one of the scatter matrices is required to be nonsingular and the nonlinearly clustered structure is not easily captured. In order to overcome the problems caused by the singularity of the scatter matrices, a generalization of LDA based on the generalized singular value decomposition (GSVD) was recently developed. In this paper, we propose a nonlinear discriminant analysis based on the kernel method and the GSVD. The GSVD is applied to solve the generalized eigenvalue problem which is formulated in the feature space defined by a nonlinear mapping through kernel functions. Our GSVD-based kernel discriminant analysis is theoretically compared with other kernel-based nonlinear discriminant analysis algorithms. The experimental results show that our method is an effective nonlinear dimension reduction method.
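SciPy exposes no GSVD routine, so the sketch below conveys the kernelized formulation with a pseudoinverse solve in kernel coordinates instead; avoiding exactly this kind of inversion is the point of the paper's GSVD approach. The RBF kernel and all names are assumptions.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def kernel_da(X, y, X_new, k, gamma=1.0):
    """Nonlinear discriminant features via scatter matrices expressed in
    kernel coordinates (n x n), solved here with a pseudoinverse."""
    K = rbf_kernel(X, X, gamma=gamma)  # n x n Gram matrix
    n = K.shape[0]
    m = K.mean(axis=1, keepdims=True)
    Sb = np.zeros((n, n))
    Sw = np.zeros((n, n))
    for c in np.unique(y):
        idx = (y == c)
        mc = K[:, idx].mean(axis=1, keepdims=True)
        Sb += idx.sum() * (mc - m) @ (mc - m).T
        Kc = K[:, idx] - mc
        Sw += Kc @ Kc.T
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    A = vecs[:, np.argsort(vals.real)[::-1][:k]].real  # n x k coefficients
    return rbf_kernel(X_new, X, gamma=gamma) @ A
```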


Pattern Recognition Letters | 2008

On applying linear discriminant analysis for multi-labeled problems

Cheong Hee Park; Moonhwi Lee

Linear discriminant analysis (LDA) is one of the most popular dimension reduction methods, but it was originally formulated for single-labeled problems. In this paper, we derive a formulation for applying LDA to multi-labeled problems. We also propose a generalized LDA algorithm that is effective for high dimensional multi-labeled problems. Experimental results demonstrate that, by taking the multi-labeled structure into account, LDA can achieve computational efficiency and also improve classification performance.
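A hedged sketch of the multi-label extension of the scatter matrices, under the simple assumption that a sample contributes to every class it carries; the paper's exact formulation may weight these contributions differently. Y is an n x c binary label-indicator matrix, and the result plugs into any of the generalized LDA routines above.

```python
import numpy as np

def multilabel_scatters(X, Y):
    """X: n x d data; Y: n x c binary indicator (Y[i, c] = 1 iff sample i
    carries label c). Returns between- and within-class scatter matrices."""
    d = X.shape[1]
    counts = Y.sum(axis=0)  # samples per label (a sample may count many times)
    mean_all = X.mean(axis=0)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in range(Y.shape[1]):
        Xc = X[Y[:, c] == 1]  # every sample carrying label c
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += counts[c] * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    return Sb, Sw
```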


Pattern Recognition | 2004

Nonlinear feature extraction based on centroids and kernel functions

Cheong Hee Park; Haesun Park

A nonlinear feature extraction method is presented which can reduce the data dimension down to the number of classes, providing dramatic savings in computational cost. The dimension-reducing nonlinear transformation is obtained by implicitly mapping the input data into a feature space using a kernel function, and then finding a linear mapping, based on an orthonormal basis of the class centroids in the feature space, that maximally separates the classes. The experimental results demonstrate that our method extracts nonlinear features effectively, so that competitive classification performance can be obtained with linear classifiers in the dimension-reduced space.
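The centroid construction admits a compact kernel-trick sketch: the Gram matrix of the feature-space class centroids can be formed from the data kernel alone, and a Cholesky factor then orthonormalizes the centroid basis. The Cholesky-based orthonormalization and the RBF kernel are plausible choices for illustration, not necessarily the paper's exact construction.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def centroid_kernel_features(X_train, y, X, gamma=1.0):
    """Coordinates of mapped points on an orthonormalized basis of the
    feature-space class centroids; output dimension = number of classes."""
    classes = np.unique(y)
    n = len(y)
    # E averages per class: column j picks class j's members, scaled by 1/n_j.
    E = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        idx = (y == c)
        E[idx, j] = 1.0 / idx.sum()
    K = rbf_kernel(X_train, X_train, gamma=gamma)
    G = E.T @ K @ E  # Gram matrix of the feature-space centroids
    L = np.linalg.cholesky(G)  # assumes the centroids are linearly independent
    # B = (mapped centroids) @ inv(L).T is orthonormal; project onto it.
    return rbf_kernel(X, X_train, gamma=gamma) @ E @ np.linalg.inv(L).T
```

The output has one coordinate per class, which is how the dimension drops to the number of classes as the abstract describes.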


international conference on data mining | 2003

A new optimization criterion for generalized discriminant analysis on undersampled problems

Jieping Ye; Ravi Janardan; Cheong Hee Park; Haesun Park

A new optimization criterion for discriminant analysis is presented. The new criterion extends the optimization criteria of classical linear discriminant analysis (LDA) by introducing the pseudoinverse when the scatter matrices are singular. It is applicable regardless of the relative sizes of the data dimension and sample size, overcoming a limitation of classical LDA. Recently, a new algorithm called LDA/GSVD for structure-preserving dimension reduction was introduced, which extends classical LDA to very high-dimensional undersampled problems by using the generalized singular value decomposition (GSVD). The solution from the LDA/GSVD algorithm is a special case of the solution for our generalized criterion, which is also based on the GSVD. We also present an approximate solution for our GSVD-based criterion, which reduces computational complexity by finding subclusters of each cluster and using their centroids to capture the structure of each cluster. This reduced problem yields much smaller matrices to which the GSVD can be applied efficiently. Experiments on text data, with up to 7,000 dimensions, show that the approximation algorithm produces results close to those produced by the exact algorithm.
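The subcluster approximation itself is easy to sketch: cluster each class, keep only the subcluster centroids, and run the GSVD-based method on the much smaller centroid set. k-means is an assumed choice of clustering here; the routine is meant to feed a discriminant routine such as the pinv_lda sketch earlier in this list.

```python
import numpy as np
from sklearn.cluster import KMeans

def subcluster_reduce(X, y, n_sub=5, seed=0):
    """Replace each class by the centroids of its k-means subclusters,
    keeping the dimension d but shrinking the number of rows."""
    Xs, ys = [], []
    for c in np.unique(y):
        Xc = X[y == c]
        k = min(n_sub, len(Xc))
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(Xc)
        Xs.append(km.cluster_centers_)
        ys.append(np.full(k, c))
    return np.vstack(Xs), np.concatenate(ys)

# Usage: X_small, y_small = subcluster_reduce(X, y), then apply the
# discriminant analysis of choice to the far smaller centroid set.
```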


International Journal of Applied Mathematics and Computer Science | 2011

Analysis of correlation based dimension reduction methods

Yong Joon Shin; Cheong Hee Park

Dimension reduction is an important topic in data mining and machine learning. In particular, dimension reduction combined with feature fusion is an effective preprocessing step when the data are described by multiple feature sets. Canonical Correlation Analysis (CCA) and Discriminative Canonical Correlation Analysis (DCCA) are feature fusion methods based on correlation. However, they differ in that DCCA is a supervised method utilizing class label information, while CCA is an unsupervised method. It has been shown that the classification performance of DCCA is superior to that of CCA due to the discriminative power gained from class label information. On the other hand, Linear Discriminant Analysis (LDA) is a supervised dimension reduction method known to be a special case of CCA. In this paper, we analyze the relationship between DCCA and LDA, showing that the projective directions obtained by DCCA are equal, up to an orthogonal transformation, to those obtained from LDA. Using this relation with LDA, we propose a new method that can enhance the performance of DCCA. The experimental results show that the proposed method exhibits better classification performance than the original DCCA.
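To make the compared objects concrete, here is a minimal sketch of plain CCA via whitening and an SVD; DCCA additionally builds class-label information into the cross-covariance, which this sketch omits. The small ridge is an illustrative numerical safeguard, not part of the definition.

```python
import numpy as np

def cca(X, Y, k, reg=1e-6):
    """Top-k canonical direction pairs for two views X (n x p) and Y (n x q)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sxx = X.T @ X / len(X) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / len(Y) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / len(X)
    # Whitening transforms: Wx.T @ Sxx @ Wx = I (and likewise for Y).
    Wx = np.linalg.inv(np.linalg.cholesky(Sxx)).T
    Wy = np.linalg.inv(np.linalg.cholesky(Syy)).T
    # Singular vectors of the whitened cross-covariance give the canonical pairs.
    U, s, Vt = np.linalg.svd(Wx.T @ Sxy @ Wy)
    return Wx @ U[:, :k], Wy @ Vt.T[:, :k], s[:k]  # directions + correlations
```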


Expert Systems With Applications | 2009

A SVM-based discretization method with application to associative classification

Cheong Hee Park; Moonhwi Lee

Associative classification, which combines association rule mining and classification, has recently been proposed, and many studies have shown that associative classifiers give high prediction accuracy compared with traditional classifiers such as decision trees. However, in order to apply association rule mining to classification problems, the data must first be transformed into transaction form. In this paper, we propose a discretization method based on support vector machines (SVMs), which can greatly improve the performance of associative classification. The proposed method finds optimal class boundaries using an SVM, and discretization is performed using distances to those boundaries. Experimental results demonstrate that SVM-based discretization of continuous attributes makes associative classification more effective: it reduces the number of classification rules mined while also improving prediction accuracy.
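A hedged sketch of the idea: fit a one-dimensional linear SVM per attribute and use the point where its decision function crosses zero as the cut point. The two-interval discretization and the binary-class assumption are simplifications for illustration; the paper's rule may derive several boundaries per attribute.

```python
import numpy as np
from sklearn.svm import SVC

def svm_discretize(X, y):
    """Binarize each continuous attribute at the boundary a 1-D linear SVM
    finds for it; assumes two classes in y."""
    Xd = np.empty(X.shape, dtype=int)
    for j in range(X.shape[1]):
        clf = SVC(kernel="linear").fit(X[:, [j]], y)
        # The decision function is w*x + b; it crosses zero at x = -b/w.
        cut = -clf.intercept_[0] / clf.coef_[0, 0]
        Xd[:, j] = (X[:, j] > cut).astype(int)  # two intervals per attribute
    return Xd
```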


international conference on data mining | 2004

A comparative study of linear and nonlinear feature extraction methods

Cheong Hee Park; Haesun Park; Panos M. Pardalos

This paper presents theoretical relationships among several generalized LDA algorithms and proposes computationally efficient approaches for them utilizing the relationships. Generalized LDA algorithms are extended nonlinearly by kernel methods resulting in nonlinear discriminant analysis. Performances and computational complexities of these linear and nonlinear discriminant analysis algorithms are compared.

Collaboration


Dive into Cheong Hee Park's collaborations.

Top Co-Authors

Haesun Park (Georgia Institute of Technology)
Hyunsuk Kim (Electronics and Telecommunications Research Institute)
Moonhwi Lee (Chungnam National University)
Dongjin Lee (Electronics and Telecommunications Research Institute)
Ho-Sub Yoon (Electronics and Telecommunications Research Institute)
Daesub Yoon (Electronics and Telecommunications Research Institute)
Chankyu Park (Electronics and Telecommunications Research Institute)
Hoyoung Woo (Chungnam National University)
Hyun-Eui Kim (Electronics and Telecommunications Research Institute)
Jaehong Kim (Electronics and Telecommunications Research Institute)