Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kyung Ha Seok is active.

Publication


Featured research published by Kyung Ha Seok.


Fuzzy Sets and Systems | 2006

Support vector interval regression machine for crisp input and output data

Changha Hwang; Dug Hun Hong; Kyung Ha Seok

Support vector regression (SVR) has been very successful in function estimation problems for crisp data. In this paper, we propose a robust method to evaluate interval regression models for crisp input and output data, combining the possibility estimation formulation, which integrates the property of central tendency, with the principle of standard SVR. The proposed method is robust in the sense that outliers do not affect the resulting interval regression. Furthermore, the proposed method is a model-free method, since we do not have to assume the underlying model function for the interval nonlinear regression model with crisp input and output. In particular, this method performs better and is conceptually simpler than support vector interval regression networks (SVIRNs), which utilize two radial basis function networks to identify the upper and lower sides of the data interval. Five examples are provided to show the validity and applicability of the proposed method.
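For reference, the "principle of standard SVR" that the interval method builds on is the usual ε-insensitive formulation. The sketch below restates that textbook primal problem only; it is not the paper's interval-specific formulation.

```latex
\min_{w,\,b,\,\xi,\,\xi^*} \; \frac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{n}\left(\xi_i + \xi_i^*\right)
\quad \text{subject to} \quad
\begin{cases}
y_i - \langle w, \phi(x_i)\rangle - b \le \varepsilon + \xi_i,\\
\langle w, \phi(x_i)\rangle + b - y_i \le \varepsilon + \xi_i^*,\\
\xi_i,\ \xi_i^* \ge 0, \quad i = 1,\dots,n.
\end{cases}
```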


Neurocomputing | 2011

Semiparametric mixed-effect least squares support vector machine for analyzing pharmacokinetic and pharmacodynamic data

Kyung Ha Seok; Jooyong Shim; Daehyeon Cho; Gyu-Jeong Noh; Changha Hwang

In this paper we propose a semiparametric mixed-effect least squares support vector machine (LS-SVM) regression model for the analysis of pharmacokinetic (PK) and pharmacodynamic (PD) data. We also develop the generalized cross-validation (GCV) method for choosing the hyperparameters, which affect the performance of the proposed LS-SVM. The performance of the proposed LS-SVM is compared with those of NONMEM and the regular semiparametric LS-SVM via four measures: mean squared error (MSE), mean absolute error (MAE), mean relative absolute error (MRAE) and mean relative prediction error (MRPE). Using the paired t-test statistic, we find that the absolute values of the four measures for the proposed LS-SVM are significantly smaller than those of NONMEM for PK and PD data. We also investigate the coefficients of determination (R^2) of the predicted and observed values. The R^2 values of NONMEM are 0.66 and 0.59 for PK and PD data, respectively, while those of the proposed LS-SVM are 0.94 and 0.96. Through cross-validation we also find that the proposed LS-SVM shows better generalization performance than the regular semiparametric LS-SVM for PK and PD data. These facts indicate that the proposed LS-SVM is an appealing tool for analyzing PK and PD data.
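As a rough illustration of the two ingredients named above, the sketch below fits a plain LS-SVM regression with a Gaussian kernel and scores a regularization parameter by GCV. The semiparametric and mixed-effect components of the paper are omitted, and all names and parameters here are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the authors' code): LS-SVM regression with a
# Gaussian kernel, and GCV used to score the regularization parameter gamma.
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_gcv(X, y, gamma, sigma=1.0):
    """Solve the LS-SVM linear system and return the GCV score for gamma."""
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    # LS-SVM dual system: [[K + I/gamma, 1], [1^T, 0]] [alpha; b] = [y; 0].
    A = np.block([[K + np.eye(n) / gamma, np.ones((n, 1))],
                  [np.ones((1, n)), np.zeros((1, 1))]])
    A_inv = np.linalg.inv(A)
    # Hat matrix S with y_hat = S y, since [alpha; b] = A^{-1} [y; 0].
    S = np.hstack([K, np.ones((n, 1))]) @ A_inv[:, :n]
    y_hat = S @ y
    # GCV(gamma) = n * ||(I - S) y||^2 / trace(I - S)^2.
    return n * np.sum((y - y_hat) ** 2) / np.trace(np.eye(n) - S) ** 2

# Toy usage: choose gamma from a small grid by minimizing GCV.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=50)
best_gamma = min([1.0, 10.0, 100.0, 1000.0], key=lambda g: lssvm_gcv(X, y, g))
```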


international conference on education technology and computer | 2010

Support vector quantile regression using asymmetric e-insensitive loss function

Kyung Ha Seok; Daehyeon Cho; Changha Hwang; Jooyong Shim

Support vector quantile regression (SVQR) is capable of providing a good description of the linear and nonlinear relationships among random variables. In this paper we propose a sparse SVQR to overcome a weak point of SVQR, its lack of sparsity. The asymmetric e-insensitive loss function is used to efficiently provide sparsity. Experimental results are then presented; these results illustrate the performance of the proposed method by comparing it with nonsparse SVQR.
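To make the loss functions concrete: standard SVQR for quantile level τ uses the pinball (check) loss, and the sparse variant replaces it with a loss that is zero on an asymmetric insensitive zone. The widths ε₁, ε₂ below are one illustrative parameterization, not necessarily the exact one used in the paper.

```latex
\rho_\tau(r) =
\begin{cases}
\tau\, r, & r \ge 0,\\
(\tau - 1)\, r, & r < 0,
\end{cases}
\qquad
\rho_{\tau,\varepsilon}(r) =
\begin{cases}
\tau\,(r - \varepsilon_1), & r > \varepsilon_1,\\
0, & -\varepsilon_2 \le r \le \varepsilon_1,\\
(1 - \tau)\,(-r - \varepsilon_2), & r < -\varepsilon_2,
\end{cases}
```

where r = y - f(x) is the residual and ε₁, ε₂ ≥ 0 split the insensitive zone asymmetrically, so that observations falling inside it contribute no loss and need not become support vectors.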


fuzzy systems and knowledge discovery | 2006

Some comments on error correcting output codes

Kyung Ha Seok; Daehyeon Cho

Error correcting output codes (ECOC) can improve generalization performance when applied to multiclass problems. In this paper, we compared various criteria used to design code matrices. We also investigated how loss functions affect the results of ECOC. We found no clear evidence of a difference between the various criteria used to design code matrices. The one-per-class (OPC) code matrix with Hamming loss yields a higher error rate, and the error rate from margin-based decoding is lower than that from Hamming decoding. Some comments on ECOC are made, and its efficacy is investigated through an empirical study.
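The two decoding rules compared above can be illustrated with a small sketch (an assumed toy example, not the paper's experimental code): Hamming decoding counts bit disagreements with each class codeword, while margin-based decoding sums a loss of the signed margins.

```python
# Assumed toy illustration of ECOC decoding rules (not the paper's code).
import numpy as np

def hamming_decode(code_matrix, bit_outputs):
    """code_matrix: (n_classes, n_bits) in {-1, +1};
    bit_outputs: (n_bits,) signed outputs of the binary classifiers."""
    bits = np.sign(bit_outputs)
    # Count disagreements between predicted bits and each class codeword.
    distances = (code_matrix != bits).sum(axis=1)
    return int(np.argmin(distances))

def margin_decode(code_matrix, bit_outputs, loss=lambda m: np.exp(-m)):
    """Loss-based decoding: sum a margin loss over bits instead of counting errors."""
    margins = code_matrix * bit_outputs          # per-class, per-bit margins
    return int(np.argmin(loss(margins).sum(axis=1)))

# One-per-class (OPC) code matrix for 4 classes: one +1 per row, -1 elsewhere.
opc = 2 * np.eye(4) - 1
outputs = np.array([0.9, -0.4, -0.2, -0.8])      # hypothetical classifier outputs
print(hamming_decode(opc, outputs), margin_decode(opc, outputs))
```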


computational intelligence and security | 2006

Locally Weighted LS-SVM for Fuzzy Nonlinear Regression with Fuzzy Input-Output

Dug Hun Hong; Changha Hwang; Joo Yong Shim; Kyung Ha Seok

This paper deals with a new regression method for predicting fuzzy multivariate nonlinear regression models using triangular fuzzy numbers. The proposed method is achieved by implementing locally weighted least squares support vector machine regression, where the local weight is obtained from a positive distance metric between the test data and the training data. Two types of distance metrics, for the centers and the spreads, are proposed to treat nonlinear regression with fuzzy inputs and fuzzy outputs. Numerical studies are then presented which indicate the performance of this algorithm.
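A rough sketch of the locally weighted LS-SVM idea follows. The paper's fuzzy-number distance metrics for centers and spreads are replaced here by a plain Euclidean distance, so the weighting scheme and all names and parameters are illustrative assumptions only.

```python
# Minimal sketch (assumed): locally weighted LS-SVM prediction at one test
# point, with weights from a Gaussian kernel of the distance to training data.
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lw_lssvm_predict(X, y, x_test, gamma=100.0, sigma=1.0, h=1.0):
    n = len(y)
    # Local weights from the distance between the test point and training data.
    w = np.exp(-((X - x_test) ** 2).sum(1) / (2 * h ** 2))
    K = gaussian_kernel(X, X, sigma)
    # Weighted LS-SVM system: error of point i penalized by gamma * w_i.
    A = np.block([[K + np.diag(1.0 / (gamma * w + 1e-12)), np.ones((n, 1))],
                  [np.ones((1, n)), np.zeros((1, 1))]])
    sol = np.linalg.solve(A, np.append(y, 0.0))
    alpha, b = sol[:n], sol[n]
    return float(gaussian_kernel(x_test[None, :], X, sigma) @ alpha + b)

# Toy usage on one-dimensional data.
rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(40, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=40)
print(lw_lssvm_predict(X, y, np.array([0.5])))
```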


Computational Statistics | 2009

Non-crossing quantile regression via doubly penalized kernel machine

Jooyong Shim; Changha Hwang; Kyung Ha Seok


Journal of the Korean Data and Information Science Society | 2010

Doubly penalized kernel method for heteroscedastic autoregressive data

Daehyeon Cho; Joo-Yong Shim; Kyung Ha Seok


Journal of the Korean Data and Information Science Society | 2012

Semiparametric kernel logistic regression with longitudinal data

Jooyong Shim; Kyung Ha Seok


Journal of the Korean Data and Information Science Society | 2012

Study on the ensemble methods with kernel ridge regression

Sunhwa Kim; Daehyeon Cho; Kyung Ha Seok


Journal of the Korean Data and Information Science Society | 2014

Support vector expectile regression using IRWLS procedure

Kook Lyeol Choi; Joo Yong Shim; Kyung Ha Seok

Collaboration


Dive into Kyung Ha Seok's collaborations.

Top Co-Authors

Joo Yong Shim

Catholic University of Daegu

Hye-Jung Park

Catholic University of Daegu
