
Publication


Featured research published by S. Sathiya Keerthi.


Neural Computation | 2001

Improvements to Platt's SMO Algorithm for SVM Classifier Design

S. Sathiya Keerthi; Shirish Krishnaj Shevade; Chiranjib Bhattacharyya; K. R. K. Murthy

This article points out an important source of inefficiency in Platt's sequential minimal optimization (SMO) algorithm that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO. These modified algorithms perform significantly faster than the original SMO on all benchmark data sets tried.
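
A minimal sketch of the two-threshold optimality check at the heart of this modification (in the b_up/b_low notation later adopted by solvers such as LIBSVM); the function name, tolerance handling, and dense-kernel representation are illustrative assumptions, not the paper's code:

    import numpy as np

    def max_violating_pair(alpha, y, K, C, tau=1e-3):
        """Two-threshold KKT check for the SVM classification dual.

        F_i = sum_j alpha_j y_j K_ij - y_i.  Single-threshold SMO compares
        every F_i against one global threshold; instead we track
        b_up = min F_i over I_up and b_low = max F_i over I_low, and
        declare optimality when b_low <= b_up + 2*tau.
        """
        F = K @ (alpha * y) - y
        # I_up: points allowed to move F downward (alpha not at its blocking bound)
        up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))
        # I_low: points bounding the threshold from below
        low = ((y > 0) & (alpha > 0)) | ((y < 0) & (alpha < C))
        i_up = np.where(up)[0][np.argmin(F[up])]
        i_low = np.where(low)[0][np.argmax(F[low])]
        if F[i_low] <= F[i_up] + 2 * tau:
            return None                  # KKT conditions hold to tolerance tau
        return i_low, i_up               # maximal violating pair for the next SMO step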


Neural Computation | 2003

Asymptotic behaviors of support vector machines with Gaussian kernel

S. Sathiya Keerthi; Chih-Jen Lin

Support vector machines (SVMs) with the Gaussian (RBF) kernel have been popular for practical use. Model selection in this class of SVMs involves two hyperparameters: the penalty parameter C and the kernel width σ. This letter analyzes the behavior of the SVM classifier when these hyperparameters take very small or very large values. Our results help in understanding the hyperparameter space and lead to an efficient heuristic method of searching for hyperparameter values with small generalization errors. The analysis also indicates that if complete model selection using the Gaussian kernel has been conducted, there is no need to consider the linear SVM.
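
As a hedged illustration of the kind of search this analysis motivates, here is a log-scale grid over (C, σ) using scikit-learn; the dataset, grid ranges, and the mapping gamma = 1/(2σ²) (scikit-learn parameterizes the RBF kernel as exp(-gamma * ||x - x'||²)) are placeholder conventions, not the paper's protocol:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=300, random_state=0)  # placeholder data

    best = None
    for C in np.logspace(-2, 4, 7):          # penalty parameter on a log scale
        for sigma in np.logspace(-2, 3, 6):  # kernel width on a log scale
            gamma = 1.0 / (2.0 * sigma ** 2)
            acc = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
            if best is None or acc > best[0]:
                best = (acc, C, sigma)

    print("best 5-fold accuracy %.3f at C=%g, sigma=%g" % best)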


Journal of Optimization Theory and Applications | 1988

Optimal infinite-horizon feedback laws for a general class of constrained discrete-time systems: stability and moving-horizon approximations

S. Sathiya Keerthi; Elmer G. Gilbert

Stability results are given for a class of feedback systems arising from the regulation of time-varying discrete-time systems using optimal infinite-horizon and moving-horizon feedback laws. The class is characterized by joint constraints on the state and the control, a general nonlinear cost function and nonlinear equations of motion possessing two special properties. It is shown that weak conditions on the cost function and the constraints are sufficient to guarantee uniform asymptotic stability of both the optimal infinite-horizon and moving-horizon feedback systems. The infinite-horizon cost associated with the moving-horizon feedback law approaches the optimal infinite-horizon cost as the moving horizon is extended.
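
The moving-horizon idea itself is easy to state in code: at each step, solve a finite-horizon problem and apply only the first control. A minimal sketch for a generic constrained discrete-time system; the dynamics, cost, horizon length, terminal penalty, and control bounds below are all placeholder choices, not the paper's setting:

    import numpy as np
    from scipy.optimize import minimize

    def f(x, u):                    # placeholder nonlinear dynamics x_{k+1} = f(x_k, u_k)
        return np.array([x[1], 0.5 * np.sin(x[0]) + u[0]])

    def stage_cost(x, u):           # placeholder cost; the paper allows general nonlinear costs
        return x @ x + 0.1 * u @ u

    def horizon_cost(u_flat, x0, N):
        u_seq = u_flat.reshape(N, 1)
        x, J = x0, 0.0
        for k in range(N):
            J += stage_cost(x, u_seq[k])
            x = f(x, u_seq[k])
        return J + 10.0 * (x @ x)   # terminal penalty standing in for the infinite tail

    def moving_horizon_step(x0, N=10, u_max=1.0):
        res = minimize(horizon_cost, np.zeros(N), args=(x0, N),
                       bounds=[(-u_max, u_max)] * N)  # joint control constraints
        return res.x[0]             # apply only the first control, then re-solve

    x = np.array([1.0, 0.0])
    for _ in range(50):             # the receding-horizon loop
        x = f(x, np.array([moving_horizon_step(x)]))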


International Conference on Machine Learning | 2008

A dual coordinate descent method for large-scale linear SVM

Cho-Jui Hsieh; Kai-Wei Chang; Chih-Jen Lin; S. Sathiya Keerthi; S. Sundararajan

In many applications, data appear with a huge number of instances as well as features. The linear support vector machine (SVM) is one of the most popular tools for dealing with such large-scale sparse data. This paper presents a novel dual coordinate descent method for linear SVMs with L1- and L2-loss functions. The proposed method is simple and reaches an ε-accurate solution in O(log(1/ε)) iterations. Experiments indicate that our method is much faster than state-of-the-art solvers such as Pegasos, TRON, SVMperf, and a recent primal coordinate descent implementation.
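
A minimal sketch of the coordinate update for the L1-loss (hinge) case, following the projected step the paper describes; shrinking heuristics and the L2-loss variant are omitted, and the function below is an illustration rather than the authors' implementation:

    import numpy as np

    def dcd_linear_svm(X, y, C=1.0, n_epochs=20):
        """Dual coordinate descent for a linear SVM with L1 (hinge) loss.

        Maintains w = sum_i alpha_i y_i x_i, so each coordinate update
        costs O(d); Q_ii = x_i . x_i is precomputed once.
        """
        n, d = X.shape
        alpha = np.zeros(n)
        w = np.zeros(d)
        Qii = np.einsum('ij,ij->i', X, X)
        for _ in range(n_epochs):
            for i in np.random.permutation(n):
                G = y[i] * (w @ X[i]) - 1.0                    # dual gradient at coordinate i
                new = min(max(alpha[i] - G / Qii[i], 0.0), C)  # project onto [0, C]
                if new != alpha[i]:
                    w += (new - alpha[i]) * y[i] * X[i]        # incremental update of w
                    alpha[i] = new
        return w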


IEEE Transactions on Neural Networks | 2000

Improvements to the SMO algorithm for SVM regression

Shirish Krishnaj Shevade; S. Sathiya Keerthi; Chiranjib Bhattacharyya; K. R. K. Murthy

This paper points out an important source of inefficiency in Smola and Schölkopf's sequential minimal optimization (SMO) algorithm for support vector machine (SVM) regression that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO for regression. These modified algorithms perform significantly faster than the original SMO on the data sets tried.


Neurocomputing | 2003

Evaluation of simple performance measures for tuning SVM hyperparameters

Kaibo Duan; S. Sathiya Keerthi; Aun Neow Poo

Choosing optimal hyperparameter values for support vector machines is an important step in SVM design. This is usually done by minimizing either an estimate of generalization error or some other related performance measure. In this paper, we empirically study the usefulness of several simple performance measures that are inexpensive to compute (in the sense that they do not require expensive matrix operations involving the kernel matrix). The results point out which of these measures are adequate functionals for tuning SVM hyperparameters. For SVMs with L1 soft-margin formulation, none of the simple measures yields a performance uniformly as good as k-fold cross validation; Joachims’ Xi-Alpha bound and the GACV of Wahba et al. come next and perform reasonably well. For SVMs with L2 soft-margin formulation, the radius/margin bound gives a very good prediction of optimal hyperparameter values.
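
To show what makes such measures cheap, here is one common statement of Joachims' Xi-Alpha estimate computed from a single fitted model, with no retraining and no kernel-matrix operations. The details (reading α from scikit-learn's dual_coef_, taking R² = 1 as holds for the RBF kernel, labels in {-1, +1}) are assumptions for illustration, not the paper's exact protocol:

    import numpy as np
    from sklearn.svm import SVC

    def xi_alpha_estimate(clf, X, y, R2=1.0):
        """Leave-one-out error estimate from one trained SVC (y in {-1, +1}).

        Counts training points with 2 * alpha_i * R2 + xi_i >= 1;
        R2 = 1 is valid for the RBF kernel, where K(x, x) = 1.
        """
        n = len(y)
        alpha = np.zeros(n)
        alpha[clf.support_] = np.abs(clf.dual_coef_.ravel())
        xi = np.maximum(0.0, 1.0 - y * clf.decision_function(X))  # hinge slacks
        return np.mean(2.0 * alpha * R2 + xi >= 1.0)

    # usage sketch: clf = SVC(C=1.0, gamma=0.5).fit(X, y)
    #               err_est = xi_alpha_estimate(clf, X, y)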


International Conference on Multiple Classifier Systems | 2005

Which is the best multiclass SVM method? An empirical study

Kai-Bo Duan; S. Sathiya Keerthi

Multiclass SVMs are usually implemented by combining several two-class SVMs. The one-versus-all method using the winner-takes-all strategy and the one-versus-one method implemented by max-wins voting are popularly used for this purpose. In this paper we give empirical evidence to show that these methods are inferior to another one-versus-one method: one that uses Platt's posterior probabilities together with the pairwise coupling idea of Hastie and Tibshirani. The evidence is particularly strong when the training data set is sparse.
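
For concreteness, a sketch of the Hastie-Tibshirani coupling step that turns pairwise probabilities r_ij ≈ P(class i | class i or j) into class posteriors. Uniform pair weights are assumed here, and producing the r_ij via Platt scaling of each two-class SVM is left out:

    import numpy as np

    def pairwise_couple(r, n_iter=100, tol=1e-8):
        """Hastie-Tibshirani pairwise coupling with uniform pair weights.

        r[i, j] estimates P(class i | class i or j), with r[j, i] = 1 - r[i, j].
        Returns class probabilities p whose induced mu[i, j] = p_i / (p_i + p_j)
        best match the r[i, j].
        """
        k = r.shape[0]
        p = np.full(k, 1.0 / k)
        off = ~np.eye(k, dtype=bool)
        for _ in range(n_iter):
            mu = p[:, None] / (p[:, None] + p[None, :])
            np.fill_diagonal(mu, 0.0)                 # exclude i == j terms
            num = np.where(off, r, 0.0).sum(axis=1)
            p_new = p * num / mu.sum(axis=1)          # multiplicative fixed-point update
            p_new /= p_new.sum()
            if np.abs(p_new - p).max() < tol:
                return p_new
            p = p_new
        return p

The predicted class is then argmax of the coupled p, rather than a raw max-wins vote over the pairwise decisions.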


IEEE Transactions on Neural Networks | 2000

A fast iterative nearest point algorithm for support vector machine classifier design

S. Sathiya Keerthi; Shirish Krishnaj Shevade; Chiranjib Bhattacharyya; K. R. K. Murthy

In this paper we give a new fast iterative algorithm for support vector machine (SVM) classifier design. The basic problem treated is one that does not allow classification violations. The problem is converted to a problem of computing the nearest point between two convex polytopes. The suitability of two classical nearest point algorithms, due to Gilbert and to Mitchell et al., is studied. Ideas from both these algorithms are combined and modified to derive our fast algorithm. For problems which require classification violations to be allowed, the violations are quadratically penalized, and an idea due to Cortes and Vapnik and to Friess is used to convert the task to a problem in which there are no classification violations. Comparative computational evaluation of our algorithm against powerful SVM methods such as Platt's sequential minimal optimization shows that our algorithm is very competitive.
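
A sketch of the Gilbert-style iteration at the core of such nearest-point methods, here finding the minimum-norm point of the difference set conv(U) - conv(V), which gives the separating direction for linearly separable classes; the stopping rule, initialization, and dense representation are simplifying assumptions, not the paper's combined algorithm:

    import numpy as np

    def gilbert_min_norm(U, V, n_iter=1000, eps=1e-6):
        """Minimum-norm point of conv(U) - conv(V) by Gilbert's iteration.

        U, V: (n, d) arrays holding the two classes' points.  Returns w,
        the (unnormalized) direction of the nearest-point segment.
        """
        w = U[0] - V[0]                        # any point of the difference set
        for _ in range(n_iter):
            # support point of conv(U) - conv(V) in the direction -w:
            z = U[np.argmin(U @ w)] - V[np.argmax(V @ w)]
            gap = w @ w - w @ z                # duality gap; zero at the solution
            if gap < eps:
                break
            t = min(1.0, gap / ((w - z) @ (w - z)))  # exact line search on [w, z]
            w = (1.0 - t) * w + t * z
        return w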


IEEE Transactions on Neural Networks | 2002

Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms

S. Sathiya Keerthi

The paper discusses implementation issues related to the tuning of the hyperparameters of a support vector machine (SVM) with L2 soft margin, for which the radius/margin bound is taken as the index to be minimized, and iterative techniques are employed for computing the radius and margin. The implementation is shown to be feasible and efficient, even for large problems having more than 10,000 support vectors.
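
A loose sketch of the index being minimized, (R² + 1/C)(||w||² + C·Σξᵢ²), i.e. the radius/margin bound for the L2 soft-margin SVM viewed as a hard-margin problem in an augmented feature space, up to constant-factor conventions. The linear kernel is assumed so the radius can be iterated in input space, and the Badoiu-Clarkson core-set iteration stands in for the paper's own iterative radius computation:

    import numpy as np
    from sklearn.svm import LinearSVC

    def radius_squared(X, n_iter=200):
        """Approximate squared radius of the smallest enclosing ball via the
        Badoiu-Clarkson iteration (step the center toward the farthest point)."""
        c = X[0].astype(float).copy()
        for k in range(1, n_iter + 1):
            far = X[np.argmax(((X - c) ** 2).sum(axis=1))]
            c += (far - c) / (k + 1)
        return ((X - c) ** 2).sum(axis=1).max()

    def radius_margin_index(X, y, C):
        """Radius/margin index for one C (y in {-1, +1}); smaller is better."""
        clf = LinearSVC(C=C, loss='squared_hinge').fit(X, y)  # L2 soft margin
        w, b = clf.coef_.ravel(), clf.intercept_[0]
        xi = np.maximum(0.0, 1.0 - y * (X @ w + b))           # slack variables
        return (radius_squared(X) + 1.0 / C) * (w @ w + C * (xi @ xi))

    # usage sketch: pick C minimizing radius_margin_index(X, y, C)
    # over a log-scale grid, or by a gradient-style descent as in the paper.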


Bioinformatics | 2003

A simple and efficient algorithm for gene selection using sparse logistic regression.

Shirish Krishnaj Shevade; S. Sathiya Keerthi

MOTIVATION: This paper gives a new and efficient algorithm for the sparse logistic regression problem. The proposed algorithm is based on the Gauss-Seidel method and is asymptotically convergent. It is simple and extremely easy to implement; it neither uses any sophisticated mathematical programming software nor needs any matrix operations. It can be applied to a variety of real-world problems, like identifying marker genes and building a classifier in the context of cancer diagnosis using microarray data.

RESULTS: The gene selection method suggested in this paper is demonstrated on two real-world data sets, and the results are consistent with the literature.

AVAILABILITY: The implementation of this algorithm is available at http://guppy.mpe.nus.edu.sg/~mpessk/SparseLOGREG.shtml

SUPPLEMENTARY INFORMATION: Supplementary material is available at http://guppy.mpe.nus.edu.sg/~mpessk/SparseLOGREG.shtml
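
A minimal coordinate-wise (Gauss-Seidel style) sketch of L1-penalized logistic regression in the spirit of the paper: each pass updates one weight at a time with a soft-thresholded Newton step, using no matrix factorizations. The step formula is a standard one-dimensional derivation, not necessarily the paper's exact update rule:

    import numpy as np

    def sparse_logreg(X, y, lam=1.0, n_epochs=50):
        """L1-penalized logistic regression by coordinate-wise updates.

        y in {-1, +1}.  Each update minimizes a one-dimensional quadratic
        model of the logistic loss plus the exact L1 term (soft threshold).
        """
        n, d = X.shape
        w = np.zeros(d)
        margins = np.zeros(n)                          # cached y_i * (w . x_i)
        for _ in range(n_epochs):
            for j in range(d):
                p = 1.0 / (1.0 + np.exp(margins))      # sigma(-y_i * w . x_i)
                g = -(y * X[:, j] * p).sum()           # d(loss)/dw_j
                h = (X[:, j] ** 2 * p * (1.0 - p)).sum() + 1e-12
                z = w[j] - g / h                       # unpenalized Newton target
                w_new = np.sign(z) * max(abs(z) - lam / h, 0.0)  # soft threshold
                if w_new != w[j]:
                    margins += y * X[:, j] * (w_new - w[j])      # keep cache exact
                    w[j] = w_new
        return w

The soft threshold drives many coordinates exactly to zero, which is what makes the method usable for gene selection: the surviving nonzero weights identify the marker genes.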

Collaboration


Dive into S. Sathiya Keerthi's collaborations.

Top Co-Authors

Chong Jin Ong
National University of Singapore

Chih-Jen Lin
National Taiwan University