Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Rameswar Debnath is active.

Publication


Featured research published by Rameswar Debnath.


Pattern Analysis and Applications | 2004

A decision based one-against-one method for multi-class support vector machine

Rameswar Debnath; N. Takahide; Haruhisa Takahashi

The support vector machine (SVM) has a high generalisation ability for solving binary classification problems, but its extension to multi-class problems is still an ongoing research issue. Among the existing multi-class SVM methods, the one-against-one method is one of the most suitable for practical use. This paper presents a new multi-class SVM method that reduces the number of hyperplanes of the one-against-one method and thus requires fewer support vectors. The proposed algorithm works as follows: while producing the boundary of a class, no further hyperplanes are constructed if the discriminating hyperplanes of neighbouring classes already separate the remaining classes. We present a large number of experiments showing that the training time of the proposed method is the lowest among the existing multi-class SVM methods. The experimental results also show that the testing time of the proposed method is less than that of the one-against-one method because of the reduction in hyperplanes and support vectors. By reducing the number of hyperplanes, the proposed method resolves unclassifiable regions and alleviates the over-fitting problem much better than the one-against-one method. We also present a directed acyclic graph SVM (DAGSVM)-based testing methodology that improves the testing time of the DAGSVM method.
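
For context, the baseline the paper improves on is the standard one-against-one scheme: one binary SVM per pair of classes, combined by majority voting. The sketch below illustrates only that baseline (using scikit-learn's SVC and assuming non-negative integer class labels); the paper's hyperplane-pruning rule and DAG-based testing are not reproduced.

```python
from itertools import combinations

import numpy as np
from sklearn.svm import SVC

def train_ovo(X, y):
    """Train one binary SVM per class pair (standard one-against-one)."""
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])
        models[(a, b)] = SVC(kernel="rbf", gamma="scale").fit(X[mask], y[mask])
    return models

def predict_ovo(models, X):
    """Majority vote over all pairwise classifiers (assumes integer labels)."""
    votes = np.stack([m.predict(X) for m in models.values()])
    return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])
```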


Applied Intelligence | 2005

An Efficient Support Vector Machine Learning Method with Second-Order Cone Programming for Large-Scale Problems

Rameswar Debnath; Masakazu Muramatsu; Haruhisa Takahashi

In this paper we propose a new fast learning algorithm for the support vector machine (SVM). The proposed method is based on the technique of second-order cone programming. We reformulate the SVM's quadratic programming problem into a second-order cone programming problem. The proposed method needs to decompose the kernel matrix of the SVM optimization problem, and the decomposed matrix is used in the new optimization problem. Since the kernel matrix is positive semidefinite, the dimension of the decomposed matrix can be reduced by decomposition (factorization) methods. The performance of the proposed method depends on the dimension of the decomposed matrix. Experimental results show that the proposed method is much faster than the quadratic programming solver LOQO if the dimension of the decomposed matrix is small enough compared to that of the kernel matrix. The proposed method is also faster than the method proposed in (S. Fine and K. Scheinberg, 2001) for both low-rank and full-rank kernel matrices. Working set selection is an important issue in the SVM decomposition (chunking) method. We also modify Hsu and Lin's working set selection approach to deal with large working sets. The proposed approach leads to faster convergence.
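
As a rough illustration of the decomposition step only, the sketch below factors a positive semidefinite kernel matrix K into G Gᵀ via an eigendecomposition and keeps only the numerically significant columns. The SOCP reformulation and the LOQO comparison are not reproduced, and the RBF kernel with gamma = 0.1 is just an example.

```python
import numpy as np

def low_rank_factor(K, tol=1e-8):
    """Factor a PSD kernel matrix K ≈ G @ G.T, dropping eigenvalues below
    tol relative to the largest one; the number of columns of G is the
    reduced dimension that would enter the reformulated problem."""
    w, V = np.linalg.eigh(K)
    keep = w > tol * w.max()
    return V[:, keep] * np.sqrt(w[keep])

# example: a smooth RBF kernel matrix is numerically low-rank
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-0.1 * sq)
G = low_rank_factor(K)
print(G.shape, np.abs(K - G @ G.T).max())   # reduced width, approximation error
```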


international symposium on communications and information technologies | 2004

An efficient method for tuning kernel parameter of the support vector machine

Rameswar Debnath; Haruhisa Takahashi

We propose a new method for choosing the kernel parameter of the support vector machine on the basis of the distribution of the data in the feature space. Although the distribution (structure) of the data in the feature space is unknown, it depends on the kernel parameter. The distribution of the data is characterized by principal component analysis. Thus, a simple eigenanalysis is applied to a matrix of the same dimension as the kernel matrix to find the kernel parameter, which makes the method very fast. The proposed method can determine the kernel parameter graphically.
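
A rough stand-in for the idea: the eigenvalue spectrum of the (centered) kernel matrix summarizes how the data spread out in feature space, and it changes with the kernel parameter. The sketch below prints, for a few RBF widths, the fraction of spectral mass captured by the leading components; the paper's actual selection criterion and graphical procedure are not reproduced.

```python
import numpy as np

def kernel_spectrum(X, gamma):
    """Sorted eigenvalues of the centered RBF kernel matrix for a given gamma."""
    sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n          # centering, as in kernel PCA
    return np.sort(np.linalg.eigvalsh(H @ K @ H))[::-1]

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
for gamma in (0.01, 0.1, 1.0, 10.0):
    w = kernel_spectrum(X, gamma)
    print(gamma, round(float(w[:5].sum() / w.sum()), 3))  # mass in top 5 components
```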


BioSystems | 2010

An evolutionary approach for gene selection and classification of microarray data based on SVM error-bound theories

Rameswar Debnath; Takio Kurita

Microarrays have thousands to tens of thousands of gene features, but only a few hundred patient samples are available. The fundamental problem in microarray data analysis is identifying genes whose disruption causes congenital or acquired disease in humans. In this paper, we propose a new evolutionary method that can efficiently select a subset of potentially informative genes for support vector machine (SVM) classifiers. The proposed evolutionary method uses an SVM with a given subset of gene features to evaluate the fitness function, and new subsets of features are selected based on estimates of the generalization error of the SVMs and the frequency of occurrence of the features during evolution. Thus, in theory, the selected genes reflect to some extent the generalization performance of the SVM classifiers. We compare our proposed method with several existing methods and find that it obtains better classification accuracy with a smaller number of selected genes.
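
A toy version of the overall loop is sketched below: a population of candidate gene subsets is evolved, using SVM cross-validation accuracy as the fitness function. This is only illustrative; the paper's fitness is based on SVM error-bound estimates combined with feature-occurrence frequency, which is not reproduced here, and all sizes below are arbitrary.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def evolve_gene_subset(X, y, subset_size=20, pop_size=20, generations=10, seed=0):
    """Toy evolutionary feature (gene) selection with CV accuracy as fitness."""
    rng = np.random.default_rng(seed)
    n_genes = X.shape[1]
    pop = [rng.choice(n_genes, subset_size, replace=False) for _ in range(pop_size)]

    def fitness(subset):
        return cross_val_score(SVC(kernel="linear", C=1.0), X[:, subset], y, cv=3).mean()

    for _ in range(generations):
        scores = np.array([fitness(s) for s in pop])
        survivors = [pop[i] for i in np.argsort(-scores)[: pop_size // 2]]
        children = []
        for parent in survivors:
            child = parent.copy()
            # mutate: swap one selected gene for a random unselected one
            out = rng.integers(subset_size)
            child[out] = rng.choice(np.setdiff1d(np.arange(n_genes), child))
            children.append(child)
        pop = survivors + children

    scores = np.array([fitness(s) for s in pop])
    return pop[int(np.argmax(scores))]
```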


international joint conference on neural networks | 2006

SVM Training: Second-Order Cone Programming versus Quadratic Programming

Rameswar Debnath; Haruhisa Takahashi

The support vector machine (SVM) problem is a convex quadratic programming problem whose size scales with the training data size. If the training set is large, the problem cannot be solved by straightforward methods, so large-scale SVM problems are tackled by applying a chunking (decomposition) technique. The quadratic programming problem involves a square matrix, called the kernel matrix, which is positive semi-definite; in particular, its rank may be smaller than its size. In this paper we discuss a method that can exploit the low rank of the kernel matrix, so that an interior-point method (IPM) can be efficiently applied to the global (large-sized) problem. The method is based on the technique of second-order cone programming (SOCP): it reformulates the SVM's quadratic programming problem into a second-order cone programming problem. The SOCP method is much faster than the efficient software packages SVMlight and SVMTorch if the rank of the kernel matrix is small enough compared to the training set size, or if the kernel matrix can be approximated by a low-rank positive semidefinite matrix.
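
The sketch below only illustrates the general principle that a low-rank substitute for the kernel matrix makes large problems cheaper: it compares a full RBF-kernel SVC against a rank-100 Nyström approximation followed by a linear SVM on synthetic data. It does not implement the paper's IPM/SOCP solver, and SVMlight/SVMTorch are not used; all sizes and parameters are arbitrary.

```python
import time

from sklearn.datasets import make_classification
from sklearn.kernel_approximation import Nystroem
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC, LinearSVC

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)

full = SVC(kernel="rbf", gamma=0.1, C=1.0)                  # works with the full kernel matrix
low_rank = make_pipeline(                                   # rank-100 surrogate of the kernel
    Nystroem(kernel="rbf", gamma=0.1, n_components=100, random_state=0),
    LinearSVC(C=1.0),
)

for name, clf in [("full kernel", full), ("rank-100 approximation", low_rank)]:
    t0 = time.perf_counter()
    clf.fit(X, y)
    print(f"{name}: {time.perf_counter() - t0:.2f}s, train acc {clf.score(X, y):.3f}")
```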


industrial and engineering applications of artificial intelligence and expert systems | 2002

Learning Capability: Classical RBF Network vs. SVM with Gaussian Kernel

Rameswar Debnath; Haruhisa Takahashi

The Support Vector Machine (SVM) has recently been introduced as a new learning technique, grounded in statistical learning theory, for solving a variety of real-world problems. The classical Radial Basis Function (RBF) network has a structure similar to that of the SVM with a Gaussian kernel. In this paper we compare the generalization performance of the RBF network and the SVM on classification problems. We applied a Lagrangian differential gradient method for training and pruning the RBF network. The RBF network shows better generalization performance and is computationally faster than the SVM with a Gaussian kernel, especially for large training data sets.
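
For a rough sense of the comparison, the sketch below builds a simple RBF network (Gaussian units on k-means centres with a logistic read-out) and an SVM with a Gaussian kernel on synthetic data. The paper's Lagrangian differential gradient training and pruning are not reproduced; this is a simplified stand-in with arbitrary sizes.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def rbf_features(X, centers, width):
    """Gaussian activations of each sample with respect to the centres."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

centers = KMeans(n_clusters=50, n_init=10, random_state=0).fit(Xtr).cluster_centers_
width = np.median(np.linalg.norm(Xtr[:, None] - centers[None], axis=-1))
rbf_net = LogisticRegression(max_iter=1000).fit(rbf_features(Xtr, centers, width), ytr)
svm = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)

print("RBF net:", rbf_net.score(rbf_features(Xte, centers, width), yte))
print("SVM    :", svm.score(Xte, yte))
```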


international conference on information and communication technology | 2007

An Improved Interleaver Design for Turbo Codes

Kashif Nizam Khan; Jinat Rehana; Rameswar Debnath

This paper is aimed at designing an effective interleaver for parallel concatenated coding schemes that uses the weight distribution as the design criterion. We present a method for designing an interleaver that achieves a minimum effective free distance of the code word so that the error floor can be lowered. The method is expected to achieve a separation of √N in the output code word. The proposed design has the advantage that its complexity grows linearly with the interleaver length.
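
The classical way to obtain a spread of about √N is an S-random (spread) interleaver: each new position must land at least S away from the positions chosen for the previous S inputs. The sketch below implements that generic construction with S ≈ √(N/2); it is not the specific interleaver design proposed in the paper.

```python
import math
import random

def s_random_interleaver(n, s=None, max_attempts=1000, seed=0):
    """Permutation of range(n) in which any two inputs closer than s are
    mapped at least s apart (classical S-random / spread interleaver)."""
    if s is None:
        s = int(math.sqrt(n / 2))
    rng = random.Random(seed)
    for _ in range(max_attempts):
        remaining = list(range(n))
        rng.shuffle(remaining)
        perm, ok = [], True
        for _ in range(n):
            for idx, cand in enumerate(remaining):
                # candidate must be at least s away from the last s placed values
                if all(abs(cand - p) >= s for p in perm[-s:]):
                    perm.append(cand)
                    remaining.pop(idx)
                    break
            else:
                ok = False
                break
        if ok:
            return perm
    raise RuntimeError("failed to build an S-random interleaver; try a smaller s")
```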


international symposium on neural networks | 2004

The support vector machine learning using the second order cone programming

Rameswar Debnath; Masakazu Muramatsu; Haruhisa Takahashi

We propose a data-dependent learning method for the support vector machine. The method is based on the technique of second-order cone programming: we reformulate the SVM quadratic programming problem as a second-order cone problem. The proposed method requires decomposing the kernel matrix of the SVM optimization problem; in this paper we apply the Cholesky decomposition. Since the kernel matrix is positive semi-definite, some columns of the decomposed matrix vanish. The performance of the proposed method depends on how much the dimensionality of the decomposed matrix is reduced. Computational results show that when the number of columns of the decomposed matrix is small enough, the proposed method is much faster than the quadratic programming solver LOQO.
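
To see the "vanishing columns" effect, the sketch below runs a pivoted (incomplete) Cholesky factorization on a rank-deficient PSD matrix and stops once the remaining diagonal is numerically zero, so only rank-many columns survive. This is a generic routine for illustration, not the paper's SOCP learning method.

```python
import numpy as np

def pivoted_cholesky(K, tol=1e-10):
    """Pivoted Cholesky of a PSD matrix: K[piv][:, piv] ≈ L @ L.T, stopping
    when the remaining diagonal falls below tol, so trailing columns vanish."""
    K = K.astype(float).copy()
    n = K.shape[0]
    L = np.zeros((n, n))
    piv = np.arange(n)
    for k in range(n):
        # pivot: bring the largest remaining diagonal element to position k
        j = k + int(np.argmax(np.diag(K)[k:]))
        K[[k, j]] = K[[j, k]]
        K[:, [k, j]] = K[:, [j, k]]
        L[[k, j]] = L[[j, k]]
        piv[[k, j]] = piv[[j, k]]
        d = K[k, k]
        if d < tol:                       # remaining block is numerically zero
            return L[:, :k], piv
        L[k, k] = np.sqrt(d)
        L[k + 1:, k] = K[k + 1:, k] / L[k, k]
        # downdate the trailing block (Schur complement)
        K[k + 1:, k + 1:] -= np.outer(L[k + 1:, k], L[k + 1:, k])
    return L, piv

rng = np.random.default_rng(0)
B = rng.standard_normal((8, 3))
K = B @ B.T                               # rank-3 PSD matrix
L, piv = pivoted_cholesky(K)
print(L.shape)                            # (8, 3): only 3 non-vanishing columns
print(np.allclose(K[np.ix_(piv, piv)], L @ L.T))
```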


Information Systems | 2004

An improved working set selection method for SVM decomposition method

Rameswar Debnath; Haruhisa Takahashi

The support vector machine learning problem is a convex quadratic programming problem. For large learning tasks with many training examples, general-purpose quadratic programming solvers quickly become intractable in their memory and time requirements, so the decomposition method is essential for support vector machine learning. Working set selection is the most important issue in the decomposition method, since convergence depends on it. We propose a working set selection method that is applicable to large working sets. Experimental results on various problems show that the proposed method outperforms the existing methods.
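
As background, a common baseline for working set selection is the first-order "maximal violating pair" rule derived from the KKT conditions of the dual problem; the sketch below generalizes it to a working set of size q (q/2 indices from the "up" set and q/2 from the "down" set). The paper's improved rule itself is not reproduced; alpha, grad, y and C follow the usual dual-SVM notation.

```python
import numpy as np

def select_working_set(alpha, grad, y, C, q=2):
    """Pick q indices by the first-order maximal-violating-pair idea
    (a common baseline rule, not the specific method of the paper)."""
    up = np.where(((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0)))[0]
    down = np.where(((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0)))[0]
    score = -y * grad                              # KKT violation score
    top_up = up[np.argsort(-score[up])][: q // 2]  # largest scores in the up set
    top_down = down[np.argsort(score[down])][: q // 2]  # smallest in the down set
    return np.concatenate([top_up, top_down])
```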


international conference on informatics electronics and vision | 2016

An RST invariant image retrieval approach using color moments and wavelet packet entropy

S. M. Mohidul Islam; Rameswar Debnath

Content-based image retrieval is a way of indexing or finding images in a database that are similar to a query image. This process uses the visual contents of images and provides more effective management for the automatic retrieval of images of interest than the traditional tag-based approach. In this paper, color and texture features are used as visual contents to retrieve similar images from the database. Color moments and wavelet packet entropy are applied to extract the color and texture features, respectively. The correlation between these components of the query image and of the images in the dataset is then computed, and the images in the database that are most correlated with the query image are retrieved. The experimental results show a better accuracy rate and lower computational cost than some existing methods for RST-invariant images.
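
A minimal sketch of the color part of such a pipeline: compute the first three color moments (mean, standard deviation, skewness) per RGB channel and rank database images by Pearson correlation with the query. The wavelet packet entropy texture features and the RST-invariance considerations are omitted, and the function names are illustrative only.

```python
import numpy as np

def color_moments(img):
    """First three color moments per channel of an (H, W, 3) RGB array."""
    feats = []
    for c in range(3):
        ch = img[..., c].astype(float).ravel()
        m, s = ch.mean(), ch.std()
        skewness = np.mean(((ch - m) / s) ** 3) if s > 0 else 0.0
        feats += [m, s, skewness]
    return np.array(feats)

def retrieve(query_feat, db_feats, top_k=5):
    """Indices of the database images most correlated with the query features."""
    corrs = [np.corrcoef(query_feat, f)[0, 1] for f in db_feats]
    return np.argsort(corrs)[::-1][:top_k]
```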

Collaboration


Dive into Rameswar Debnath's collaborations.

Top Co-Authors

Haruhisa Takahashi

University of Electro-Communications

Masakazu Muramatsu

University of Electro-Communications

N. Takahide

University of Electro-Communications
