Publication


Featured research published by Norikazu Takahashi.


IEEE Transactions on Circuits and Systems I-regular Papers | 2000

A new sufficient condition for complete stability of cellular neural networks with delay

Norikazu Takahashi

This paper gives a new sufficient condition for cellular neural networks with delay (DCNNs) to be completely stable. Whereas most conventional stability criteria were obtained by constructing Lyapunov functionals, a fixed-point theorem and a convergence theorem for the Gauss-Seidel method play the key roles in this proof.
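
For reference, the delayed CNN model studied in this line of work is usually written as follows (standard notation; the paper's conventions may differ slightly):

$$\dot{x}_i(t) = -x_i(t) + \sum_{j \in N(i)} a_{ij}\, y_j(t) + \sum_{j \in N(i)} a^{\tau}_{ij}\, y_j(t - \tau) + \sum_{j \in N(i)} b_{ij}\, u_j + I,$$
$$y_j(t) = f(x_j(t)) = \tfrac{1}{2}\left( |x_j(t) + 1| - |x_j(t) - 1| \right),$$

where N(i) is the neighborhood of cell i and τ > 0 is the delay. Complete stability means that every trajectory converges to some equilibrium point.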


IEEE Transactions on Circuits and Systems I-regular Papers | 1998

On the complete stability of nonsymmetric cellular neural networks

Norikazu Takahashi; Leon O. Chua

This paper gives a new sufficient condition for complete stability of a nonsymmetric cellular neural network (CNN). The convergence theorem of the Gauss-Seidel method, an iterative technique for solving systems of linear algebraic equations, plays an important role in our proof. It is also shown that the existence of a stable equilibrium point does not imply complete stability of a nonsymmetric CNN.
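
To illustrate the proof tool named above: the Gauss-Seidel method solves a linear system Ax = b by sweeping through the components and immediately reusing the freshest values. A minimal sketch in Python (not taken from the paper; the convergence hypothesis in the comment is one standard sufficient condition):

import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=1000):
    # Converges, for example, when A is strictly diagonally dominant;
    # convergence conditions of this kind are what the stability proof exploits.
    x = np.zeros(len(b))
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(b)):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x

# Example: a strictly diagonally dominant system.
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
print(gauss_seidel(A, b))  # agrees with np.linalg.solve(A, b)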


IEEE Transactions on Neural Networks | 2005

Rigorous proof of termination of SMO algorithm for support vector machines

Norikazu Takahashi; Tetsuo Nishi

The sequential minimal optimization (SMO) algorithm is one of the simplest decomposition methods for learning support vector machines (SVMs). Keerthi and Gilbert recently studied the convergence properties of the SMO algorithm and gave a proof that it always stops within a finite number of iterations. In this letter, we point out the incompleteness of their proof and give a more rigorous one.
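
For context, the termination analysis revolves around the working-set selection rule. A common rule in the spirit of Keerthi and Gilbert's analysis is to pick the maximal violating pair; a schematic Python sketch (function name and tolerance are illustrative, not the paper's code):

import numpy as np

def select_violating_pair(grad, y, alpha, C, tau=1e-3):
    # For the SVM dual min (1/2) a'Qa - e'a with y'a = 0, 0 <= a <= C:
    # "up" collects indices whose variable can still move up,
    # "low" those that can still move down.
    F = -y * grad
    up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))
    low = ((y < 0) & (alpha < C)) | ((y > 0) & (alpha > 0))
    i = int(np.argmax(np.where(up, F, -np.inf)))
    j = int(np.argmin(np.where(low, F, np.inf)))
    if F[i] - F[j] <= tau:  # KKT conditions hold within tolerance tau
        return None         # the algorithm terminates here
    return i, j

Termination proofs such as the one discussed above show that this loop of "select a violating pair, optimize over it" cannot go on forever.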


IEEE Transactions on Neural Networks | 2008

Global Convergence of SMO Algorithm for Support Vector Regression

Norikazu Takahashi; Jun Guo; Tetsuo Nishi

Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given l training samples, SVR is formulated as a convex quadratic programming (QP) problem with l pairs of variables. We prove that if two pairs of variables violating the optimality condition are chosen for update in each step and the subproblems are solved in a certain way, then the SMO algorithm always stops within a finite number of iterations after finding an optimal solution. Efficient implementation techniques for the SMO algorithm are also presented, and the resulting implementation is compared experimentally with other SMO algorithms.
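
For reference, the convex QP mentioned above is the standard ε-SVR dual with l pairs of variables (α_i, α_i*) (the paper's notation may differ):

$$\min_{\alpha, \alpha^*} \;\; \frac{1}{2} \sum_{i,j} (\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*) K(x_i, x_j) + \varepsilon \sum_i (\alpha_i + \alpha_i^*) - \sum_i y_i (\alpha_i - \alpha_i^*)$$

subject to $\sum_i (\alpha_i - \alpha_i^*) = 0$ and $0 \le \alpha_i, \alpha_i^* \le C$.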


IEEE Transactions on Circuits and Systems I-regular Papers | 1997

A new sufficient condition for nonsymmetric CNNs to have a stable equilibrium point

Norikazu Takahashi; Leon O. Chua

This letter gives a new sufficient condition for nonsymmetric CNNs to have at least one stable equilibrium point. Existence of a stable equilibrium point is important for nonsymmetric CNNs because it is a necessary condition for complete stability. It is shown that our sufficient condition is a generalization of a previous result concerning the existence of a stable equilibrium point, and that it can easily be applied to space-invariant CNNs with a 3×3 neighborhood.
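
To make "space-invariant with a 3×3 neighborhood" concrete: every cell applies the same 3×3 feedback template A to the outputs of its neighbors, so the feedback term for all cells is a 2-D correlation of the output array with A. A minimal sketch (the template values are arbitrary examples, not from the paper):

import numpy as np
from scipy.signal import correlate2d

# An arbitrary example of a 3x3 space-invariant feedback template.
A = np.array([[0.0, 0.5, 0.0],
              [0.5, 2.0, 0.5],
              [0.0, 0.5, 0.0]])

def feedback_term(Y, A):
    # Computes sum_{j in N(i)} a_{ij} y_j for every cell i at once;
    # space invariance = the same template is applied around each cell.
    return correlate2d(Y, A, mode="same", boundary="fill", fillvalue=0.0)

Y = np.clip(np.random.randn(5, 5), -1.0, 1.0)  # CNN outputs lie in [-1, 1]
print(feedback_term(Y, A).shape)  # (5, 5)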


IEEE Transactions on Neural Networks | 2006

Global Convergence of Decomposition Learning Methods for Support Vector Machines

Norikazu Takahashi; Tetsuo Nishi

Decomposition methods are well-known techniques for solving the quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables are selected and a QP problem restricted to those variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we give a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations under a very mild condition on how the variables are selected.
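
A common way to state such a relaxed optimality condition (this is the standard τ-relaxed KKT condition used in decomposition analyses; the paper's exact form may differ): for the dual QP with constraints $y^T \alpha = 0$ and $0 \le \alpha_i \le C$, stop when

$$\max_{i \in I_{\mathrm{up}}(\alpha)} \; -y_i \, \nabla f(\alpha)_i \;\; \le \;\; \min_{j \in I_{\mathrm{low}}(\alpha)} \; -y_j \, \nabla f(\alpha)_j \; + \; \tau$$

for a tolerance τ > 0; with τ = 0 this reduces to the exact KKT optimality condition.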


IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences | 2006

An Efficient Method for Simplifying Decision Functions of Support Vector Machines

Jun Guo; Norikazu Takahashi; Tetsuo Nishi

A novel method for simplifying the decision functions of support vector machines (SVMs) is proposed in this paper. In our method, a decision function is first determined in the usual way using all training samples. Next, those support vectors which contribute less to the decision function are excluded from the training samples. Finally, a new decision function is obtained from the remaining samples. Experimental results show that the proposed method can effectively simplify the decision functions of SVMs without reducing the generalization capability.
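
A minimal sketch of the three steps using scikit-learn (ranking support vectors by |alpha_i| and the 25% cutoff are illustrative choices, not necessarily the paper's criterion):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, random_state=0)

# Step 1: train as usual on all samples.
svm = SVC(kernel="rbf", C=1.0).fit(X, y)

# Step 2: rank support vectors by |alpha_i| and mark the weakest for removal.
alpha = np.abs(svm.dual_coef_[0])
keep_sv = svm.support_[alpha > np.percentile(alpha, 25)]

# Step 3: retrain on the remaining samples (non-SVs plus the kept SVs).
mask = np.ones(len(X), dtype=bool)
mask[svm.support_] = False
mask[keep_sv] = True
svm2 = SVC(kernel="rbf", C=1.0).fit(X[mask], y[mask])

print(len(svm.support_), "->", len(svm2.support_), "support vectors")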


Computational Optimization and Applications | 2014

Global convergence of modified multiplicative updates for nonnegative matrix factorization

Norikazu Takahashi; Ryota Hibi

Nonnegative matrix factorization (NMF) is the problem of approximating a given nonnegative matrix by the product of two nonnegative matrices. The multiplicative updates proposed by Lee and Seung are widely used as efficient computational methods for NMF. However, the global convergence of these updates is not formally guaranteed because they are not defined for all pairs of nonnegative matrices. In this paper, we consider slightly modified versions of the original multiplicative updates and study their global convergence properties. The only difference between the modified updates and the original ones is that the former do not allow variables to take values less than a user-specified positive constant. Using Zangwill’s global convergence theorem, we prove that any sequence of solutions generated by either of those modified updates has at least one convergent subsequence and the limit of any convergent subsequence is a stationary point of the corresponding optimization problem. Furthermore, we propose algorithms based on the modified updates that always stop within a finite number of iterations.
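
A minimal sketch of the modified updates for the Frobenius-norm objective, where the clamp max(·, eps) is exactly the modification described above, keeping every entry at least a user-specified positive constant:

import numpy as np

def nmf_modified_mu(V, r, eps=1e-12, n_iter=500, seed=0):
    # Lee-Seung multiplicative updates for V ~ W H (Frobenius norm),
    # modified so that no entry of W or H ever drops below eps.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.uniform(eps, 1.0, (m, r))
    H = rng.uniform(eps, 1.0, (r, n))
    for _ in range(n_iter):
        H = np.maximum(H * (W.T @ V) / (W.T @ W @ H), eps)
        W = np.maximum(W * (V @ H.T) / (W @ (H @ H.T)), eps)
    return W, H

V = np.abs(np.random.default_rng(1).normal(size=(20, 15)))
W, H = nmf_modified_mu(V, r=4)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))  # relative error

Since every entry stays at least eps, the denominators are strictly positive and the updates are defined for every nonnegative input, which is the point of the modification.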


International Conference on Advanced Computer Theory and Engineering | 2008

An Efficient Algorithm for Multi-class Support Vector Machines

Jun Guo; Norikazu Takahashi; Wenxin Hu

A novel algorithm for multi-class support vector machines (SVMs) is proposed in this paper. The tree constructed by our algorithm consists of a series of two-class SVMs. Considering both separability and balance, in each iteration the multi-class patterns are divided into two sets according to the distances between pairwise classes and the number of patterns in each class. As a result, the algorithm handles unevenly distributed problems well. The efficiency of the proposed method is verified by experimental results.
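
A simplified Python stand-in for the partitioning step (the exact separability-and-balance criterion below is illustrative, not the paper's):

import numpy as np

def split_classes(X, y, classes):
    # Split classes into two groups: seed each group with the most
    # distant pair of class centroids (separability), then assign the
    # remaining classes while keeping sample counts similar (balance).
    cent = {c: X[y == c].mean(axis=0) for c in classes}
    count = {c: int((y == c).sum()) for c in classes}
    a, b = max(((p, q) for p in classes for q in classes if p != q),
               key=lambda pq: np.linalg.norm(cent[pq[0]] - cent[pq[1]]))
    left, right = [a], [b]
    n_left, n_right = count[a], count[b]
    for c in sorted(set(classes) - {a, b}, key=count.get, reverse=True):
        nearer_left = (np.linalg.norm(cent[c] - cent[a])
                       < np.linalg.norm(cent[c] - cent[b]))
        # Prefer the nearer seed, but fall back to the smaller group.
        if nearer_left and n_left <= n_right:
            left.append(c); n_left += count[c]
        elif (not nearer_left) and n_right <= n_left:
            right.append(c); n_right += count[c]
        elif n_left <= n_right:
            left.append(c); n_left += count[c]
        else:
            right.append(c); n_right += count[c]
    return left, right

A full tree is then built by recursing on each group and training a two-class SVM at every internal node.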


European Conference on Circuit Theory and Design | 2005

A learning algorithm for improving the classification speed of support vector machines

Jun Guo; Norikazu Takahashi; Tetsuo Nishi

A novel method for training support vector machines (SVMs) is proposed to speed them up in the test phase. It has three main steps. First, an SVM is trained on all the training samples, producing a number of support vectors. Second, the support vectors that contribute less to the shape of the decision surface are excluded from the training set. Finally, the SVM is retrained on the remaining samples only. Compared with the initially trained SVM, the test-phase efficiency of the final SVM is greatly improved without degrading the classification performance.
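
The test-phase cost that this method targets is linear in the number of support vectors, since a trained SVM classifies a point x via the kernel expansion

$$f(x) = \operatorname{sgn}\Big( \sum_{i \in \mathrm{SV}} \alpha_i y_i K(x_i, x) + b \Big),$$

so every support vector removed saves one kernel evaluation per classified sample.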

Collaboration


Dive into Norikazu Takahashi's collaborations.

Top Co-Authors

Jun Guo

East China Normal University

Hajime Hara

Hiroshima Institute of Technology
