Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Siamak Mehrkanoon is active.

Publication


Featured research published by Siamak Mehrkanoon.


IEEE Transactions on Neural Networks | 2015

Multiclass Semisupervised Learning Based Upon Kernel Spectral Clustering

Siamak Mehrkanoon; Carlos Alzate; Raghvendra Mall; Rocco Langone; Johan A. K. Suykens

This paper proposes a multiclass semisupervised learning algorithm using kernel spectral clustering (KSC) as a core model. A regularized KSC is formulated to estimate the class memberships of data points in a semisupervised setting using the one-versus-all strategy, with both labeled and unlabeled data points present in the learning process. The labels are propagated to a large number of unlabeled data points by adding regularization terms to the cost function of the KSC formulation; imposing these regularization terms enforces certain desired memberships. The model is then obtained by solving a linear system in the dual. Furthermore, the optimal embedding dimension is designed for semisupervised clustering, which plays a key role when one deals with a large number of clusters.
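As a rough illustration of the label-propagation idea, the sketch below uses a generic harmonic-function propagation over an RBF affinity graph, not the paper's MSS-KSC formulation; the data, kernel width, and labeled indices are invented for the example. The memberships of all unlabeled points come out of a single linear solve, mirroring the "solving a linear system" step in the abstract.

```python
import numpy as np

# Three Gaussian blobs; one labeled point per class (toy setup).
rng = np.random.default_rng(0)
centers = np.array([[0, 0], [4, 0], [2, 4]])
X = np.vstack([c + rng.normal(scale=0.4, size=(30, 2)) for c in centers])
y_true = np.repeat([0, 1, 2], 30)

lab = np.array([0, 30, 60])                 # indices of the labeled points
unlab = np.setdiff1d(np.arange(90), lab)

d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * 0.5 ** 2))            # RBF affinity matrix
L = np.diag(W.sum(1)) - W                   # graph Laplacian

# Harmonic propagation: memberships of unlabeled points from one linear system.
Y = np.eye(3)[y_true[lab]]                  # one-hot labels of labeled points
F_u = np.linalg.solve(L[np.ix_(unlab, unlab)], W[np.ix_(unlab, lab)] @ Y)
acc = np.mean(F_u.argmax(1) == y_true[unlab])
print(acc)
```

With well-separated clusters, three labeled points suffice to recover almost all memberships, which is the regime the abstract describes.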


IEEE Transactions on Neural Networks | 2012

Approximate Solutions to Ordinary Differential Equations Using Least Squares Support Vector Machines

Siamak Mehrkanoon; Tillmann Falck; Johan A. K. Suykens

In this paper, a new approach based on least squares support vector machines (LS-SVMs) is proposed for solving linear and nonlinear ordinary differential equations (ODEs). The approximate solution is presented in closed form by means of LS-SVMs, whose parameters are adjusted to minimize an appropriate error function. For the linear and nonlinear cases, these parameters are obtained by solving a system of linear and nonlinear equations, respectively. The method is well suited to solving mildly stiff, nonstiff, and singular ODEs with initial and boundary conditions. Numerical results demonstrate the efficiency of the proposed method over existing methods.
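The core idea, expressing the solution with a parametric model and fitting the ODE residual plus the initial condition by least squares, can be sketched with a plain polynomial feature map standing in for the paper's LS-SVM machinery; the ODE y' = -y with y(0) = 1 is a toy choice, and the constraint weight is arbitrary.

```python
import numpy as np

deg = 10
t = np.linspace(0.0, 2.0, 50)               # collocation points

phi = np.vander(t, deg, increasing=True)    # feature map [1, t, t^2, ...]
dphi = np.hstack([np.zeros((t.size, 1)),    # its derivative, term by term
                  phi[:, :-1] * np.arange(1, deg)])

# Stack the ODE residual (y' + y = 0) with the initial condition y(0) = 1,
# heavily weighted so the constraint is enforced tightly.
A = np.vstack([dphi + phi,
               1e3 * np.vander(np.array([0.0]), deg, increasing=True)])
b = np.concatenate([np.zeros(t.size), [1e3]])
w, *_ = np.linalg.lstsq(A, b, rcond=None)

y = phi @ w                                  # closed-form approximate solution
print(np.max(np.abs(y - np.exp(-t))))        # small approximation error
```

The solve yields the model parameters in one shot, analogous to the closed-form solution the abstract emphasizes for the linear case.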


Automatica | 2012

LS-SVM approximate solution to linear time varying descriptor systems

Siamak Mehrkanoon; Johan A. K. Suykens

This paper discusses a numerical method based on Least Squares Support Vector Machines (LS-SVMs) for solving linear time-varying initial and boundary value problems in Differential Algebraic Equations (DAEs). The method generates a closed-form (model-based) approximate solution. The results of numerical experiments on different systems with index ranging from 0 to 3 are presented and compared with analytic solutions to confirm the validity and applicability of the proposed method.


Neurocomputing | 2014

Non-parallel support vector classifiers with different loss functions

Siamak Mehrkanoon; Xiaolin Huang; Johan A. K. Suykens

This paper introduces a general framework of non-parallel support vector machines, which involves a regularization term, a scatter loss, and a misclassification loss. When dealing with binary problems, the framework with proper losses covers several existing non-parallel classifiers, such as the multisurface proximal support vector machine via generalized eigenvalues, twin support vector machines, and its least squares version. The possibility of incorporating different existing scatter and misclassification loss functions into the general framework is discussed. Moreover, in contrast with the mentioned methods, which rely on kernel-generated surfaces, we directly apply the kernel trick in the dual and thereby obtain nonparametric models; one therefore does not need to formulate two different primal problems for the linear and nonlinear kernels. In addition, experimental results are given to illustrate the performance of different loss functions.
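A minimal sketch of one member of this family, in the spirit of the least squares twin SVM rather than the paper's general framework: each class gets its own hyperplane, fitted to lie close to that class while pushing the other class towards w·x + b = -1. The toy data (two offset line-shaped classes) and the trade-off c = 1 are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 100)
A = np.column_stack([t, t + 2]) + rng.normal(scale=0.05, size=(100, 2))   # class 0: near y = x + 2
s = rng.uniform(-1, 1, 100)
B = np.column_stack([s, -s - 2]) + rng.normal(scale=0.05, size=(100, 2))  # class 1: near y = -x - 2

def fit_plane(close, far, c=1.0):
    # Minimize ||H u||^2 + c * ||G u + e||^2 over u = (w, b):
    # plane passes near `close`, while `far` is pushed towards w.x + b = -1.
    H = np.column_stack([close, np.ones(len(close))])
    G = np.column_stack([far, np.ones(len(far))])
    u = -c * np.linalg.solve(H.T @ H + c * (G.T @ G), G.T @ np.ones(len(far)))
    return u[:-1], u[-1]

w1, b1 = fit_plane(A, B)
w2, b2 = fit_plane(B, A)

def predict(X):
    # Assign each point to the nearer of the two non-parallel planes.
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 0, 1)

acc = np.mean(np.r_[predict(A) == 0, predict(B) == 1])
print(acc)
```

Each plane comes from one small linear solve, which is the appeal of the least squares variants the abstract mentions.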


Neurocomputing | 2015

Learning solutions to partial differential equations using LS-SVM

Siamak Mehrkanoon; Johan A. K. Suykens

This paper proposes an approach based on Least Squares Support Vector Machines (LS-SVMs) for solving second order partial differential equations (PDEs) with variable coefficients. Contrary to most existing techniques, the proposed method provides a closed form approximate solution. The optimal representation of the solution is obtained in the primal-dual setting. The model is built by incorporating the initial/boundary conditions as constraints of an optimization problem. The developed method is well suited for problems involving singular, variable and constant coefficients as well as problems with irregular geometrical domains. Numerical results for linear and nonlinear PDEs demonstrate the efficiency of the proposed method over existing methods.
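The collocation-with-constraints idea can be sketched on a toy 1-D boundary-value problem, with a polynomial feature map in place of the paper's LS-SVM primal-dual derivation: u'' = -pi^2 sin(pi x), u(0) = u(1) = 0, whose exact solution is sin(pi x). The degree, grid, and constraint weight are arbitrary choices.

```python
import numpy as np

deg = 12
x = np.linspace(0.0, 1.0, 60)                 # collocation points

phi = np.vander(x, deg, increasing=True)      # [1, x, x^2, ...]
k = np.arange(deg)
d2phi = np.zeros_like(phi)
d2phi[:, 2:] = phi[:, :-2] * (k[2:] * (k[2:] - 1))   # second derivative basis

# Boundary conditions u(0) = u(1) = 0 enter as heavily weighted constraints.
bc = np.vander(np.array([0.0, 1.0]), deg, increasing=True)
A = np.vstack([d2phi, 1e3 * bc])
b = np.concatenate([-np.pi ** 2 * np.sin(np.pi * x), [0.0, 0.0]])
w, *_ = np.linalg.lstsq(A, b, rcond=None)

u = phi @ w                                   # closed-form approximate solution
print(np.max(np.abs(u - np.sin(np.pi * x))))
```

Incorporating the boundary conditions as constraints of the fitting problem is exactly the structural point the abstract makes.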


Neurocomputing | 2013

Support vector machines with piecewise linear feature mapping

Xiaolin Huang; Siamak Mehrkanoon; Johan A. K. Suykens

As the simplest extension of linear classifiers, piecewise linear (PWL) classifiers have attracted a lot of attention because of their simplicity and classification capability. In this paper, a PWL feature mapping is introduced by investigating the properties of the PWL classification boundary. Support vector machines (SVMs) with PWL feature mappings, called PWL-SVMs, are then proposed. It is shown that some widely used PWL classifiers, such as k-nearest-neighbor, adaptive boosting of linear classifiers, and the intersection kernel support vector machine, can be represented by the proposed feature mapping, which means that PWL-SVMs can achieve at least the performance of those PWL classifiers. Moreover, PWL-SVMs enjoy the good properties of SVMs, and their performance in numerical experiments illustrates their effectiveness. Finally, some extensions are discussed, and further applications of PWL-SVMs can be expected.
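A hinge-based PWL feature map can be illustrated in one dimension: features max(0, x - c_k) with hand-picked knots c_k turn a linear classifier into one with a piecewise linear decision function. The data, knots, and the least-squares fit (standing in for the paper's SVM training) are all assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = np.where(np.abs(x) > 0.5, 1.0, -1.0)   # not separable by a single linear boundary

knots = np.linspace(-1, 1, 9)              # hand-picked knot positions
features = np.maximum(0.0, x[:, None] - knots[None, :])   # hinge (PWL) feature map
Phi = np.column_stack([np.ones(x.size), features])

# Linear classifier on the mapped features => PWL boundary in the input space.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
acc = np.mean(np.sign(Phi @ w) == y)
print(acc)
```

A plain linear classifier in x cannot separate this data at all; the PWL mapping makes it nearly trivial, which is the capability the abstract argues for.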


International Symposium on Neural Networks | 2013

Non-parallel semi-supervised classification based on kernel spectral clustering

Siamak Mehrkanoon; Johan A. K. Suykens

In this paper, a non-parallel semi-supervised algorithm based on kernel spectral clustering is formulated. The prior knowledge about the labels is incorporated into the kernel spectral clustering formulation by adding regularization terms. In contrast with existing multi-plane classifiers such as the Multisurface Proximal Support Vector Machine (GEPSVM), Twin Support Vector Machines (TWSVM), and its least squares version (LSTSVM), we do not use a kernel-generated surface. Instead we apply the kernel trick in the dual. Therefore, as opposed to conventional non-parallel classifiers, one does not need to formulate two different primal problems for the linear and nonlinear cases separately. The proposed method generates two non-parallel hyperplanes, which are then used for out-of-sample extension. Experimental results demonstrate the efficiency of the proposed method over existing methods.


International Symposium on Neural Networks | 2014

Large scale semi-supervised learning using KSC based model

Siamak Mehrkanoon; Johan A. K. Suykens

Often in practice one deals with a large amount of unlabeled data, while the fraction of labeled data points is typically small. One therefore prefers a semi-supervised algorithm, which uses both labeled and unlabeled data points in the learning process, to achieve better performance. Considering the large amount of unlabeled data, making a semi-supervised algorithm scalable is an important task. In this paper we adopt a recently proposed multi-class semi-supervised KSC-based algorithm (MSS-KSC) and make it scalable by means of two different approaches. The first is based on the Nyström approximation method, which provides a finite-dimensional feature map that can then be used to solve the optimization problem in the primal. The second approach is based on the reduced kernel technique, which solves the problem in the dual by reducing the dimensionality of the kernel matrix to a rectangular kernel. Experimental results demonstrate the scalability and efficiency of the proposed approaches on real datasets.


Neural Networks | 2015

Incremental multi-class semi-supervised clustering regularized by Kalman filtering

Siamak Mehrkanoon; Oscar Mauricio Agudelo; Johan A. K. Suykens

This paper introduces an on-line semi-supervised learning algorithm formulated as a regularized kernel spectral clustering (KSC) approach. We consider the case where new data arrive sequentially but only a small fraction of them is labeled. The available labeled data act as prototypes and help to improve the performance of the algorithm in estimating the labels of the unlabeled data points. We adopt a recently proposed multi-class semi-supervised KSC-based algorithm (MSS-KSC) and make it applicable to on-line data clustering. Given a few user-labeled data points, the initial model is learned, and then the class memberships of the remaining data points in the current and subsequent time instants are estimated and propagated in an on-line fashion. The update of the memberships is carried out mainly using the out-of-sample extension property of the model. The algorithm is first tested on computer-generated data sets; then we show that video segmentation can be cast as a semi-supervised learning problem. Furthermore, we show how the tracking capabilities of the Kalman filter can be used to provide the labels of objects in motion and thus regularize the solution obtained by the MSS-KSC algorithm. In the experiments, we demonstrate the performance of the proposed method on synthetic data sets and real-life videos where the clusters evolve in a smooth fashion over time.
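The Kalman-filter ingredient can be illustrated on its own; below is a standard 1-D constant-velocity filter tracking a moving object from noisy position measurements (the coupling to MSS-KSC labels is not reproduced, and all noise levels here are invented).

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 1e-4 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance

rng = np.random.default_rng(1)
true_pos = 0.3 * np.arange(50)                      # object moving at 0.3/step
z = true_pos + rng.normal(scale=0.5, size=50)       # noisy measurements

x = np.zeros(2)
P = np.eye(2)
est = []
for zk in z:
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the new measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([zk]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

err = np.mean(np.abs(np.array(est)[10:] - true_pos[10:]))  # error after burn-in
print(err)
```

The filtered track is far smoother than the raw measurements; in the paper this smooth track supplies labels for the objects in motion.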


IFAC Proceedings Volumes | 2012

Parameter Estimation for Time Varying Dynamical Systems using Least Squares Support Vector Machines

Siamak Mehrkanoon; Tillmann Falck; Johan A. K. Suykens

This paper develops a new approach based on Least Squares Support Vector Machines (LS-SVMs) for parameter estimation of time-invariant as well as time-varying dynamical SISO systems. Closed-form approximate models for the state and its derivative are first derived from the observed data by means of LS-SVMs. The time-derivative information is then substituted into the system of ODEs, converting the parameter estimation problem into an algebraic optimization problem. In the case of time-invariant systems, one can use least squares to solve the obtained system of algebraic equations. Time-varying coefficients in SISO models are estimated by assuming an LS-SVM model for them.
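The two-step scheme (fit a smooth model to the observed state, then substitute its derivative into the ODE and solve for the parameter by least squares) can be sketched with a polynomial surrogate in place of the LS-SVM model; the system y' = -a*y with true a = 1.5 and the noise level are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 100)
y_obs = np.exp(-1.5 * t) + rng.normal(scale=0.002, size=t.size)  # noisy state data

# Step 1: smooth surrogate model for the state and its time derivative.
coef = np.polyfit(t, y_obs, 8)
y_hat = np.polyval(coef, t)
dy_hat = np.polyval(np.polyder(coef), t)

# Step 2: substitute into y' = -a*y and solve the algebraic problem for a.
a_est = -np.sum(dy_hat * y_hat) / np.sum(y_hat ** 2)
print(a_est)   # close to the true value 1.5
```

No numerical ODE integration is needed: once the derivative is available in closed form, the parameter drops out of a plain least-squares problem, which is the conversion the abstract describes.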

Collaboration


Dive into Siamak Mehrkanoon's collaborations.

Top Co-Authors

Johan A. K. Suykens
Katholieke Universiteit Leuven

Raghvendra Mall
Katholieke Universiteit Leuven

Xiaolin Huang
Shanghai Jiao Tong University

Rocco Langone
Katholieke Universiteit Leuven

Carlos Alzate
Katholieke Universiteit Leuven

Oscar Mauricio Agudelo
Katholieke Universiteit Leuven

Tillmann Falck
Katholieke Universiteit Leuven

Yuning Yang
Katholieke Universiteit Leuven

Steven X. Ding
University of Duisburg-Essen

Yuri A. W. Shardt
University of Duisburg-Essen