Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Kim Batselier is active.

Publication


Featured research published by Kim Batselier.


IFAC Proceedings Volumes | 2012

Back to the Roots: Polynomial System Solving, Linear Algebra, Systems Theory

Philippe Dreesen; Kim Batselier; Bart De Moor

Multivariate polynomial system solving and polynomial optimization problems arise as central problems in many systems theory, identification and control settings. Traditionally, methods for solving polynomial equations have been developed in the area of algebraic geometry. Although a large body of literature is available, it is known as one of the most inaccessible fields of mathematics. In this paper we present a method for solving systems of polynomial equations employing only numerical linear algebra and systems theory tools, such as realization theory, SVD/QR, and eigenvalue computations. The task at hand is translated into the realm of linear algebra by separating coefficients and monomials into a coefficient matrix multiplied with a basis of monomials. Applying realization theory to the structure in the monomial basis allows all solutions of the system to be found from eigenvalue computations. Solving a polynomial optimization problem is shown to be equivalent to an extremal eigenvalue problem. Relevant applications are found in identification and control, such as the global optimization of structured total least squares problems.
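
The univariate special case of this "roots from eigenvalues" viewpoint is easy to demonstrate: the roots of a monic polynomial are the eigenvalues of its companion matrix, i.e. of the multiplication-by-x map written in the monomial basis. The NumPy sketch below shows only that special case and is not the authors' multivariate (Macaulay-matrix based) method.

import numpy as np

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3), written monic as
# x^3 + c2*x^2 + c1*x + c0 with coefficient vector c = [c0, c1, c2].
c = np.array([-6.0, 11.0, -6.0])
n = len(c)

# Companion matrix = the "multiply by x" map on R[x]/(p) in the monomial
# basis {1, x, x^2}; its eigenvalues are exactly the roots of p.
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)
C[:, -1] = -c

print(np.sort(np.linalg.eigvals(C).real))   # ~ [1. 2. 3.]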


SIAM Journal on Matrix Analysis and Applications | 2013

The Geometry of Multivariate Polynomial Division and Elimination

Kim Batselier; Philippe Dreesen; Bart De Moor

Multivariate polynomials are usually discussed in the framework of algebraic geometry. Solving problems in algebraic geometry usually involves the use of a Gröbner basis. This article shows that linear algebra without any Gröbner basis computation suffices to solve basic problems from algebraic geometry by describing three operations: multiplication, division, and elimination. This linear algebra framework will also allow us to give a geometric interpretation. Multivariate division will involve oblique projections, and a link between elimination and principal angles between subspaces (CS decomposition) is revealed. The main computational tool in this approach is the QR decomposition.
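
As a toy illustration of doing polynomial division with nothing but numerical linear algebra (a hypothetical univariate example, not taken from the paper): writing p = q*d + r over a monomial basis turns division into one structured linear system, and the map sending p to r is an oblique projection onto the low-degree monomials along the span of the shifted copies of d.

import numpy as np

# Divide p = x^3 + 2x^2 + 3x + 4 by d = x^2 + 1 with linear algebra only:
# write p = q*d + r with deg q <= 1, deg r <= 1, and solve one structured
# linear system over the monomial basis {1, x, x^2, x^3}.
p = np.array([4.0, 3.0, 2.0, 1.0])            # coefficients of 1, x, x^2, x^3
d = np.array([1.0, 0.0, 1.0])                 # d = 1 + x^2

A = np.zeros((4, 4))
A[0:3, 0] = d                                 # column for d
A[1:4, 1] = d                                 # column for x*d
A[0, 2] = 1.0                                 # remainder monomial 1
A[1, 3] = 1.0                                 # remainder monomial x

sol, *_ = np.linalg.lstsq(A, p, rcond=None)   # QR or least squares both work here
q, r = sol[:2], sol[2:]
print("quotient :", q)                        # [2. 1.]  ->  q = x + 2
print("remainder:", r)                        # [2. 2.]  ->  r = 2x + 2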


Automatica | 2017

Tensor Network alternating linear scheme for MIMO Volterra system identification

Kim Batselier; Zhongming Chen; Ngai Wong

This article introduces two Tensor Network-based iterative algorithms for the identification of high-order discrete-time nonlinear multiple-input multiple-output (MIMO) Volterra systems. The system identification problem is rewritten in terms of a Volterra tensor, which is never explicitly constructed, thus avoiding the curse of dimensionality. It is shown how each iteration of the two identification algorithms involves solving a linear system of low computational complexity. The proposed algorithms are guaranteed to monotonically converge and numerical stability is ensured through the use of orthogonal matrix factorizations. The performance and accuracy of the two identification algorithms are illustrated by numerical experiments, where accurate degree-10 MIMO Volterra models are identified in about 1 second in MATLAB on a standard desktop PC.
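
For orientation, the sketch below identifies a toy degree-2, memory-2 SISO Volterra model by explicitly building the regressor and solving ordinary least squares; the system and its coefficients are hypothetical. The paper's point is precisely to avoid forming this regressor, whose size grows exponentially with the degree, by working with a tensor-train representation of the Volterra tensor instead.

import numpy as np

rng = np.random.default_rng(0)
N = 200
u = rng.standard_normal(N)
u_prev = np.concatenate(([0.0], u[:-1]))      # u[k-1] with zero initial state

# Hypothetical "true" system: y[k] = 0.8*u[k] - 0.5*u[k-1] + 0.3*u[k]*u[k-1]
y = 0.8 * u - 0.5 * u_prev + 0.3 * u * u_prev

# Explicit degree-2 Volterra regressor: all monomials of degree <= 2 in
# (u[k], u[k-1]). For degree d and memory M this grows combinatorially.
phi = np.column_stack([np.ones(N), u, u_prev, u**2, u * u_prev, u_prev**2])
theta, *_ = np.linalg.lstsq(phi, y, rcond=None)
print(np.round(theta, 3))                     # [ 0.   0.8 -0.5  0.   0.3  0. ]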


SIAM Journal on Matrix Analysis and Applications | 2015

A Constructive Algorithm for Decomposing a Tensor into a Finite Sum of Orthonormal Rank-1 Terms

Kim Batselier; Haotian Liu; Ngai Wong

We propose a constructive algorithm that decomposes an arbitrary real tensor into a finite sum of orthonormal rank-1 outer products. The algorithm, called TTr1SVD, works by converting the tensor into a tensor-train rank-1 (TTr1) series via the singular value decomposition (SVD). TTr1SVD naturally generalizes the SVD to the tensor regime with properties such as uniqueness for a fixed order of indices, orthogonal rank-1 outer product terms, and easy truncation error quantification. Using an outer product column table it also allows, for the first time, a complete characterization of all tensors orthogonal with the original tensor. Incidentally, this leads to a strikingly simple constructive proof showing that the maximum rank of a real $2 \times 2 \times 2$ tensor over the real field is 3. We also derive a conversion of the TTr1 decomposition into a Tucker decomposition with a sparse core tensor. Numerical examples illustrate each of the favorable properties of the TTr1 decomposition.
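
The core TTr1 recursion is short enough to sketch for a third-order tensor: an SVD of the mode-1 unfolding, followed by an SVD of each reshaped right singular vector, yields a finite sum of orthonormal rank-1 terms that reconstructs the tensor exactly. This is only an illustrative sketch of the idea; the paper's TTr1SVD additionally covers arbitrary orders, the outer-product column table, and truncation.

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3, 4))

# Step 1: SVD of the mode-1 unfolding, a 2 x 12 matrix.
U, s, Vt = np.linalg.svd(A.reshape(2, 12), full_matrices=False)

# Step 2: reshape each right singular vector into a 3 x 4 matrix and split it
# again with an SVD, giving rank-1 terms sigma_i * tau_ij * (u_i o w_ij o z_ij).
A_rec = np.zeros_like(A)
terms = 0
for i in range(len(s)):
    W, t, Zt = np.linalg.svd(Vt[i].reshape(3, 4), full_matrices=False)
    for j in range(len(t)):
        A_rec += s[i] * t[j] * np.einsum('a,b,c->abc', U[:, i], W[:, j], Zt[j])
        terms += 1

print(terms, np.allclose(A, A_rec))           # 6 True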


IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems | 2017

Tensor Computation: A New Framework for High-Dimensional Problems in EDA

Zheng Zhang; Kim Batselier; Haotian Liu; Luca Daniel; Ngai Wong

Many critical electronic design automation (EDA) problems suffer from the curse of dimensionality, i.e., the very fast-scaling computational burden produced by a large number of parameters and/or unknown variables. This phenomenon may be caused by multiple spatial or temporal factors (e.g., 3-D field solver discretizations and multirate circuit simulation), nonlinearity of devices and circuits, a large number of design or optimization parameters (e.g., full-chip routing/placement and circuit sizing), or extensive process variations (e.g., variability/reliability analysis and design for manufacturability). The computational challenges generated by such high-dimensional problems are generally hard to handle efficiently with traditional EDA core algorithms that are based on matrix and vector computation. This paper presents "tensor computation" as an alternative general framework for the development of efficient EDA algorithms and tools. A tensor is a high-dimensional generalization of a matrix and a vector, and is a natural choice for both storing and solving efficiently high-dimensional EDA problems. This paper gives a basic tutorial on tensors, demonstrates some recent examples of EDA applications (e.g., nonlinear circuit modeling and high-dimensional uncertainty quantification), and suggests further open EDA problems where the use of tensor computation could be of advantage.
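
A back-of-the-envelope count makes the curse-of-dimensionality argument concrete. The numbers below are illustrative, not from the paper: a hypothetical 20-dimensional problem with 10 grid points per axis, compared with a tensor-train representation of uniform rank 3.

# Full d-way array versus a tensor-train representation of the same object
# (entry counts; hypothetical sizes: 20 dimensions, 10 points per axis, TT rank 3).
n, d, r = 10, 20, 3
full_entries = n ** d                          # 10**20 entries: hopeless to store
tt_entries = 2 * n * r + (d - 2) * n * r * r   # boundary cores + interior cores
print(full_entries, tt_entries)                # 100000000000000000000 1680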


Journal of Computational and Applied Mathematics | 2016

Symmetric tensor decomposition by an iterative eigendecomposition algorithm

Kim Batselier; Ngai Wong

We present an iterative algorithm, called the symmetric tensor eigen-rank-one iterative decomposition (STEROID), for decomposing a symmetric tensor into a real linear combination of symmetric rank-1 unit-norm outer factors using only eigendecompositions and least-squares fitting. Originally designed for a symmetric tensor with an order being a power of two, STEROID is shown to be applicable to any order through an innovative tensor embedding technique. Numerical examples demonstrate the high efficiency and accuracy of the proposed scheme even for large scale problems. Furthermore, we show how STEROID readily solves a problem in nonlinear block-structured system identification and nonlinear state-space identification.
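
The order-2 special case of this statement is just the spectral theorem, which the short sketch below verifies numerically. It is an illustrative analogue, not the STEROID algorithm itself: one eigendecomposition writes a symmetric matrix as an exact real combination of symmetric rank-1 unit-norm terms, which STEROID extends to higher-order symmetric tensors via embedding and repeated eigendecompositions.

import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2                              # a symmetric "order-2 tensor"

# One eigendecomposition gives an exact real linear combination of symmetric
# rank-1 unit-norm terms v_i v_i^T.
lam, V = np.linalg.eigh(S)
S_rec = sum(l * np.outer(v, v) for l, v in zip(lam, V.T))
print(np.allclose(S, S_rec))                   # True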


IFAC Proceedings Volumes | 2012

Prediction Error Method Identification is an Eigenvalue Problem

Kim Batselier; Philippe Dreesen; Bart De Moor

This article explores the link between prediction error methods, nonlinear polynomial systems and generalized eigenvalue problems. It is shown how the global minimum of the sum of squared prediction errors can be found from solving a nonlinear polynomial system. An algorithm is provided that effectively counts the number of affine solutions of the nonlinear polynomial system and determines these solutions by solving a generalized eigenvalue problem. The proposed method is illustrated by means of an example.
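
A one-dimensional analogue shows the flavour of "global optimization via eigenvalue computations". The cost function below is hypothetical and this is not the paper's PEM formulation: all stationary points of a polynomial cost are roots of its derivative, np.roots finds them through a companion-matrix eigenvalue computation, and the global minimum is read off by evaluation.

import numpy as np

# Hypothetical scalar cost J(t) = t^4 - 3t^3 + 2t + 1; its stationary points
# are the roots of J'(t), found here as companion-matrix eigenvalues.
J = np.poly1d([1.0, -3.0, 0.0, 2.0, 1.0])
stationary = np.roots(J.deriv().coeffs)
real_pts = stationary[np.isclose(stationary.imag, 0)].real
t_star = real_pts[np.argmin(J(real_pts))]
print(t_star, J(t_star))                       # global minimizer and minimum value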


Numerical Linear Algebra With Applications | 2017

A constructive arbitrary-degree Kronecker product decomposition of tensors

Kim Batselier; Ngai Wong

We propose the tensor Kronecker product singular value decomposition (TKPSVD) that decomposes a real
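
The two-factor matrix case of this kind of decomposition is the classical nearest Kronecker product problem, which the sketch below solves with an SVD of a rearranged matrix (a Van Loan-Pitsianis style illustration, not the TKPSVD algorithm itself, which handles tensors and an arbitrary number of factors).

import numpy as np

rng = np.random.default_rng(3)
A_true = rng.standard_normal((2, 2))
B_true = rng.standard_normal((2, 2))
M = np.kron(A_true, B_true)                    # 4 x 4 matrix with Kronecker structure

# Rearranging M turns the Kronecker structure into a rank-1 matrix, so a single
# SVD recovers both factors (each up to a scalar exchanged between them).
R = M.reshape(2, 2, 2, 2).transpose(0, 2, 1, 3).reshape(4, 4)
U, s, Vt = np.linalg.svd(R)
A_hat = np.sqrt(s[0]) * U[:, 0].reshape(2, 2)
B_hat = np.sqrt(s[0]) * Vt[0].reshape(2, 2)
print(np.allclose(np.kron(A_hat, B_hat), M))   # True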


SIAM Journal on Matrix Analysis and Applications | 2014

The Canonical Decomposition of $\mathcal{C}^n_d$

Kim Batselier; Philippe Dreesen; Bart De Moor


International Conference on Computer-Aided Design | 2016

Kim Batselier; Zhongming Chen; Haotian Liu; Ngai Wong

Collaboration


Dive into Kim Batselier's collaborations.

Top Co-Authors

Ngai Wong
University of Hong Kong

Bart De Moor
Katholieke Universiteit Leuven

Philippe Dreesen
Vrije Universiteit Brussel

Haotian Liu
University of Hong Kong

Zhongming Chen
Hangzhou Dianzi University

Luca Daniel
Massachusetts Institute of Technology

Cong Chen
University of Hong Kong

Johan A. K. Suykens
Katholieke Universiteit Leuven

Ching-Yun Ko
University of Hong Kong

Jian Deng
University of Hong Kong