Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Stefan Schliebs is active.

Publication


Featured research published by Stefan Schliebs.


IEEE Transactions on Evolutionary Computation | 2009

Quantum-Inspired Evolutionary Algorithm: A Multimodel EDA

Michaël Defoin Platel; Stefan Schliebs; Nikola Kasabov

The quantum-inspired evolutionary algorithm (QEA) applies several quantum computing principles to solve optimization problems. In QEA, a population of probabilistic models of promising solutions is used to guide further exploration of the search space. This paper clearly establishes that QEA is an original algorithm that belongs to the class of estimation of distribution algorithms (EDAs), while the common points and specifics of QEA compared to other EDAs are highlighted. The behavior of a versatile QEA relative to three classical EDAs is studied extensively, and good comparative results are reported in terms of loss of diversity, scalability, solution quality, and robustness to fitness noise. To better understand QEA, two main advantages of the multimodel approach are analyzed in detail. First, it is shown that QEA can dynamically adapt the learning speed, leading to a smooth and robust convergence behavior. Second, we demonstrate that QEA manipulates more complex distributions of solutions than a single-model approach, leading to more efficient optimization of problems with interacting variables.
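The multimodel EDA idea described in the abstract, several probabilistic models sampled and shifted toward promising solutions, can be sketched as a toy Python example on the OneMax problem. Function names, the learning-rate `delta`, and all parameter values are illustrative assumptions, not taken from the paper:

```python
import random

def qea_onemax(n_bits=20, n_models=5, steps=200, delta=0.05):
    """Toy multimodel EDA in the spirit of QEA, applied to OneMax."""
    # Each model is a vector of probabilities of sampling a 1
    # (the squared amplitude of a Q-bit in QEA terms).
    models = [[0.5] * n_bits for _ in range(n_models)]
    best, best_fit = None, -1
    for _ in range(steps):
        for m in models:
            # Sample a candidate solution from the probabilistic model
            x = [1 if random.random() < p else 0 for p in m]
            fit = sum(x)  # OneMax fitness: count of ones
            if fit > best_fit:
                best, best_fit = x, fit
            # Shift the model's probabilities toward the best solution so far
            for i in range(n_bits):
                if best[i] == 1:
                    m[i] = min(1.0, m[i] + delta)
                else:
                    m[i] = max(0.0, m[i] - delta)
    return best, best_fit
```

Because the attractor here is always the global best, this sketch also exhibits the hitchhiking behavior that the vQEA variant (below in this list) was designed to counter.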


International Journal of Neural Systems | 2012

SPAN: Spike Pattern Association Neuron for Learning Spatio-Temporal Spike Patterns

Ammar Mohemmed; Stefan Schliebs; Satoshi Matsuda; Nikola Kasabov

Spiking Neural Networks (SNN) have been shown to be suitable tools for the processing of spatio-temporal information. However, due to their inherent complexity, the formulation of efficient supervised learning algorithms for SNN is difficult and remains an important open problem in the field. This article presents SPAN, a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion, allowing the processing of spatio-temporal information encoded in the precise timing of spikes. The idea of the proposed algorithm is to transform spike trains during the learning phase into analog signals so that common mathematical operations can be performed on them. Using this conversion, it is possible to apply the well-known Widrow-Hoff rule directly to the transformed spike trains in order to adjust the synaptic weights and to achieve a desired input/output spike behavior of the neuron. In the presented experimental analysis, the proposed learning algorithm is evaluated with regard to its learning capabilities, memory capacity, robustness to noisy stimuli, and classification performance. Differences and similarities between SPAN and two related algorithms, ReSuMe and Chronotron, are discussed.
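The core SPAN idea, convolving spike trains into analog signals and applying the Widrow-Hoff rule to them, can be sketched as follows. The kernel shape, time constant, and function names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def alpha_kernel(t, tau=5.0):
    # Alpha-shaped kernel that turns a discrete spike into a smooth analog trace
    return np.where(t >= 0, (t / tau) * np.exp(1 - t / tau), 0.0)

def spikes_to_signal(spike_times, t_grid, tau=5.0):
    # Superpose one kernel per spike to obtain a continuous signal
    return sum(alpha_kernel(t_grid - s, tau) for s in spike_times)

def span_delta_w(input_spikes, desired_spikes, actual_spikes, t_grid, lr=0.01):
    """Widrow-Hoff weight update on convolved spike trains (illustrative)."""
    x = spikes_to_signal(input_spikes, t_grid)
    yd = spikes_to_signal(desired_spikes, t_grid)
    ya = spikes_to_signal(actual_spikes, t_grid)
    dt = t_grid[1] - t_grid[0]
    # Delta rule: integrate input trace times the output error trace
    return lr * np.sum(x * (yd - ya)) * dt
```

If the actual output already matches the desired spike train, the error trace is zero and no weight change occurs; a missing desired spike potentiates the synapse, a spurious one depresses it.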


Neural Networks | 2009

2009 Special Issue: Integrated feature and parameter optimization for an evolving spiking neural network: Exploring heterogeneous probabilistic models

Stefan Schliebs; Michael Defoin-Platel; Susan P. Worner; Nikola Kasabov

This study introduces a quantum-inspired spiking neural network (QiSNN) as an integrated connectionist system, in which the features and parameters of an evolving spiking neural network are optimized together using a quantum-inspired evolutionary algorithm. We propose here a novel optimization method that uses different representations to explore the two search spaces: a binary representation for optimizing feature subsets and a continuous representation for evolving appropriate real-valued configurations of the spiking network. The properties and characteristics of the improved framework are studied on two different synthetic benchmark datasets. Results are compared to traditional methods, namely a multi-layer perceptron and a naïve Bayesian classifier (NBC). A previously used real-world ecological dataset on invasive species establishment prediction is revisited, and new results are obtained and analyzed by an ecological expert. The proposed method results in much faster convergence to an optimal solution (or one close to it), better accuracy, and a more informative set of selected features.
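The two coupled representations, a Bernoulli model over feature bits and a continuous model over real-valued network parameters, can be sampled and updated roughly as in this sketch; the model structure, update rule, and learning rate are assumptions for illustration only:

```python
import random

def sample_candidate(feature_probs, param_means, param_sigma=0.1):
    """Draw one candidate from the two coupled search-space models:
    a Bernoulli model over feature bits (binary representation) and a
    Gaussian model over network parameters (continuous representation)."""
    mask = [1 if random.random() < p else 0 for p in feature_probs]
    params = [random.gauss(m, param_sigma) for m in param_means]
    return mask, params

def update_models(feature_probs, param_means, best_mask, best_params, lr=0.1):
    # Move both models toward the best candidate of the current generation
    new_probs = [(1 - lr) * p + lr * b for p, b in zip(feature_probs, best_mask)]
    new_means = [(1 - lr) * m + lr * v for m, v in zip(param_means, best_params)]
    return new_probs, new_means
```

Evaluating a candidate would mean training the spiking network with the masked features and sampled parameters; that step is problem-specific and omitted here.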


Congress on Evolutionary Computation | 2007

A versatile quantum-inspired evolutionary algorithm

M.D. Platel; Stefan Schliebs; Nikola Kasabov

This study points out some weaknesses of existing quantum-inspired evolutionary algorithms (QEA) and explains in particular how hitchhiking phenomena can slow down the discovery of optimal solutions and encourage premature convergence. A new algorithm, called the versatile quantum-inspired evolutionary algorithm (vQEA), is proposed. With vQEA, the attractors moving the population through the search space are replaced at every generation without considering their fitness. The new algorithm is much more reactive: it always adapts the search toward the last promising solution found, leading to a smoother and more efficient exploration. In this paper, vQEA is tested and compared to a classical genetic algorithm (CGA) and to a QEA on several benchmark problems. Experiments have shown that vQEA performs better than both CGA and QEA in terms of speed and accuracy, and that it is highly scalable as well. Finally, the properties of vQEA are discussed and compared to estimation of distribution algorithms (EDA).
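The attractor-replacement policy that distinguishes vQEA, taking the best sample of the *current* generation as the new attractor regardless of its fitness relative to the old one, can be sketched on a toy OneMax setup; all names and constants are hypothetical:

```python
import random

def vqea_step(model, n_samples=10, delta=0.05):
    """One generation of a vQEA-style update (illustrative sketch).
    The attractor is the best sample of this generation only, replacing
    the previous attractor even if the new one has lower fitness."""
    samples = [[1 if random.random() < p else 0 for p in model]
               for _ in range(n_samples)]
    attractor = max(samples, key=sum)  # OneMax fitness for illustration
    # Shift each probability toward the current attractor by at most delta
    new_model = [min(1.0, p + delta) if a == 1 else max(0.0, p - delta)
                 for p, a in zip(model, attractor)]
    return new_model, attractor
```

Because no elitism is applied to the attractor, the model keeps tracking recent promising regions instead of locking onto an early best, which is the behavior the abstract credits for smoother exploration.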


International Journal of Neural Systems | 2010

On the Probabilistic Optimization of Spiking Neural Networks

Stefan Schliebs; Nikola Kasabov; Michael Defoin-Platel

The construction of a Spiking Neural Network (SNN), i.e. the choice of an appropriate topology and the configuration of its internal parameters, represents a great challenge for SNN-based applications. Evolutionary Algorithms (EAs) offer an elegant solution to these challenges, and methods capable of exploring both types of search spaces simultaneously appear to be the most promising ones. A variety of such heterogeneous optimization algorithms have emerged recently, in particular in the field of probabilistic optimization. In this paper, a literature review on heterogeneous optimization algorithms is presented and an example of probabilistic optimization of SNN is discussed in detail. The paper provides an experimental analysis of a novel Heterogeneous Multi-Model Estimation of Distribution Algorithm (hMM-EDA). First, practical guidelines for configuring the method are derived, and then the performance of hMM-EDA is compared to state-of-the-art optimization algorithms. Results show that hMM-EDA is a light-weight, fast, and reliable optimization method that requires the configuration of only a few parameters. Its performance on a synthetic heterogeneous benchmark problem is highly competitive and suggests its suitability for the optimization of SNN.


International Conference on Neural Information Processing | 2008

Integrated feature and parameter optimization for an evolving spiking neural network

Stefan Schliebs; Michael Defoin-Platel; Nikola Kasabov

This study extends the recently proposed Evolving Spiking Neural Network (ESNN) architecture by combining it with an optimization algorithm, namely the Versatile Quantum-inspired Evolutionary Algorithm (vQEA). Following the wrapper approach, the method is used to identify relevant feature subsets and simultaneously evolve an optimal ESNN parameter setting. Applied to carefully designed benchmark data containing irrelevant and redundant features of varying information quality, the ESNN-based feature selection procedure led to excellent classification results and an accurate detection of relevant information in the dataset. Redundant and irrelevant features were rejected successively, in order of the amount of information they contained.
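The wrapper approach, in which each candidate feature subset is scored by running the learner itself on that subset, can be illustrated with a toy exhaustive search. The real method explores subsets with vQEA rather than enumerating them, and `evaluate` here is a stand-in for training and validating the classifier:

```python
from itertools import combinations

def wrapper_select(features, evaluate, max_size=None):
    """Exhaustive wrapper feature selection (toy illustration).
    `evaluate` stands in for training/validating the classifier
    on a candidate subset and returning its score."""
    max_size = max_size or len(features)
    best_subset, best_score = (), float("-inf")
    for k in range(1, max_size + 1):
        for subset in combinations(features, k):
            score = evaluate(subset)
            if score > best_score:
                best_subset, best_score = subset, score
    return best_subset, best_score
```

Exhaustive enumeration is exponential in the number of features, which is exactly why the paper couples the wrapper with an evolutionary search instead.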


International Conference on Neural Information Processing | 2010

Towards spatio-temporal pattern recognition using evolving spiking neural networks

Stefan Schliebs; Nuttapod Nuntalid; Nikola Kasabov

An extension of an evolving spiking neural network (eSNN) is proposed that enables the method to process spatio-temporal information. In this extension, an additional layer is added to the network architecture that transforms a spatio-temporal input pattern into a single intermediate high-dimensional network state which in turn is mapped into a desired class label using a fast one-pass learning algorithm. The intermediate state is represented by a novel probabilistic reservoir computing approach in which a stochastic neural model introduces a non-deterministic component into a liquid state machine. A proof of concept is presented demonstrating an improved separation capability of the reservoir and consequently its suitability for an eSNN extension.
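The idea of collapsing a spatio-temporal input into a single high-dimensional reservoir state can be sketched with a simplified rate-based reservoir; the stochastic spiking model of the paper is replaced here by a deterministic tanh unit for brevity, and all sizes and constants are illustrative:

```python
import numpy as np

def reservoir_state(input_series, n_neurons=50, leak=0.9, seed=0):
    """Collapse a spatio-temporal input (rows = time steps) into one
    high-dimensional state vector via a fixed random recurrent reservoir.
    A rate-based stand-in for the stochastic spiking reservoir."""
    rng = np.random.default_rng(seed)
    w_in = rng.normal(0, 1, (n_neurons, input_series.shape[1]))
    w_rec = rng.normal(0, 1.0 / np.sqrt(n_neurons), (n_neurons, n_neurons))
    x = np.zeros(n_neurons)
    for u in input_series:
        # Leaky update: mix the previous state with the driven response
        x = leak * np.tanh(w_rec @ x + w_in @ u) + (1 - leak) * x
    return x  # final state, fed to a fast one-pass readout/classifier
```

The readout stage of the abstract would then map this fixed-length state vector to a class label with any one-pass learner.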


International Conference on Artificial Neural Networks | 2012

Constructing robust liquid state machines to process highly variable data streams

Stefan Schliebs; Maurizio Fiasché; Nikola Kasabov

In this paper, we propose a mechanism to effectively control the overall neural activity in the reservoir of a Liquid State Machine (LSM) in order to achieve both a high sensitivity of the reservoir to weak stimuli and an improved resistance to over-stimulation for strong inputs. The idea is to employ a mechanism that dynamically changes the firing threshold of a neuron depending on its spike activity. We experimentally demonstrate that reservoirs employing this neural model significantly increase their separation capabilities. We also investigate the role of dynamic and static synapses in this context. The obtained results may be very valuable for LSM-based real-world applications in which the input signal is often highly variable, causing problems of either too little or too much network activity.
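A minimal sketch of the mechanism, assuming a leaky integrate-and-fire neuron whose threshold jumps after each spike and decays back to a baseline (all constants and names are illustrative, not the paper's model):

```python
def adaptive_lif(input_current, dt=1.0, tau_m=20.0, tau_th=50.0,
                 base_th=1.0, th_jump=0.5):
    """Leaky integrate-and-fire neuron with a dynamic firing threshold.
    Each spike raises the threshold, which then decays back to baseline,
    damping the response to strong inputs while preserving sensitivity
    to weak ones. Returns the time steps at which spikes occurred."""
    v, th, spikes = 0.0, base_th, []
    for step, i in enumerate(input_current):
        v += dt * (-v / tau_m + i)           # leaky membrane integration
        th += dt * (base_th - th) / tau_th   # threshold decays to baseline
        if v >= th:
            spikes.append(step)
            v = 0.0          # reset membrane potential after a spike
            th += th_jump    # spike-triggered threshold increase
    return spikes
```

With the threshold jump disabled (`th_jump=0`), a strong constant input drives the neuron at its maximum rate; with adaptation enabled, the same input produces markedly fewer spikes, which is the over-stimulation resistance described above.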


International Symposium on Neural Networks | 2009

Quantum-inspired feature and parameter optimisation of evolving spiking neural networks with a case study from ecological modeling

Stefan Schliebs; Michaël Defoin Platel; Sue Worner; Nikola Kasabov

The paper introduces a framework and implementation of an integrated connectionist system, where the features and the parameters of an evolving spiking neural network are optimised together using a quantum representation of the features and a quantum-inspired evolutionary algorithm for optimisation. The proposed model is applied to an ecological data modeling problem, demonstrating a significantly better classification accuracy than traditional neural network approaches and a more appropriate feature subset selected from a larger initial number of features. Results are compared to a naïve Bayesian classifier.


International Symposium on Neural Networks | 2011

Are probabilistic spiking neural networks suitable for reservoir computing?

Stefan Schliebs; Ammar Mohemmed; Nikola Kasabov

This study employs networks of stochastic spiking neurons as reservoirs for liquid state machines (LSM). We experimentally investigate the separation property of these reservoirs and show their ability to generalize classes of input signals. Similar to traditional LSM, probabilistic LSM (pLSM) have the separation property, enabling them to distinguish between different classes of input stimuli. Furthermore, our results indicate some potential advantages of non-deterministic LSM, which can improve upon the separation ability of the liquid. Three non-deterministic neural models are considered, and for each of them several parameter configurations are explored. We demonstrate some of the characteristics of pLSM and compare them to their deterministic counterparts. pLSM offer more flexibility through their probabilistic parameters, resulting in better performance for some values of these parameters.
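The separation property can be quantified in many ways; one simple centroid-distance variant, an illustrative stand-in for the measures used in the reservoir computing literature rather than the paper's exact metric, looks like this:

```python
import numpy as np

def separation(states, labels):
    """Centroid-distance separation measure for reservoir states:
    the mean pairwise distance between the per-class centroids of
    the state vectors. Larger values mean better-separated classes."""
    classes = sorted(set(labels))
    centroids = [np.mean([s for s, l in zip(states, labels) if l == c], axis=0)
                 for c in classes]
    dists = [np.linalg.norm(a - b)
             for i, a in enumerate(centroids) for b in centroids[i + 1:]]
    return float(np.mean(dists)) if dists else 0.0
```

Comparing this score for a deterministic reservoir and a stochastic one, on the same labeled stimuli, is the kind of experiment the abstract summarizes.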

Collaboration


Dive into Stefan Schliebs's collaborations.

Top Co-Authors

Nikola Kasabov (Auckland University of Technology)
Ammar Mohemmed (Auckland University of Technology)
Doug P. Hunt (Auckland University of Technology)
Katie Smart (Auckland University of Technology)
Mia Jüllig (University of Auckland)
Michaël Defoin Platel (Auckland University of Technology)