
Publication


Featured research published by Jonathan Shapiro.


Neural Networks | 2002

A self-organising network that grows when required

Stephen Marsland; Jonathan Shapiro; Ulrich Nehmzow

The ability to grow extra nodes is a potentially useful facility for a self-organising neural network. A network that can add nodes into its map space can approximate the input space more accurately, and often more parsimoniously, than a network with predefined structure and size, such as the Self-Organising Map. In addition, a growing network can deal with dynamic input distributions. Most of the growing networks that have been proposed in the literature add new nodes to support the node that has accumulated the highest error during previous iterations or to support topological structures. This usually means that new nodes are added only when the number of iterations is an integer multiple of some pre-defined constant. This paper suggests a way in which the learning algorithm can add nodes whenever the network in its current state does not sufficiently match the input. In this way the network grows very quickly when new data is presented, but stops growing once the network has matched the data. This is particularly important when we consider dynamic data sets, where the distribution of inputs can change to a new regime after some time. We also demonstrate the preservation of neighbourhood relations in the data by the network. The new network is compared to an existing growing network, the Growing Neural Gas (GNG), on an artificial dataset, showing how the network deals with a change in input distribution after some time. Finally, the new network is applied to several novelty detection tasks and is compared with both the GNG and an unsupervised form of the Reduced Coulomb Energy network on a robotic inspection task and with a Support Vector Machine on two benchmark novelty detection tasks.
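The core idea, growing whenever the current network does not sufficiently match the input, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm (which also uses habituation counters and topology-preserving edges); the function name, the activity function, and the threshold are all assumptions for the sketch.

```python
import numpy as np

def maybe_grow(nodes, x, activity_threshold=0.8):
    """Sketch of a grow-when-required insertion test (hypothetical names).

    nodes: (k, d) array of node weight vectors; x: (d,) input vector.
    A new node is inserted whenever the best-matching node's activity
    (here simply exp of minus the distance to the input) falls below the
    threshold, i.e. whenever the current network does not sufficiently
    match the input.
    """
    dists = np.linalg.norm(nodes - x, axis=1)
    best = np.argmin(dists)
    activity = np.exp(-dists[best])
    if activity < activity_threshold:
        # insert the new node halfway between the input and the best match
        new_node = (nodes[best] + x) / 2.0
        return np.vstack([nodes, new_node]), True
    return nodes, False

# A far-away input triggers growth; a nearby one does not.
nodes = np.zeros((1, 2))
nodes, grew = maybe_grow(nodes, np.array([3.0, 4.0]))
```

Because growth is triggered by the mismatch itself rather than by an iteration counter, the sketch grows rapidly on a new input regime and stops once the inputs are covered, mirroring the behaviour described in the abstract.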


Autonomous Robots | 2002

Fast, On-Line Learning of Globally Consistent Maps

Tom Duckett; Stephen Marsland; Jonathan Shapiro

To navigate in unknown environments, mobile robots require the ability to build their own maps. A major problem for robot map building is that odometry-based dead reckoning cannot be used to assign accurate global position information to a map because of cumulative drift errors. This paper introduces a fast, on-line algorithm for learning geometrically consistent maps using only local metric information. The algorithm works by using a relaxation technique to minimize an energy function over many small steps. The approach differs from previous work in that it is computationally cheap, easy to implement and is proven to converge to a globally optimal solution. Experiments are presented in which large, complex environments were successfully mapped by a real robot.
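The relaxation idea, repeatedly nudging each map node toward the positions implied by its local metric constraints, can be sketched as below. This is an illustrative toy under assumed conventions (node 0 anchored to fix the global frame, each constraint stored as a measured displacement between two nodes), not the paper's exact energy function or update schedule.

```python
import numpy as np

def relax(positions, edges, iters=200):
    """Minimal sketch of relaxation-based map correction (illustrative only).

    positions: dict node -> np.array([x, y]) of initial (drift-corrupted)
        position estimates.
    edges: list of (i, j, d) local metric constraints, each meaning
        positions[j] should equal positions[i] + d.
    Each sweep moves every non-anchored node to the average of the
    positions suggested by its constraints -- many small steps that
    monotonically reduce the disagreement between the constraints.
    """
    for _ in range(iters):
        for node in positions:
            if node == 0:          # anchor node 0 to fix the global frame
                continue
            estimates = []
            for i, j, d in edges:
                if j == node:
                    estimates.append(positions[i] + d)
                elif i == node:
                    estimates.append(positions[j] - d)
            if estimates:
                positions[node] = np.mean(estimates, axis=0)
    return positions
```

On a consistent loop of constraints the sweep is a contraction, so even badly drifted initial estimates settle onto the unique globally consistent layout; each sweep costs only O(edges), which is the computational cheapness the abstract emphasises.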


International Conference on Robotics and Automation | 2000

Learning globally consistent maps by relaxation

Tom Duckett; Stephen Marsland; Jonathan Shapiro

Mobile robots require the ability to build their own maps to operate in unknown environments. A fundamental problem is that odometry-based dead reckoning cannot be used to assign accurate global position information to a map because of drift errors caused by wheel slippage. The paper introduces a fast, online method of learning globally consistent maps, using only local metric information. The approach differs from previous work in that it is computationally cheap, easy to implement and is guaranteed to find a globally optimal solution. Experiments are presented in which large, complex environments were successfully mapped by a real robot, and quantitative performance measures are used to assess the quality of the maps obtained.


International Symposium on Physical Design | 1997

The dynamics of a genetic algorithm for simple random Ising systems

Adam Prügel-Bennett; Jonathan Shapiro

A formalism is presented for analysing Genetic Algorithms. It is used to study a simple Genetic Algorithm consisting of selection, mutation and crossover which is searching for the ground states of simple random Ising-spin systems: a random-field ideal paramagnet and a spin-glass chain. The formalism can also be applied to other population based search techniques and to biological models of micro-evolution. To make the problem tractable, it is assumed that the population dynamics can be described by a few macroscopic order parameters and that the remaining microscopic degrees of freedom can be averaged out. The macroscopic quantities that are used are the cumulants of the distribution of fitnesses (or energies) in the population. A statistical mechanics model is presented which describes the population configuration in terms of the cumulants; this is used to derive equations of motion for the cumulants. Predictions of the theory are compared with experiments and are shown to predict the average time to convergence and the average fitness of the final population accurately. A simplified version of the equations is produced by keeping only leading nonlinear terms, and truncating the cumulant expansion. This is shown to give a novel description of the role of genetic operators in search, e.g. it is argued that an important role of crossover is to reduce the skewness of the population.
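The macroscopic order parameters the formalism tracks are the cumulants of the population's fitness distribution. A minimal sketch of estimating the first three from a population is below (the function name is ours; the third cumulant is what the skewness-reduction claim about crossover refers to).

```python
import numpy as np

def fitness_cumulants(fitnesses):
    """First three cumulants of a population's fitness distribution.

    k1 = mean, k2 = variance (second central moment),
    k3 = third central moment, which determines the skewness.
    These are the macroscopic quantities whose equations of motion
    the statistical-mechanics formalism derives.
    """
    f = np.asarray(fitnesses, dtype=float)
    k1 = f.mean()
    centred = f - k1
    k2 = (centred ** 2).mean()
    k3 = (centred ** 3).mean()
    return k1, k2, k3
```

Tracking only these few numbers per generation, instead of every individual, is what makes the dynamics tractable: the equations of motion predict next-generation cumulants from current ones.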


Robotics and Autonomous Systems | 2005

On-line Novelty Detection for Autonomous Mobile Robots

Stephen Marsland; Ulrich Nehmzow; Jonathan Shapiro

The use of mobile robots for inspection tasks is an attractive idea. A robot can travel through environments that humans cannot, and can be trained to identify sensor perceptions that signify potential or actual problems without requiring human intervention. However, in many cases, the appearance of a problem can vary widely, and ensuring that the robot does not miss any possible appearance of the problem (false negatives) is virtually impossible using conventional methods. This paper presents an alternative methodology using novelty detection. A neural network is trained to ignore normal perceptions that do not suggest any problems, so that anything that the robot has not sensed before is highlighted as a possible fault. This makes the incidence of false negatives less likely. We propose a novelty filter that can operate on-line, so that each new input is evaluated for novelty with respect to the data seen so far. The novelty filter learns to ignore inputs that have been sensed previously, or where similar inputs have been perceived. We demonstrate the use of the novelty filter on a series of simple inspection tasks using a mobile robot. The robot highlights those parts of an environment that are novel in some way, that is, they are not part of the model acquired during exploration of a different environment. We show the effectiveness of the method using inputs from both sonar sensors and a monochrome camera.
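The on-line "learn to ignore the familiar" behaviour can be sketched with a toy habituation mechanism. This is not the paper's exact model (which builds on a growing self-organising network); the class name, matching radius, decay factor, and threshold are all assumptions for illustration.

```python
import numpy as np

class NoveltyFilter:
    """Toy habituation-based novelty filter (illustrative only).

    Each stored perception carries a habituation value that decays
    towards 0 every time a similar input is seen.  An input is flagged
    as novel when nothing in memory matches it, or when the matching
    perception has not yet habituated.
    """
    def __init__(self, match_radius=1.0, decay=0.5, novelty_threshold=0.3):
        self.memory = []        # list of (vector, habituation) pairs
        self.match_radius = match_radius
        self.decay = decay
        self.novelty_threshold = novelty_threshold

    def present(self, x):
        x = np.asarray(x, dtype=float)
        for k, (m, h) in enumerate(self.memory):
            if np.linalg.norm(m - x) < self.match_radius:
                self.memory[k] = (m, h * self.decay)   # habituate to it
                return h > self.novelty_threshold      # novel until habituated
        self.memory.append((x, 1.0))                   # never seen: store, flag
        return True
```

Presenting the same perception repeatedly drives its habituation down until it is no longer reported, while a genuinely new perception is always flagged, which is the behaviour an inspection robot needs.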


Electronic Commerce | 2005

Drift and Scaling in Estimation of Distribution Algorithms

Jonathan Shapiro

This paper considers a phenomenon in Estimation of Distribution Algorithms (EDA) analogous to drift in population genetic dynamics. Finite population sampling in selection results in fluctuations which get reinforced when the probability model is updated. As a consequence, any probability model which can generate only a single set of values with probability 1 can be an attractive fixed point of the algorithm. To avoid this, parameters of the algorithm must scale with the system size in strongly problem-dependent ways, or the algorithm must be modified. This phenomenon is shown to hold for general EDAs as a consequence of the lack of ergodicity and irreducibility of the Markov chain on the state of probability models. It is illustrated in the case of UMDA, in which it is shown that the global optimum is only found if the population size is sufficiently large. For the needle-in-a-haystack problem, the population size must scale as the square-root of the size of the search space. For the one-max problem, the population size must scale as the square-root of the problem size.
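The drift phenomenon is easy to reproduce in a few lines. The sketch below runs a bare-bones UMDA-style loop on a flat landscape, where selection changes nothing, so any movement of the model is pure finite-sample drift; the function name and parameter values are ours. The marginal probabilities nonetheless fixate at 0 or 1, a degenerate model that generates a single string with probability 1.

```python
import numpy as np

def umda_flat(n_bits=20, pop_size=10, generations=500, seed=0):
    """UMDA-style drift on a flat landscape (minimal sketch).

    All fitnesses are equal, so no selection step is needed: each
    generation samples pop_size binary strings from the univariate
    model p and refits p by maximum likelihood.  Sampling noise alone
    drives every marginal to the absorbing values 0 or 1.
    """
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                # univariate marginal model
    for _ in range(generations):
        pop = rng.random((pop_size, n_bits)) < p
        p = pop.mean(axis=0)                # maximum-likelihood refit
        if np.all((p == 0.0) | (p == 1.0)):
            break                           # model has fixated
    return p
```

With a population of 10, fixation typically happens within a few dozen generations; making the population larger slows the drift, which is the origin of the problem-dependent population-size scaling laws in the abstract.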


Artificial Intelligence and the Simulation of Behaviour | 1994

A Statistical Mechanical Formulation of the Dynamics of Genetic Algorithms

Jonathan Shapiro; Adam Prügel-Bennett; Magnus Rattray

A new mathematical description of the dynamics of a simple genetic algorithm is presented. This formulation is based on ideas from statistical physics. Rather than trying to predict what happens to each individual member of the population, methods of statistical mechanics are used to describe the evolution of statistical properties of the population. We present equations which predict these properties at one generation in terms of those at the previous generation. The effect of the selection operator is shown to depend only on the distribution of fitnesses within the population, and is otherwise problem independent. We predict an optimal form of selection scaling and compare it with linear scaling. Crossover and mutation are problem-dependent, and are discussed in terms of a test problem — the search for the low energy states of a random spin chain. The theory is shown to be in good agreement with simulations.


Parallel Problem Solving from Nature | 2006

Diversity loss in general estimation of distribution algorithms

Jonathan Shapiro

A very general class of EDAs is defined, on which universal results on the rate of diversity loss can be derived. This EDA class, denoted SML-EDA, requires two restrictions: 1) in each generation, the new probability model is built using only data sampled from the current probability model; and 2) maximum likelihood is used to set model parameters. This class is very general; it includes simple forms of many well-known EDAs, e.g. BOA, MIMIC, FDA, UMDA, etc. To study the diversity loss in SML-EDAs, the trace of the empirical covariance matrix is proposed as the statistic. Two simple results are derived. Let N be the number of data vectors evaluated in each generation. It is shown that on a flat landscape, the expected value of the statistic decreases by a factor 1 − 1/N in each generation. This result is used to show that for the Needle problem, the algorithm will with a high probability never find the optimum unless the population size grows exponentially in the number of search variables.
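The 1 − 1/N factor can be checked numerically for the simplest member of the class, a univariate model with a maximum-likelihood update. In that case the trace of the model's covariance matrix is sum_i p_i(1 − p_i), and one sampling-and-refit step shrinks its expectation by exactly 1 − 1/N. The function name and parameter values below are ours; the setup is a sketch, not the paper's general proof.

```python
import numpy as np

def expected_trace_ratio(n=10, n_bits=8, trials=20000, seed=1):
    """Estimate the one-step diversity-loss factor on a flat landscape.

    Starting from the maximum-diversity model p_i = 0.5, sample n binary
    vectors, refit p by maximum likelihood, and record the ratio of the
    covariance trace sum_i p_i(1 - p_i) after vs. before.  Averaged over
    many trials this should approach 1 - 1/n.
    """
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)
    trace_before = np.sum(p * (1 - p))
    ratios = []
    for _ in range(trials):
        pop = rng.random((n, n_bits)) < p
        q = pop.mean(axis=0)                # maximum-likelihood refit
        ratios.append(np.sum(q * (1 - q)) / trace_before)
    return float(np.mean(ratios))
```

With n = 10 the estimate comes out close to 0.9, matching 1 − 1/N; since the loss compounds every generation, diversity decays geometrically, which is why the Needle problem demands an exponentially large population.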


Journal of Physics A | 1996

The dynamics of a genetic algorithm for a simple learning problem

Magnus Rattray; Jonathan Shapiro

A formalism for describing the dynamics of Genetic Algorithms (GAs) using methods from statistical mechanics is applied to the problem of generalization in a perceptron with binary weights. The dynamics are solved for the case where a new batch of training patterns is presented to each population member each generation, which considerably simplifies the calculation. The theory is shown to agree closely with simulations of a real GA averaged over many runs, accurately predicting the mean best solution found. For weak selection and large problem size the difference equations describing the dynamics can be expressed analytically and we find that the effects of noise due to the finite size of each training batch can be removed by increasing the population size appropriately. If this population resizing is used, one can deduce the most computationally efficient size of training batch each generation. For independent patterns this choice also gives the minimum total number of training patterns used. Although using independent patterns is a very inefficient use of training patterns in general, this work may also prove useful for determining the optimum batch size in the case where patterns are recycled.


Theoretical Aspects of Evolutionary Computing | 2001

Statistical mechanics theory of genetic algorithms

Jonathan Shapiro

This tutorial gives an introduction to the statistical mechanics method of analysing genetic algorithm (GA) dynamics. The goals are to study GAs acting on specific problems which include realistic features such as: finite population effects, crossover, large search spaces, and realistic cost functions. Statistical mechanics allows one to derive deterministic equations of motion which describe average quantities of the population after selection, mutation, and crossover in terms of those before. The general ideas of this approach are described here, and some details given via consideration of a specific problem. Finally, a description of the literature is given.

Collaboration


Dive into Jonathan Shapiro's collaborations.

Top Co-Authors
Magnus Rattray

University of Manchester

Nicola Gale

University of Birmingham

Joy Bose

University of Manchester

Hao Wu

University of Manchester

Peter Frank

Royal College of General Practitioners
