Publications


Featured research published by Bart Baddeley.


PLOS Computational Biology | 2012

A model of ant route navigation driven by scene familiarity

Bart Baddeley; Paul Graham; Philip Husbands; Andrew Philippides

In this paper we propose a model of visually guided route navigation in ants that captures the known properties of real behaviour whilst retaining mechanistic simplicity and thus biological plausibility. For an ant, the coupling of movement and viewing direction means that a familiar view specifies a familiar direction of movement. Since the views experienced along a habitual route will be more familiar, route navigation can be re-cast as a search for familiar views. This search can be performed with a simple scanning routine, a behaviour that ants have been observed to perform. We test this proposed route navigation strategy in simulation, by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. In the first instance we determine view familiarity by exhaustive comparison with the set of views experienced during training. In further experiments we train an artificial neural network to perform familiarity discrimination using the training views. Our results indicate not only that the approach is successful, but also that the learnt routes show many of the characteristics of the routes of desert ants. As such, we believe the model represents the only detailed and complete model of insect route guidance to date. What is more, the model provides a general demonstration that visually guided routes can be produced with parsimonious mechanisms that do not specify when or what to learn, nor separate routes into sequences of waypoints.
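
The scanning routine described above reduces to a simple loop: render the view at each candidate heading, score its familiarity against the stored training views, and move in the most familiar direction. The sketch below illustrates the exhaustive-comparison variant, assuming views are grayscale arrays; the renderer `render_view`, the mean-squared-difference measure, and the scan parameters are illustrative choices rather than the paper's exact values.

```python
import numpy as np

def view_familiarity(view, stored_views):
    """Familiarity = negative of the smallest mean-squared pixel
    difference between this view and any view stored during training."""
    return -min(np.mean((view - s) ** 2) for s in stored_views)

def scan_for_heading(position, render_view, stored_views,
                     scan_range_deg=120.0, step_deg=5.0):
    """Rotate on the spot, render the view at each candidate heading,
    and return the heading whose view appears most familiar."""
    headings = np.arange(-scan_range_deg / 2,
                         scan_range_deg / 2 + step_deg, step_deg)
    # render_view(position, heading) -> 2-D image array (hypothetical renderer)
    return max(headings,
               key=lambda h: view_familiarity(render_view(position, h),
                                              stored_views))
```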


Current Biology | 2010

Animal Cognition: Multi-modal Interactions in Ant Learning

Paul Graham; Andrew Philippides; Bart Baddeley

A recent study shows that desert ants use a precise behaviour, based on the internal cues of path integration, to facilitate the learning of visual landmark information. This raises fascinating questions about how insects encode familiar terrain.


Adaptive Behavior | 2011

Holistic visual encoding of ant-like routes: Navigation without waypoints

Bart Baddeley; Paul Graham; Andrew Philippides; Philip Husbands

It is known that ants learn long visually guided routes through complex terrain. However, the mechanisms by which visual information is first learned and then used to control a route direction are not well understood. In this article, we propose a parsimonious mechanism for visually guided route following. We investigate whether a simple approach, involving scanning the environment and moving in the direction that appears most familiar, can provide a model of visually guided route learning in ants. We implement view familiarity as a means of navigation by training a classifier to determine whether a given view is part of a route and using the confidence in this classification as a proxy for familiarity. Through the coupling of movement and viewing direction, a familiar view specifies a familiar direction of viewing and thus a familiar movement to make. We show the feasibility of our approach as a model of ant-like route acquisition by learning a series of nontrivial routes through an indoor environment using a large gantry robot equipped with a panoramic camera.
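
As an illustration of using classifier confidence as a familiarity proxy, the sketch below trains a logistic-regression stand-in (not the paper's actual classifier) to separate on-route views from off-route views. The data here is random placeholder input; in practice the negative class might be built from route views rotated away from the path.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Placeholder data: rows are flattened, downsampled panoramic views.
route_views = rng.random((200, 300))      # positive class: on-route views
off_route_views = rng.random((200, 300))  # negative class: e.g. rotated views

X = np.vstack([route_views, off_route_views])
y = np.concatenate([np.ones(200), np.zeros(200)])
clf = LogisticRegression(max_iter=1000).fit(X, y)

def familiarity(view):
    """Classifier confidence that a view belongs to the route, used as
    the familiarity score when scanning for a direction to move in."""
    return clf.predict_proba(view.reshape(1, -1))[0, 1]
```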


Adaptive Behavior | 2007

Linked Local Navigation for Visual Route Guidance

Lincoln Smith; Andrew Philippides; Paul Graham; Bart Baddeley; Philip Husbands

Insects are able to navigate reliably between food and nest using only visual information. This behavior has inspired many models of visual landmark guidance, some of which have been tested on autonomous robots. The majority of these models work by comparing the agent's current view with a view of the world stored when the agent was at the goal. The region from which agents can successfully reach home is therefore limited to the goal's visual locale, that is, the area around the goal where the visual scene does not differ radically from the view at the goal. Ants are known to navigate over large distances using visually guided routes consisting of a series of visual memories. Taking inspiration from such route navigation, we propose a framework for linking together local navigation methods. We implement this framework on a robotic platform and test it in a series of environments in which local navigation methods alone fail. Finally, we show that the framework is robust to environments of varying complexity.
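
The linking framework described above can be expressed as a simple control loop: home on each stored snapshot in turn with any local method, handing over to the next snapshot once the agent is inside the current one's visual locale. In the sketch below, `get_view`, `local_home_step`, and the arrival threshold are hypothetical stand-ins for the paper's robot platform and local navigation methods.

```python
import numpy as np

def image_difference(view_a, view_b):
    """Root-mean-square pixel difference between two panoramic views."""
    return float(np.sqrt(np.mean((view_a - view_b) ** 2)))

def follow_linked_route(get_view, local_home_step, snapshots,
                        arrival_threshold=0.05, max_steps=10_000):
    """Chain local navigation segments: home on each stored snapshot in
    turn, handing over to the next once the current view is close enough
    to the active snapshot (i.e. the agent has entered its locale)."""
    steps = 0
    for snapshot in snapshots:
        while image_difference(get_view(), snapshot) > arrival_threshold:
            local_home_step(snapshot)  # one step of any snapshot-homing method
            steps += 1
            if steps >= max_steps:
                raise RuntimeError("route following did not converge")
```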


BMC Neuroscience | 2012

A neural network based holistic model of ant route navigation

Bart Baddeley; Paul Graham; Philip Husbands; Andrew Philippides

The impressive ability of social insects to learn long foraging routes guided by visual information [1] provides proof that robust spatial behaviour can be produced with limited neural resources [2,3]. As such, social insects have become an important model system for understanding the minimal cognitive requirements for navigation [1]. This is a goal shared by biomimetic engineers and those studying animal cognition using a bottom-up approach to the understanding of natural intelligence [4]. Models of visual navigation that have been successful in replicating place homing are dominated by snapshot-type models, where a single view of the world as memorized from the goal location is compared to the current view in order to drive a search for the goal [5]; for review, see [6]. Snapshot approaches only allow for navigation in the immediate vicinity of the goal, however, and do not achieve robust route navigation over longer distances [7]. Here we present a parsimonious model of visually guided route learning that addresses this issue [8]. We test this proposed route navigation strategy in simulation, by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. Our navigation algorithm consists of two phases. The ant first traverses the route using a combination of path integration and obstacle avoidance, during which the views used to learn the route are experienced. Subsequently, the ant navigates by visually scanning the world, a behaviour observed in ants in the field, and moving in the direction which is deemed most familiar. As proof of concept, we first determine view familiarity by exhaustive comparison with the set of views experienced during training. In subsequent experiments we train an artificial neural network to perform familiarity discrimination using the training views via the InfoMax algorithm [9]. By utilising the interaction of sensori-motor constraints and observed innate behaviours, we show that it is possible to produce robust behaviour using a learnt holistic representation of a route. Furthermore, we show that the model captures the known properties of route navigation in desert ants. These include the ability to learn a route after a single training run and the ability to learn multiple idiosyncratic routes to a single goal. Importantly, navigation is independent of odometric or compass information, does not specify when or what to learn, and does not separate routes into sequences of waypoints, providing proof of concept that route navigation can be achieved without these elements. The algorithm also exhibits both place-search and route navigation with the same mechanism. As such, we believe the model represents the only detailed and complete model of insect route guidance to date.
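
For concreteness, the sketch below shows a familiarity network of the kind the abstract cites, using a natural-gradient form of the InfoMax update; the layer size, learning rate, and exact update rule here are assumptions drawn from related InfoMax familiarity-discrimination work, not values taken from the paper.

```python
import numpy as np

class InfomaxFamiliarityNet:
    """Single-layer InfoMax network: after training on route views, its
    summed absolute response is low for familiar inputs and high for
    novel ones, so the negated response serves as a familiarity score.
    (Sketch of the rule the abstract cites; sizes and learning rate
    are illustrative assumptions.)"""

    def __init__(self, n_inputs, n_novelty_units, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 1.0 / np.sqrt(n_inputs),
                            (n_novelty_units, n_inputs))
        self.lr = lr

    def train_view(self, x):
        """One InfoMax update (natural-gradient form) for one view x."""
        h = self.W @ x
        y = np.tanh(h)
        self.W += (self.lr / x.size) * (self.W - np.outer(y + h, h) @ self.W)

    def familiarity(self, x):
        """Trained units respond weakly to familiar views: negate the
        summed absolute activation to get a familiarity score."""
        return -np.sum(np.abs(self.W @ x))
```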


International Conference on Intelligent Robotics and Applications | 2011

Models of visually guided routes in ants: embodiment simplifies route acquisition

Bart Baddeley; Paul Graham; Andrew Philippides; Philip Husbands

It is known that ants learn long visually guided routes through complex terrain. However, the mechanisms by which visual information is first learnt and then used to control a route direction are not well understood. In this paper we investigate whether a simple approach, involving scanning the environment and moving in the direction that appears most familiar, can provide a model of visually guided route learning in ants. The specific embodiment of an ant's visual system means that movement and viewing direction are tightly coupled: a familiar view specifies a familiar direction of viewing and thus a familiar movement to make. We show the feasibility of our approach as a model of ant-like route acquisition by learning non-trivial routes through a simulated environment, firstly using the complete set of views experienced during learning and secondly using an approximation to the distribution of these views.
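
The abstract does not say which approximation to the distribution of training views was used, so the sketch below shows one plausible scheme: compress views with PCA, fit a Gaussian mixture in the compressed space, and use log-likelihood in place of exhaustive comparison. All model sizes here are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def fit_view_distribution(training_views, n_dims=20, n_components=10):
    """Compress the training views with PCA, then fit a Gaussian mixture
    in the compressed space as a stand-in 'distribution of views'."""
    pca = PCA(n_components=n_dims).fit(training_views)
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(pca.transform(training_views))
    return pca, gmm

def familiarity(view, pca, gmm):
    """Log-likelihood of a view under the fitted distribution, replacing
    exhaustive comparison against every stored training view."""
    z = pca.transform(view.reshape(1, -1))
    return float(gmm.score_samples(z)[0])
```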


European Conference on Artificial Life | 2007

Improving agent localisation through stereotypical motion

Bart Baddeley; Andrew Philippides

When bees and wasps leave the nest to forage, they perform orientation or learning flights. This behaviour includes a number of stereotyped flight manoeuvres mediating the active acquisition of visual information. If we assume that the bee is attempting to localise itself in the world with reference to stable visual landmarks, then we can model the orientation flight as a probabilistic Simultaneous Localisation And Mapping (SLAM) problem. Within this framework, one effect of stereotypical behaviour could be to make the agent's own movements easier to predict, in turn leading to better localisation and mapping performance. We describe a probabilistic framework for building quantitative models of orientation flights and investigate what benefits a more reliable movement model would have for an agent's visual learning.
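
To make the SLAM framing concrete, the sketch below shows the prediction step of an extended Kalman filter for a planar agent: a stereotyped, repeatable manoeuvre corresponds to a small process-noise covariance, so less uncertainty is injected into the state estimate at each step. The state and control parameterisation is an illustrative choice, not taken from the paper.

```python
import numpy as np

def ekf_predict(mean, cov, control, motion_noise_cov):
    """EKF prediction for a planar agent with state (x, y, heading) and
    an odometry-style control (forward distance d, turn dth). A
    stereotyped, repeatable manoeuvre corresponds to a small
    motion_noise_cov, so the predicted covariance grows more slowly."""
    x, y, th = mean
    d, dth = control
    new_mean = np.array([x + d * np.cos(th),
                         y + d * np.sin(th),
                         th + dth])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -d * np.sin(th)],
                  [0.0, 1.0,  d * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    new_cov = F @ cov @ F.T + motion_noise_cov
    return new_mean, new_cov
```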


Methods in Molecular Biology | 2015

Using neural networks to understand the information that guides behavior: a case study in visual navigation

Andrew Philippides; Paul Graham; Bart Baddeley; Philip Husbands

To behave in a robust and adaptive way, animals must extract task-relevant sensory information efficiently. One way to understand how they achieve this is to explore regularities within the information animals perceive during natural behavior. In this chapter, we describe how we have used artificial neural networks (ANNs) to explore efficiencies in vision and memory that might underpin visually guided route navigation in complex worlds. Specifically, we use three types of neural network to learn the regularities within a series of views encountered during a single route traversal (the training route), in such a way that the networks output the familiarity of novel views presented to them. The problem of navigation is then reframed in terms of a search for familiar views, that is, views similar to those associated with the route. This approach has two major benefits. First, the ANN provides a compact holistic representation of the data and is thus an efficient way to encode a large set of views. Second, as we do not store the training views, we are not limited in the number of training views we use and the agent does not need to decide which views to learn.
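
The two benefits claimed above, a compact fixed-size representation and no need to store or select training views, can be illustrated with a minimal stand-in for the chapter's networks: fit a linear model once to the views from a single traversal, discard the views, and score novel views by how well the model accounts for them. PCA reconstruction error is used here purely as a simple linear-autoencoder proxy, not as one of the chapter's actual network types.

```python
import numpy as np
from sklearn.decomposition import PCA

def train_familiarity_model(training_views, n_dims=30):
    """Fit a compact, fixed-size linear model to the views from a single
    route traversal; the raw views can then be discarded."""
    return PCA(n_components=n_dims).fit(training_views)

def familiarity(model, view):
    """Score a novel view by how well the fixed-size model reconstructs
    it: familiar (route-like) views reconstruct with low error."""
    z = model.transform(view.reshape(1, -1))
    reconstruction = model.inverse_transform(z)
    return -float(np.mean((view - reconstruction) ** 2))
```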


Conference on Biomimetic and Biohybrid Systems | 2012

How Can Embodiment Simplify the Problem of View-Based Navigation?

Andrew Philippides; Bart Baddeley; Philip Husbands; Paul Graham

This paper is a review of our recent work in which we study insect navigation as a situated and embodied system. This approach has led directly to a novel biomimetic model of route navigation in desert ants. The model is attractive due to its parsimonious algorithm and robust performance. We therefore believe it is an excellent candidate for robotic implementation.


European Conference on Artificial Life | 2013

A Situated and Embodied Model of Ant Route Navigation

Andrew Philippides; Bart Baddeley; Philip Husbands; Paul Graham

This abstract summarises a model of route navigation inspired by the behaviour of ants, presented fully in Baddeley et al. (2012). The ant's embodiment, coupled with an innate scanning behaviour, means that robust route navigation can be achieved by a parsimonious, biologically plausible algorithm. The ability of social insects to learn long foraging routes guided by visual information (Wehner, 2009) shows that robust spatial behaviour can be produced with limited neural resources (Chittka and Skorupski, 2011). As such, social insects have become an important model system for understanding the minimal cognitive requirements for navigation and, more generally, for those studying animal cognition using a bottom-up approach to the understanding of natural intelligence (Wehner, 2009; Shettleworth, 2010), while also providing inspiration for biomimetic engineers. Models of visual navigation that have been successful in replicating place homing are dominated by snapshot-type models, where a single view of the world as memorized from the goal location is compared to the current view in order to drive a search for the goal (Cartwright and Collett, 1983; for review, see Möller and Vardy, 2006). Snapshot approaches only allow for navigation in the immediate vicinity of the goal, however, and do not achieve robust route navigation over longer distances (Smith et al., 2007). Here we present an embodied, parsimonious model of visually guided route learning that addresses these issues (Baddeley et al., 2012). By utilising the interaction of sensori-motor constraints and observed innate behaviours, we show that it is possible to produce robust behaviour using a learnt holistic representation of a route. Furthermore, we show that the model captures the known properties of route navigation in desert ants.
