Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Jens Peter Lindemann is active.

Publication


Featured research published by Jens Peter Lindemann.


PLOS Biology | 2005

Function of a fly motion-sensitive neuron matches eye movements during free flight.

Roland Kern; Johannes van Hateren; Christian Michaelis; Jens Peter Lindemann; Martin Egelhaaf

Sensing is often implicitly assumed to be the passive acquisition of information. However, part of the sensory information is generated actively when animals move. For instance, humans shift their gaze actively in a sequence of saccades towards interesting locations in a scene. Likewise, many insects shift their gaze by saccadic turns of body and head, keeping their gaze fixed between saccades. Here we employ a novel panoramic virtual reality stimulator and show that motion computation in a blowfly visual interneuron is tuned to make efficient use of the characteristic dynamics of retinal image flow. The neuron is able to extract information about the spatial layout of the environment by utilizing intervals of stable vision resulting from the saccadic viewing strategy. The extraction is possible because the retinal image flow evoked by translation, containing information about object distances, is confined to low frequencies. This flow component can be derived from the total optic flow between saccades because the residual intersaccadic head rotations are small and encoded at higher frequencies. Information about the spatial layout of the environment can thus be extracted by the neuron in a computationally parsimonious way. These results on neuronal function based on naturalistic, behaviourally generated optic flow are in stark contrast to conclusions based on conventional visual stimuli that the neuron primarily represents a detector for yaw rotations of the animal.
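The separation described above, with translational flow confined to low temporal frequencies and residual rotational jitter at higher ones, can be illustrated with a minimal sketch. The signal, frequencies, and filter constant below are synthetic assumptions for illustration, not values from the study:

```python
import numpy as np

def lowpass(signal, cutoff_hz, fs):
    """First-order lowpass via exponential smoothing (hypothetical filter choice)."""
    dt = 1.0 / fs
    alpha = dt / (dt + 1.0 / (2 * np.pi * cutoff_hz))
    out = np.empty_like(signal)
    acc = signal[0]
    for i, x in enumerate(signal):
        acc += alpha * (x - acc)
        out[i] = acc
    return out

# Synthetic intersaccadic flow: slow translational drift (2 Hz) plus
# fast residual head-rotation jitter (40 Hz), both amplitudes made up.
fs = 1000
t = np.arange(0, 1, 1 / fs)
translation = np.sin(2 * np.pi * 2 * t)
rotation_residue = 0.5 * np.sin(2 * np.pi * 40 * t)
flow = translation + rotation_residue

estimate = lowpass(flow, cutoff_hz=10, fs=fs)
# The low-frequency estimate tracks the translational component far
# better than the raw flow does.
err_raw = np.sqrt(np.mean((flow - translation) ** 2))
err_est = np.sqrt(np.mean((estimate - translation) ** 2))
```

A first-order lowpass is the simplest possible choice here; the point is only that frequency separation lets a low-frequency estimate recover the translational component from the mixed signal.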


The Journal of Neuroscience | 2005

On the Computations Analyzing Natural Optic Flow: Quantitative Model Analysis of the Blowfly Motion Vision Pathway

Jens Peter Lindemann; Roland Kern; J. H. van Hateren; Helge Ritter; Martin Egelhaaf

For many animals, including humans, the optic flow generated on the eyes during locomotion is an important source of information about self-motion and the structure of the environment. The blowfly has been used frequently as a model system for experimental analysis of optic flow processing at the microcircuit level. Here, we describe a model of the computational mechanisms implemented by these circuits in the blowfly motion vision pathway. Although this model was originally proposed based on simple experimenter-designed stimuli, we show that it is also capable of quantitatively predicting the responses to the complex dynamic stimuli a blowfly encounters in free flight. In particular, the model visual system exploits the active saccadic gaze and flight strategy of blowflies in a similar way as its neuronal counterpart. The model circuit extracts information about translation velocity in the intersaccadic intervals and thus, indirectly, about the three-dimensional layout of the environment. By stepwise dissection of the model circuit, we determine which of its components are essential for these remarkable features. When accounting for the responses to complex natural stimuli, the model is much more robust against parameter changes than when explaining the neuronal responses to simple experimenter-defined stimuli. In contrast to conclusions drawn from experiments with simple stimuli, optimization of the parameter set for different segments of natural optic flow stimuli does not indicate pronounced adaptational changes of these parameters during long-lasting stimulation.


Vision Research | 2003

FliMax, a novel stimulus device for panoramic and high-speed presentation of behaviourally generated optic flow

Jens Peter Lindemann; Roland Kern; C. Michaelis; P. Meyer; J. H. van Hateren; Martin Egelhaaf

A high-speed panoramic visual stimulation device is introduced that is suitable for analysing visual interneurons during stimulation with rapid image displacements as experienced by fast-moving animals. The responses of an identified motion-sensitive neuron in the visual system of the blowfly to behaviourally generated image sequences are very complex and hard to predict from the established input circuitry of the neuron. This finding suggests that the computational significance of visual interneurons can only be assessed if they are characterised not only by conventional stimuli as are often used for systems analysis, but also by behaviourally relevant input.


Frontiers in Neural Circuits | 2012

Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action

Martin Egelhaaf; Norbert Boeddeker; Roland Kern; Rafael Kurtz; Jens Peter Lindemann

Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown by their characteristic behavioral actions to actively shape the dynamics of the image flow on their eyes (“optic flow”). The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action–perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful to find technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.


Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology | 2005

Responses of blowfly motion-sensitive neurons to reconstructed optic flow along outdoor flight paths

Norbert Boeddeker; Jens Peter Lindemann; Martin Egelhaaf; Jochen Zeil

The retinal image flow a blowfly experiences in its daily life on the wing is determined by both the structure of the environment and the animal’s own movements. To understand the design of visual processing mechanisms, there is thus a need to analyse the performance of neurons under natural operating conditions. To this end, we recorded flight paths of flies outdoors and reconstructed what they had seen, by moving a panoramic camera along exactly the same paths. The reconstructed image sequences were later replayed on a fast, panoramic flight simulator to identified motion-sensitive neurons of the so-called horizontal system (HS) in the lobula plate of the blowfly, which are assumed to extract self-motion parameters from optic flow. We show that under real-life conditions HS-cells not only encode information about self-rotation, but are also sensitive to translational optic flow and, thus, indirectly signal information about the depth structure of the environment. These properties do not require an elaboration of the known model of these neurons, because the natural optic flow sequences generate—at least qualitatively—the same depth-related response properties when used as input to a computational HS-cell model and to real neurons.


Biological Cybernetics | 2008

Saccadic flight strategy facilitates collision avoidance: closed-loop performance of a cyberfly

Jens Peter Lindemann; Holger Weiss; Ralf Möller; Martin Egelhaaf

Behavioural and electrophysiological experiments suggest that blowflies employ an active saccadic strategy of flight and gaze control to separate the rotational from the translational optic flow components. As a consequence, this allows motion-sensitive neurons to encode information about the spatial layout of the environment during the translatory intersaccadic phases of locomotion. So far, it has not been clear whether and how a motor controller could decode the responses of these neurons to prevent a blowfly from colliding with obstacles. Here we propose a simple model of the blowfly visual course control system, named cyberfly, and investigate its performance and limitations. The sensory input module of the cyberfly emulates a pair of output neurons subserving the two eyes of the blowfly visual motion pathway. We analyse two sensory-motor interfaces (SMI). An SMI coupling the differential signal of the sensory neurons proportionally to the yaw rotation fails to avoid obstacles. A more plausible SMI is based on a saccadic controller. Even with sideward drift after saccades, as is characteristic of real blowflies, the cyberfly is able to successfully avoid collisions with obstacles. The relative distance information contained in the optic flow during translatory movements between saccades is provided to the SMI by the responses of the visual output neurons. An obvious limitation of this simple mechanism is its strong dependence on the textural properties of the environment.
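The saccadic sensory-motor interface described above can be caricatured as a threshold rule on the differential signal of the two model output neurons: a minimal sketch, in which the threshold, the fixed saccade amplitude, and the function name are all hypothetical illustration choices, not the paper's parameters:

```python
def saccadic_controller(left_response, right_response,
                        threshold=0.5, saccade_amplitude_deg=30.0):
    """Return the commanded yaw turn in degrees (0.0 between saccades).

    Instead of steering continuously in proportion to the differential
    signal, the controller triggers a discrete saccade-like turn away
    from the side reporting the stronger (i.e. nearer) motion whenever
    the differential signal exceeds a threshold.
    """
    differential = left_response - right_response
    if abs(differential) < threshold:
        return 0.0  # intersaccadic interval: keep gaze direction fixed
    # Turn away from the side with the stronger response.
    return -saccade_amplitude_deg if differential > 0 else saccade_amplitude_deg
```

The discrete, all-or-nothing turn is what keeps the intersaccadic intervals rotation-free, so that the translational flow (and hence distance information) stays available to the sensory neurons.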


Network: Computation in Neural Systems | 2001

Neuronal processing of behaviourally generated optic flow: experiments and model simulations.

Roland Kern; Maik Lutterklas; Christian Petereit; Jens Peter Lindemann; Martin Egelhaaf

The stimuli traditionally used for analysing visual information processing are much simpler than what an animal sees when moving in its natural environment. Therefore, we analysed in a previous study the performance of an identified neuron in the optomotor system of the fly by using as visual stimuli image sequences that were experienced by the animal while walking in a structured environment. These electrophysiological experiments revealed that the fly visual system computes from behaviourally generated optic flow a rather unambiguous representation of the animal's self-motion. In contrast to conclusions based on simple stimuli, the directions of turns are represented by an interneuron, the HSE cell, quite independent of the spatial layout of the environment and its textural properties when the cell is stimulated with behaviourally generated optic flow. This conclusion is substantiated here by further experimental evidence. Moreover, it is shown that the largely unambiguous responses of the HSE cell to behaviourally generated optic flow can be replicated to a large extent by a network model of the fly's visual motion pathway. These results stress the significance of naturalistic stimuli for analysing what is encoded by neuronal circuits under natural operating conditions.


Frontiers in Computational Neuroscience | 2014

Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis

Alexander Schwegmann; Jens Peter Lindemann; Martin Egelhaaf

Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e., the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
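The correlation-type EMD at the core of this model can be sketched as follows. This is the textbook Reichardt scheme rather than the paper's specific implementation, and the time constant, stimulus frequency, and inter-receptor delay are illustrative assumptions:

```python
import numpy as np

def lowpass(x, tau, dt):
    """First-order lowpass: the delay stage of the correlation detector."""
    out = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc += (dt / tau) * (v - acc)
        out[i] = acc
    return out

def emd(left, right, tau=0.02, dt=0.001):
    """Correlation-type EMD: the delayed signal of each input is multiplied
    with the undelayed signal of its neighbour; the two mirror-symmetric
    half-detectors are subtracted."""
    return lowpass(left, tau, dt) * right - left * lowpass(right, tau, dt)

# A grating drifting in the preferred direction reaches the left input
# slightly earlier than the right one (hypothetical 10 ms lag).
dt = 0.001
t = np.arange(0, 1, dt)
phase_lag = 0.01
left = np.sin(2 * np.pi * 4 * t)
right = np.sin(2 * np.pi * 4 * (t - phase_lag))

rightward = emd(left, right).mean()   # preferred direction: positive mean
leftward = emd(right, left).mean()    # anti-preferred: negative mean
```

Because the multiplication makes the output depend on stimulus contrast and pattern as well as velocity, an array of such detectors does not measure velocity veridically; that very dependence is what lets the motion energy profile highlight the contrast-weighted nearness of contours, as argued above.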


PLOS Computational Biology | 2015

A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes

Olivier J. N. Bertrand; Jens Peter Lindemann; Martin Egelhaaf

Avoiding collisions is one of the most basic needs of any mobile agent, both biological and technical, when searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioral experiments on insects and by properties of optic flow on a spherical eye experienced during translation, and test the interaction of this model with goal-driven behavior. Insects, such as flies and bees, actively separate the rotational and translational optic flow components via behavior, i.e., by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e., during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow by correlation-type elementary motion detectors. Their responses depend, in addition to velocity, on the texture and contrast of objects and, thus, do not measure the velocity of objects veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. Then, the collision avoidance algorithm was tested with bio-inspired correlation-type elementary motion detectors in its input. Even then, the algorithm led successfully to collision avoidance and, in addition, replicated the characteristics of collision avoidance behavior of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behavior reminiscent of components of the navigation behavior of insects.
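One way to turn a depth (nearness) map into an avoidance direction, loosely in the spirit of the averaging idea described above, is to steer away from the nearness-weighted "centre of mass" of the surroundings. The function name, the weighting, and the example nearness values are hypothetical illustrations, not the paper's exact algorithm:

```python
import math

def avoidance_direction(nearness_by_azimuth):
    """Sum nearness-weighted unit vectors over viewing directions and
    return a heading (radians) pointing away from the resulting
    centre of mass of nearby objects.

    nearness_by_azimuth: dict mapping azimuth (radians) -> nearness
    (hypothetical simplification of the averaging scheme above).
    """
    x = sum(n * math.cos(az) for az, n in nearness_by_azimuth.items())
    y = sum(n * math.sin(az) for az, n in nearness_by_azimuth.items())
    return math.atan2(-y, -x)  # opposite of the nearness centre of mass

# An obstacle concentrated on the left (+90 deg) with little clutter
# elsewhere pushes the commanded heading toward the right half-plane.
nearness = {math.radians(90): 1.0,
            math.radians(-90): 0.2,
            math.radians(0): 0.1}
heading = math.degrees(avoidance_direction(nearness))
```

Combining such an avoidance vector with a goal vector, as the abstract describes, then yields goal-directed trajectories that bend around clutter.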


Frontiers in Neural Circuits | 2014

Motion as a source of environmental information: a fresh view on biological motion computation by insect brains

Martin Egelhaaf; Roland Kern; Jens Peter Lindemann

Despite their miniature brains, insects such as flies, bees, and wasps are able to navigate by highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information that is contained in the retinal motion patterns induced on the eyes while moving around (“optic flow”) to accomplish their extraordinary performance. To this end, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases where the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors as are widespread in biological systems do not represent veridically the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of a biological motion detection mechanism. In contrast, we conclude from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates in a computationally parsimonious way the environment into behaviorally relevant nearby objects and—in many behavioral contexts—less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.

Collaboration


Dive into Jens Peter Lindemann's collaboration.

Top Co-Authors
