
Publication


Featured research published by Laura Dittmar.


Frontiers in Behavioral Neuroscience | 2012

Prototypical components of honeybee homing flight behavior depend on the visual appearance of objects surrounding the goal.

Elke Braun; Laura Dittmar; Norbert Boeddeker; Martin Egelhaaf

Honeybees use visual cues to relocate profitable food sources and their hive. What bees see while navigating depends on the appearance of the cues and on the bee’s current position, orientation, and movement relative to them. Here we analyze the detailed flight behavior during the localization of a goal surrounded by cylinders that are characterized either by a high contrast in luminance and texture or by mostly motion contrast relative to the background. By relating flight behavior to the nature of the information available from these landmarks, we aim to identify behavioral strategies that facilitate the processing of visual information during goal localization. We decompose flight behavior into prototypical movements using clustering algorithms in order to reduce the behavioral complexity. The determined prototypical movements reflect the honeybee’s saccadic flight pattern, which largely separates rotational from translational movements. During phases of translational movement between fast saccadic rotations, the bees can gain information about the 3D layout of their environment from the translational optic flow. The prototypical movements reveal the prominent role of sideways and up- or downward movements, which can help bees to gather information about objects, particularly in the frontal visual field. We find that the occurrence of specific prototypes depends on the bees’ distance from the landmarks and the feeder, and that changing the texture of the landmarks evokes different prototypical movements. The adaptive use of different behavioral prototypes shapes the visual input and can facilitate information processing in the bees’ visual system during local navigation.
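
The decomposition into prototypical movements can be illustrated with a minimal Python sketch: assuming each intersaccadic flight segment is summarized by its translational velocity components (forward, sideways, up/down) and its yaw rate, a generic clustering algorithm such as k-means groups the segments into recurring movement prototypes. The feature set, the number of clusters, and the use of k-means here are illustrative assumptions, not the authors' exact pipeline.

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical flight segments: rows = intersaccadic intervals,
    # columns = [forward, sideways, up/down velocity (m/s), yaw rate (deg/s)].
    segments = np.random.default_rng(0).normal(size=(500, 4))

    # Standardize so that each movement component contributes equally.
    z = (segments - segments.mean(axis=0)) / segments.std(axis=0)

    # Cluster the segments into a small number of prototypical movements.
    kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(z)

    # Each cluster centre is one movement "prototype"; labels assign every
    # segment to the prototype it resembles most.
    prototypes = kmeans.cluster_centers_
    labels = kmeans.labels_
    print(prototypes.shape, np.bincount(labels))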


The Journal of Experimental Biology | 2012

Blowfly flight characteristics are shaped by environmental features and controlled by optic flow information.

Roland Kern; Norbert Boeddeker; Laura Dittmar; Martin Egelhaaf

Blowfly flight consists of two main components, saccadic turns and intervals of mostly straight gaze direction, although, as a consequence of inertia, flight trajectories usually change direction smoothly. We investigated how flight behavior changes depending on the surroundings and how saccadic turns and intersaccadic translational movements might be controlled in arenas of different width, with and without obstacles. Blowflies do not fly in straight trajectories, even when traversing straight flight arenas; rather, they fly in meandering trajectories. Flight speed and the amplitude of meanders increase with arena width. Although saccade duration is largely constant, peak angular velocity and the direction of successive saccades are variable and depend on the visual surroundings. Saccade rate and amplitude also vary with arena layout and are correlated with the ‘time-to-contact’ to the arena wall. We provide evidence that both saccade and velocity control rely to a large extent on the intersaccadic optic flow generated in eye regions looking well in front of the fly, rather than in the lateral visual field, where the optic flow, at least during forward flight, tends to be strongest.
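
As a brief aside on the ‘time-to-contact’ measure mentioned above: it is the current distance to the wall divided by the speed of approach towards it (equivalently, the inverse of the relative rate of image expansion). The numbers in the sketch below are made up purely for illustration.

    # Hypothetical values, for illustration only.
    distance_to_wall = 0.4                  # metres to the arena wall
    approach_speed = 0.8                    # metres per second towards the wall
    time_to_contact = distance_to_wall / approach_speed
    print(time_to_contact)                  # 0.5 s until contact at current speed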


Frontiers in Behavioral Neuroscience | 2011

The behavioral relevance of landmark texture for honeybee homing.

Laura Dittmar; Martin Egelhaaf; Wolfgang Stürzl; Norbert Boeddeker

Honeybees visually pinpoint the location of a food source using landmarks. Studies on the role of visual memories have suggested that bees approach the goal by finding a close match between their current view and a memorized view of the goal location. The most relevant landmark features for this matching process seem to be their retinal positions, their size as defined by their edges, and their color. Recently, we showed that honeybees can use landmarks that are statically camouflaged, suggesting that motion cues are relevant as well. Currently it is unclear how bees weight these different landmark features when accomplishing navigational tasks, and whether this depends on their saliency. Since natural objects are often distinguished by their texture, we investigate the behavioral relevance and the interplay of the spatial configuration and the texture of landmarks. We show that landmark texture is a feature that bees memorize, and that being given the opportunity to identify landmarks by their texture improves the bees’ navigational performance. Landmark texture is weighted more strongly than landmark configuration when it provides the bees with positional information and when the texture is salient. In the vicinity of the landmark, honeybees changed their flight behavior according to its texture.


Communicative & Integrative Biology | 2011

Static and dynamic snapshots for goal localization in insects

Laura Dittmar

Bees, wasps and ants navigate successfully between feeding sites and their nest, despite the small size of their brains, which contain fewer than a million neurons. A long history of studies examining the role of visual memories in homing behavior shows that insects can localise a goal by finding a close match between a memorized view at the goal location and their current view (“snapshot matching”). However, the concept of static snapshot matching might not explain all aspects of homing behavior, as honeybees are able to use landmarks that are statically camouflaged. In this case the landmarks are only detectable by relative motion cues between the landmark and the background, which the bees generate when they perform characteristic flight manoeuvres close to the landmarks. The bees’ navigation performance can be explained by a matching scheme based on optic flow amplitudes (“dynamic snapshot matching”). In this article, I will discuss the concept of dynamic snapshot matching in the light of previous literature.
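
A rough illustration of the dynamic snapshot idea: the sketch below compares a panorama of optic flow amplitudes memorized at the goal with the panoramas experienced at candidate positions and selects the position with the smallest mismatch. The toy panoramas and the root-mean-square error measure are assumptions made for illustration; the published model may differ in detail.

    import numpy as np

    def snapshot_mismatch(current, memorized):
        # Root-mean-square difference between two optic-flow-amplitude panoramas
        # (1D arrays of flow magnitude per azimuthal viewing direction).
        return np.sqrt(np.mean((current - memorized) ** 2))

    rng = np.random.default_rng(1)
    memorized_flow = rng.random(360)          # flow amplitudes stored at the goal
    candidate_flows = {                       # flow amplitudes at test positions
        "at_goal": memorized_flow + 0.01 * rng.standard_normal(360),
        "displaced": rng.random(360),
    }

    # The position whose flow panorama best matches the memorized one wins.
    best = min(candidate_flows,
               key=lambda k: snapshot_mismatch(candidate_flows[k], memorized_flow))
    print(best)                               # expected: "at_goal"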


Frontiers in Behavioral Neuroscience | 2014

Visual motion-sensitive neurons in the bumblebee brain convey information about landmarks during a navigational task

Marcel Mertes; Laura Dittmar; Martin Egelhaaf; Norbert Boeddeker

Bees use visual memories to find the spatial location of previously learnt food sites. Characteristic learning flights help to acquire these memories at newly discovered foraging locations, where landmarks (salient objects in the vicinity of the goal location) can play an important role in guiding the animals’ homing behavior. Although behavioral experiments have shown that bees can use a variety of visual cues to distinguish objects as landmarks, the question of how landmark features are encoded by the visual system is still open. Recently, it could be shown that motion cues are sufficient to allow bees to localize their goal using landmarks that can hardly be discriminated from the background texture. Here, we tested the hypothesis that motion-sensitive neurons in the bees’ visual pathway provide information about such landmarks during a learning flight and might, thus, play a role in goal localization. We tracked learning flights of free-flying bumblebees (Bombus terrestris) in an arena with distinct visual landmarks, reconstructed the visual input during these flights, and replayed ego-perspective movies to tethered bumblebees while recording the activity of direction-selective wide-field neurons in their optic lobe. By comparing neuronal responses during a typical learning flight with responses to targeted modifications of landmark properties in this movie, we demonstrate that these objects are indeed represented in the bees’ visual motion pathway. We find that object-induced responses vary little with object texture, which is in agreement with behavioral evidence. These neurons thus convey information about landmark properties that are useful for view-based homing.


Animal Behaviour | 2014

Out of the box: how bees orient in an ambiguous environment

Laura Dittmar; Wolfgang Stürzl; Simon Jetzschke; Marcel Mertes; Norbert Boeddeker

How do bees employ multiple visual cues for homing? They could either combine the available cues using a view-based computational mechanism or pick one cue. We tested these strategies by training honeybees, Apis mellifera carnica, and bumblebees, Bombus terrestris, to locate food in one of the four corners of a box-shaped flight arena, providing multiple and also ambiguous cues. In tests, bees confused the diagonally opposite corners, which looked the same from the inside of the box owing to its rectangular shape and because these corners carried the same local colour cues. These ‘rotational errors’ indicate that the bees did not use compass information inferred from the geomagnetic field under our experimental conditions. When we then swapped cues between corners, bees preferred corners that had local cues similar to the trained corner, even when the geometric relations were incorrect. Apparently, they relied on views, a finding that we corroborated by computer simulations in which we assumed that bees try to match a memorized view of the goal location with the current view when they return to the box. However, when extra visual cues outside the box were provided, bees were able to resolve the ambiguity and locate the correct corner. We show that this performance cannot be explained by view matching from inside the box. Indeed, the bees adapted their behaviour and actively acquired information by leaving the arena and flying towards the cues outside the box. From there they re-entered the arena at the correct corner, now ignoring local cues that previously dominated their choices. All individuals of both species came up with this new behavioural strategy for solving the problem posed by the local ambiguity within the box. Thus both species seemed to solve the ambiguous task by using their route memory, which is always available during their natural foraging behaviour.
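
The view-matching simulations referred to above can be caricatured as follows: if the panoramic views at diagonally opposite corners of the rectangular box are (by construction) identical, a matcher that compares the current view with the view memorized at the trained corner cannot tell the trained corner from its diagonal opposite, reproducing the ‘rotational errors’. The toy views and the root-mean-square mismatch below are illustrative assumptions, not the simulation code used in the study.

    import numpy as np

    # Toy panoramic views (brightness per viewing direction) at the four corners
    # of a rectangular arena. By construction, diagonally opposite corners yield
    # identical views, mimicking the geometric and colour-cue ambiguity of the box.
    rng = np.random.default_rng(2)
    view_A = rng.random(360)
    view_B = rng.random(360)
    views = {"A": view_A, "B": view_B, "C": view_A.copy(), "D": view_B.copy()}

    memorized = views["A"]                    # view stored at the trained corner

    def mismatch(v, w):
        # Root-mean-square image difference between two panoramic views.
        return np.sqrt(np.mean((v - w) ** 2))

    scores = {corner: mismatch(v, memorized) for corner, v in views.items()}
    print(scores)  # corners A and C tie at zero: the model predicts rotational errors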


Proceedings of the Royal Society of London. Series B, Biological Sciences | 2010

The fine structure of honeybee head and body yaw movements in a homing task

Norbert Boeddeker; Laura Dittmar; Wolfgang Stürzl; Martin Egelhaaf


The Journal of Experimental Biology | 2010

Goal seeking in honeybees: matching of optic flow snapshots?

Laura Dittmar; Wolfgang Stürzl; Emily Baird; Norbert Boeddeker; Martin Egelhaaf


Bioinspiration & Biomimetics | 2010

Mimicking honeybee eyes with a 280° field of view catadioptric imaging system

Wolfgang Stürzl; Norbert Boeddeker; Laura Dittmar; Martin Egelhaaf


PLOS ONE | 2015

Bumblebee Homing: The Fine Structure of Head Turning Movements

Norbert Boeddeker; Marcel Mertes; Laura Dittmar; Martin Egelhaaf
