Publications


Featured research published by Mario Prsa.


Journal of Vision | 2015

Learning to integrate contradictory multisensory self-motion cue pairings

Mariia Kaliuzhna; Mario Prsa; Steven Gale; Stella J. Lee; Olaf Blanke

Humans integrate multisensory information to reduce perceptual uncertainty when perceiving the world and self. Integration fails, however, if a common causality is not attributed to the sensory signals, as would occur in conditions of spatiotemporal discrepancies. In the case of passive self-motion, visual and vestibular cues are integrated according to statistical optimality, yet the extent of cue conflicts that do not compromise this optimality is currently underexplored. Here, we investigate whether human subjects can learn to integrate two arbitrary, but co-occurring, visual and vestibular cues of self-motion. Participants made size comparisons between two successive whole-body rotations using only visual, only vestibular, and both modalities together. The vestibular stimulus provided a yaw self-rotation cue, the visual a roll (Experiment 1) or pitch (Experiment 2) rotation cue. Experimentally measured thresholds in the bimodal condition were compared with theoretical predictions derived from the single-cue thresholds. Our results show that human subjects combine and optimally integrate vestibular and visual information, each signaling self-motion around a different rotation axis (yaw vs. roll and yaw vs. pitch). This finding suggests that the experience of two temporally co-occurring but spatially unrelated self-motion cues leads to inferring a common cause for these two initially unrelated sources of information about self-motion. We discuss our results in terms of specific task demands, cross-modal adaptation, and spatial compatibility. The importance of these results for the understanding of bodily illusions is also discussed.
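The "statistical optimality" benchmark referenced above is usually the maximum-likelihood cue-combination rule, under which the bimodal discrimination threshold is predicted from the two single-cue thresholds. A minimal sketch of that standard prediction (illustrative, not the authors' analysis code; the function name and example values are assumptions):

```python
import numpy as np

def predicted_bimodal_threshold(sigma_visual: float, sigma_vestibular: float) -> float:
    """Maximum-likelihood prediction: the combined variance equals the product
    of the unimodal variances divided by their sum."""
    var_combined = (sigma_visual**2 * sigma_vestibular**2) / (sigma_visual**2 + sigma_vestibular**2)
    return float(np.sqrt(var_combined))

# Example: single-cue thresholds of 4 deg (visual) and 3 deg (vestibular)
# predict an optimal bimodal threshold of 2.4 deg.
print(predicted_bimodal_threshold(4.0, 3.0))  # 2.4
```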


Journal of Neurophysiology | 2015

Inference of perceptual priors from path dynamics of passive self-motion

Mario Prsa; Danilo Jimenez-Rezende; Olaf Blanke

The monitoring of one's own spatial orientation depends on the ability to estimate successive self-motion cues accurately. This process has come to be known as path integration. A feature of sequential cue estimation, in general, is that the history of previously experienced stimuli, or priors, biases perception. Here, we investigate how, during angular path integration, the prior imparted by the displacement path dynamics affects the translation of vestibular sensations into perceptual estimates. Subjects received successive whole-body yaw rotations and were instructed to report their position within a virtual scene after each rotation. The overall movement trajectory either followed a parabolic path or was devoid of explicit dynamics. In the latter case, estimates were biased toward the average stimulus prior and were well captured by an optimal Bayesian estimator model fit to the data. However, the use of parabolic paths reduced perceptual uncertainty, and a decrease of the average size of bias, and thus of the weight of the average stimulus prior, was observed over time. The produced estimates were, in fact, better accounted for by a model in which a prediction of rotation magnitude is inferred from the underlying path dynamics on each trial. Therefore, when passively displaced, we seem to be able to build, over time, an internal model of the vehicle's movement dynamics from sequential vestibular measurements. Our findings suggest that in ecological conditions, vestibular afference can be internally predicted, even when self-motion is not actively generated by the observer, thereby augmenting both the accuracy and precision of displacement perception.
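The "optimal Bayesian estimator" mentioned above belongs to a standard model class in which a noisy measurement is combined with a prior over stimulus magnitude. A minimal sketch of that class, assuming Gaussian prior and likelihood (the numbers and names are illustrative, not the fitted model parameters):

```python
def bayesian_rotation_estimate(measured_deg: float, sigma_meas: float,
                               prior_mean_deg: float, sigma_prior: float) -> float:
    """Posterior mean under Gaussian prior and likelihood: a precision-weighted
    average of the noisy measurement and the prior mean."""
    w_meas = (1 / sigma_meas**2) / (1 / sigma_meas**2 + 1 / sigma_prior**2)
    return w_meas * measured_deg + (1 - w_meas) * prior_mean_deg

# A noisy 40 deg rotation estimate is pulled toward a 25 deg average-stimulus
# prior; the pull grows as the measurement becomes less reliable.
print(bayesian_rotation_estimate(40.0, sigma_meas=8.0,
                                 prior_mean_deg=25.0, sigma_prior=10.0))  # ~34.1
```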


Neuron | 2017

Rapid Integration of Artificial Sensory Feedback during Operant Conditioning of Motor Cortex Neurons

Mario Prsa; Gregorio Luis Galiñanes; Daniel Huber

Neuronal motor commands, whether generating real or neuroprosthetic movements, are shaped by ongoing sensory feedback from the displacement being produced. Here we asked if cortical stimulation could provide artificial feedback during operant conditioning of cortical neurons. Simultaneous two-photon imaging and real-time optogenetic stimulation were used to train mice to activate a single neuron in motor cortex (M1), while continuous feedback of its activity level was provided by proportionally stimulating somatosensory cortex. This artificial signal was necessary to rapidly learn to increase the conditioned activity, detect correct performance, and maintain the learned behavior. Population imaging in M1 revealed that learning-related activity changes are observed in the conditioned cell only, which highlights the functional potential of individual neurons in the neocortex. Our findings demonstrate the capacity of animals to use an artificially induced cortical channel in a behaviorally relevant way and reveal the remarkable speed and specificity at which this can occur.
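The feedback scheme described, continuous readout of the conditioned neuron mapped proportionally onto stimulation of somatosensory cortex, can be illustrated with a simple clipped linear mapping. This is only a schematic sketch of the idea; the function name, the dF/F ceiling, and the power range are assumptions, not the published implementation:

```python
def feedback_intensity(dff: float, dff_max: float = 2.0, max_power: float = 1.0) -> float:
    """Map the conditioned neuron's normalized fluorescence (dF/F) onto a
    stimulation power between 0 and max_power, clipped to that range."""
    level = max(0.0, min(dff / dff_max, 1.0))
    return level * max_power

# e.g. a dF/F of 1.0 against a ceiling of 2.0 yields half of the maximum power.
print(feedback_intensity(1.0))  # 0.5
```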


Journal of Neurophysiology | 2016

Oscillatory neural responses evoked by natural vestibular stimuli in humans

Steven Gale; Mario Prsa; Aaron Schurger; Aurore C. Paillard; Bruno Herbelin; Jean-Philippe Guyot; Christophe Lopez; Olaf Blanke

While there have been numerous studies of the vestibular system in mammals, less is known about the brain mechanisms of vestibular processing in humans. In particular, of the studies that have been carried out in humans over the last 30 years, none has investigated how vestibular stimulation (VS) affects cortical oscillations. Here we recorded high-density electroencephalography (EEG) in healthy human subjects and a group of bilateral vestibular loss patients (BVPs) undergoing transient and constant-velocity passive whole body yaw rotations, focusing our analyses on the modulation of cortical oscillations in response to natural VS. The present approach overcame significant technical challenges associated with combining natural VS with human electrophysiology and reveals that both transient and constant-velocity VS are associated with a prominent suppression of alpha power (8-13 Hz). Alpha band suppression was localized over bilateral temporo-parietal scalp regions, and these alpha modulations were significantly smaller in BVPs. We propose that suppression of oscillations in the alpha band over temporo-parietal scalp regions reflects cortical vestibular processing, potentially comparable with alpha and mu oscillations in the visual and sensorimotor systems, respectively, opening the door to the investigation of human cortical processing under various experimental conditions during natural VS.
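Alpha-band (8-13 Hz) power suppression is the key measure here. A minimal sketch of how such band power is commonly quantified for a single EEG channel (illustrative only; the windowing parameters and synthetic data are assumptions, not the study's pipeline):

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(eeg: np.ndarray, fs: float) -> float:
    """Integrate the power spectral density of one channel between 8 and 13 Hz."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))  # 2 s Welch windows
    alpha = (freqs >= 8) & (freqs <= 13)
    return float(np.trapz(psd[alpha], freqs[alpha]))

# Example with synthetic data: 10 s of noise sampled at 512 Hz.
rng = np.random.default_rng(0)
print(alpha_band_power(rng.standard_normal(10 * 512), fs=512.0))
```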


Current Biology | 2018

Pupil Size Coupling to Cortical States Protects the Stability of Deep Sleep via Parasympathetic Modulation

Özge Yüzgeç; Mario Prsa; Robert A. Zimmermann; Daniel Huber

During wakefulness, pupil diameter can reflect changes in attention, vigilance, and cortical states. How pupil size relates to cortical activity during sleep, however, remains unknown. Pupillometry during natural sleep is inherently challenging since the eyelids are usually closed. Here, we present a novel head-fixed sleep paradigm in combination with infrared back-illumination pupillometry (iBip) allowing robust tracking of pupil diameter in sleeping mice. We found that pupil size can be used as a reliable indicator of sleep states and that cortical activity becomes tightly coupled to pupil size fluctuations during non-rapid eye movement (NREM) sleep. Pharmacological blocking experiments indicate that the observed pupil size changes during sleep are mediated via the parasympathetic system. We furthermore found that constrictions of the pupil during NREM episodes might play a protective role for the stability of sleep depth. These findings reveal a fundamental relationship between cortical activity and pupil size, which has so far been hidden behind closed eyelids.


bioRxiv | 2018

Frequency selective encoding of substrate vibrations in the somatosensory cortex

Mario Prsa; Daniel Huber

Sensing vibrations that propagate through solid substrates conveys fundamental information about moving objects and other nearby dynamic events. Here we report that neurons responsive to substrate vibrations applied to the mouse forelimb reveal a new way of representing frequency information in the primary somatosensory cortex (S1). In contrast to vibrotactile stimulation of primate glabrous skin, which produces temporally entrained spiking and frequency independent firing rates, we found that mouse S1 neurons rely on a different coding scheme: their spike rates are conspicuously tuned to a preferred frequency of the stimulus. Histology, peripheral nerve block and optogenetic tagging experiments furthermore reveal that these responses are associated with the activation of mechanoreceptors located in deep subdermal tissue of the distal forelimb. We conclude that the encoding of frequency information of substrate-borne vibrations in the mouse S1 might be analogous to the representation of pitch of airborne sound in auditory cortex.
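The rate code described, firing rates tuned to a preferred vibration frequency, implies that a neuron's preferred frequency can be read out as the tested frequency evoking its maximal mean rate. A toy sketch with made-up values (the frequencies and rates below are not data from the paper):

```python
import numpy as np

test_freqs_hz = np.array([100, 200, 400, 800, 1600])   # hypothetical test frequencies
mean_rates_hz = np.array([2.1, 5.4, 9.8, 4.0, 1.5])    # hypothetical mean firing rates

# The preferred frequency of a rate-tuned neuron is the tested frequency that
# evokes its highest mean firing rate.
preferred = test_freqs_hz[np.argmax(mean_rates_hz)]
print(f"preferred frequency: {preferred} Hz")           # 400 Hz in this example
```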


Neuropsychologia | 2018

Optimal visuo-vestibular integration for self-motion perception in patients with unilateral vestibular loss

Mariia Kaliuzhna; Steven Gale; Mario Prsa; Raphael Maire; Olaf Blanke

Unilateral vestibular loss (UVL) is accompanied by deficits in processing of visual and vestibular self-motion cues. The present study examined whether multisensory integration of these two types of information is, nevertheless, intact in such patients. Patients were seated on a rotating platform with a screen simulating 3D rotation in front of them and asked to judge the relative magnitude of two successive rotations in the yaw plane in three conditions: vestibular stimulation, visual stimulation, and bimodal stimulation (congruent stimuli from both modalities together). Similar to findings in healthy controls, UVL patients exhibited optimal multisensory integration during both ipsi- and contralesional rotations. The benefit of multisensory integration was more pronounced on the ipsilesional side. These results show that visuo-vestibular integration for passive self-motion is automatic and suggest that it functions without additional cognitive mechanisms, unlike more complex multisensory tasks such as postural control and spatial navigation, previously shown to be impaired in UVL patients.

Highlights: Patients with unilateral vestibular loss integrate visuo-vestibular cues. This integration is optimal. Optimality is maintained for contra- and ipsilesional rotations.


Multisensory Research | 2013

Rotating straight ahead or translating in circles: How we learn to integrate contradictory multisensory self-motion cue pairings

Mariia Kaliuzhna; Olaf Blanke; Mario Prsa

Humans integrate multisensory information to reduce perceptual uncertainty when perceiving the world (Hillis et al., 2002, 2004) and self (Butler et al., 2010; Prsa et al., 2012), and it has been shown that two multisensory cues are combined and give rise to a single percept only if attributed to the same causal event (Koerding et al., 2007; Parise et al., 2012; Shams and Beierholm, 2010). A growing body of literature studies the limits of such integration for bodily self-consciousness and the perception of self-location under normal and pathological conditions (Ionta et al., 2011). We extend this research by investigating whether human subjects can learn to integrate two arbitrary visual and vestibular cues of self-motion due to their temporal co-occurrence. We conducted two experiments (N = 8 each) in which whole-body rotations were used as the vestibular stimulus and optic flow as the visual stimulus. The vestibular stimulus provided a yaw self-rotation cue, the visual stimulus a roll (Experiment 1) or pitch (Experiment 2) rotation cue. Subjects made a relative size comparison between a standard rotation size and a variable test rotation size. Their discrimination performance was fit with a psychometric function and perceptual discrimination thresholds were extracted. We compared experimentally measured thresholds in the bimodal condition with theoretical predictions derived from the single-cue thresholds. Our results show that human subjects can learn to combine and optimally integrate vestibular and visual information, each signaling self-motion around a different rotation axis (yaw versus roll as well as pitch). This finding suggests that the experience of two temporally co-occurring but spatially unrelated self-motion cues leads to inferring a common cause for these two initially unrelated sources of information about self-motion.
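The threshold extraction described, fitting a psychometric function to the size-comparison judgments, typically amounts to fitting a cumulative Gaussian and taking its spread as the discrimination threshold. A minimal sketch with made-up response proportions (illustrative, not the authors' data or code):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(test_size, mu, sigma):
    """Cumulative-Gaussian probability of judging the test rotation as larger."""
    return norm.cdf(test_size, loc=mu, scale=sigma)

test_sizes = np.array([10, 14, 18, 22, 26, 30])             # test rotation sizes (deg)
p_larger   = np.array([0.05, 0.2, 0.45, 0.6, 0.85, 0.95])   # hypothetical proportions

(mu_hat, sigma_hat), _ = curve_fit(psychometric, test_sizes, p_larger, p0=[20.0, 5.0])
print(f"PSE: {mu_hat:.1f} deg, discrimination threshold: {sigma_hat:.1f} deg")
```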


Journal of Neurophysiology | 2012

Self-motion leads to mandatory cue fusion across sensory modalities

Mario Prsa; Steven Gale; Olaf Blanke


Journal of Neurophysiology | 2007

Visual-Vestibular Interaction Hypothesis for the Control of Orienting Gaze Shifts by Brain Stem Omnipause Neurons

Mario Prsa; Henrietta L. Galiana

Collaboration


Dive into Mario Prsa's collaborations.

Top Co-Authors

Olaf Blanke (École Polytechnique Fédérale de Lausanne)
Steven Gale (École Polytechnique Fédérale de Lausanne)
Danilo Jimenez-Rezende (École Polytechnique Fédérale de Lausanne)
Bruno Herbelin (École Polytechnique Fédérale de Lausanne)
Mariia Kaliuzhna (École Polytechnique Fédérale de Lausanne)
Stella J. Lee (École Polytechnique Fédérale de Lausanne)
Daniel M. Merfeld (Massachusetts Institute of Technology)
Aaron Schurger (École Polytechnique Fédérale de Lausanne)