Publication


Featured research published by Alexis Pérez-Bellido.


Journal of Neurophysiology | 2013

Sound-driven enhancement of vision: disentangling detection-level from decision-level contributions.

Alexis Pérez-Bellido; Salvador Soto-Faraco; Joan López-Moliner

Cross-modal enhancement can be mediated both by higher-order effects due to attention and decision making and by detection-level, stimulus-driven interactions. However, the contribution of each of these sources to behavioral improvements has not been conclusively determined and quantified separately. Here, we apply a psychophysical analysis based on Piéron functions in order to separate stimulus-dependent changes from those accounted for by decision-level contributions. Participants performed a simple speeded visual detection task on Gabor patches of different spatial frequencies and contrast values, presented with and without accompanying sounds. On one hand, we identified an additive cross-modal improvement in mean reaction times across all types of visual stimuli that is well explained by interactions not strictly based on stimulus-driven modulations (e.g., reduction of temporal uncertainty and motor times). On the other hand, we singled out an audio-visual benefit that strongly depended on stimulus features such as spatial frequency and contrast. This particular enhancement was selective to low-spatial-frequency visual stimuli, which are optimized for magnocellular sensitivity. We therefore conclude that the detection-level and decision-level interactions contributing to audio-visual enhancement can be dissociated and are expressed in partly different aspects of visual processing.
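
The Piéron-function logic in this abstract can be made concrete. Below is a minimal, illustrative sketch (not the authors' analysis code; the contrast levels and reaction times are hypothetical) that fits Piéron's law, RT = t0 + beta * C^(-alpha), to mean reaction times per contrast in visual-only and audio-visual conditions. An additive sound effect surfaces in t0 (decision/motor level), whereas a contrast-dependent effect surfaces in beta or alpha (detection level).

import numpy as np
from scipy.optimize import curve_fit

def pieron(contrast, t0, beta, alpha):
    """Piéron's law: mean RT as a power function of stimulus contrast."""
    return t0 + beta * contrast ** (-alpha)

contrasts = np.array([0.05, 0.1, 0.2, 0.4, 0.8])

# Hypothetical mean RTs (seconds) for visual-only and audio-visual trials.
rt_visual = np.array([0.52, 0.45, 0.40, 0.37, 0.35])
rt_audiovisual = np.array([0.48, 0.41, 0.36, 0.33, 0.31])

for label, rts in [("V", rt_visual), ("AV", rt_audiovisual)]:
    (t0, beta, alpha), _ = curve_fit(pieron, contrasts, rts, p0=[0.3, 0.02, 0.5])
    print(f"{label}: t0={t0:.3f}s beta={beta:.3f} alpha={alpha:.2f}")

# Comparing t0 across conditions isolates the additive (decision/motor)
# component; comparing beta and alpha isolates the stimulus-driven component.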


Experimental Brain Research | 2014

On the ‘visual’ in ‘audio-visual integration’: a hypothesis concerning visual pathways

Philip Jaekl; Alexis Pérez-Bellido; Salvador Soto-Faraco

Crossmodal interaction conferring enhancement in sensory processing is now widely accepted. Such benefit is often exemplified by the neural response amplification reported in physiological studies conducted with animals, which parallels behavioural demonstrations of sound-driven improvement in visual tasks in humans. Yet a good deal of controversy still surrounds the nature and interpretation of these human psychophysical studies. Here, we consider the interpretation of crossmodal enhancement findings in the light of the functional and anatomical specialization of the magno- and parvocellular visual pathways, whose paramount relevance has been well established in visual research but is often overlooked in crossmodal research. We contend that a more explicit consideration of this important visual division may resolve some current controversies and help optimize the design of future crossmodal research.


Journal of Neurophysiology | 2015

Deconstructing multisensory enhancement in detection

Mario Pannunzi; Alexis Pérez-Bellido; Alexandre Pereda-Baños; Joan López-Moliner; Gustavo Deco; Salvador Soto-Faraco

The mechanisms responsible for the integration of sensory information from different modalities have become a topic of intense interest in psychophysics and neuroscience. Many authors now claim that early, sensory-based cross-modal convergence improves performance in detection tasks. An important strand of supporting evidence for this claim is based on statistical models such as the Pythagorean model or the probabilistic summation model. These models establish statistical benchmarks representing the best predicted performance under the assumption that there are no interactions between the two sensory paths. Following this logic, when observed detection performances surpass the predictions of these models, it is often inferred that such improvement indicates cross-modal convergence. We present a theoretical analysis scrutinizing some of these models and the statistical criteria most frequently used to infer early cross-modal interactions during detection tasks. Our analysis shows how some common misinterpretations of these models lead to their inadequate use and, in turn, to contradictory results and misleading conclusions. To further illustrate the latter point, we introduce a model that accounts for detection performances in multimodal detection tasks but for which surpassing of the Pythagorean or probabilistic summation benchmark can be explained without resorting to early cross-modal interactions. Finally, we report three experiments that put our theoretical interpretation to the test and further propose how to adequately measure multimodal interactions in audiotactile detection tasks.
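
Both "no-interaction" benchmarks named in this abstract have simple closed forms, sketched below with hypothetical numbers (not data from the paper): the Pythagorean model predicts the bimodal sensitivity d′_AV = sqrt(d′_A² + d′_V²) for independent channels, and probability summation predicts the bimodal hit rate P_AV = 1 − (1 − P_A)(1 − P_V).

import numpy as np

def pythagorean_dprime(d_a: float, d_v: float) -> float:
    """Predicted bimodal d' if the two unimodal channels are independent
    and their evidence is combined without interaction (Pythagorean model)."""
    return np.hypot(d_a, d_v)

def probability_summation(p_a: float, p_v: float) -> float:
    """Predicted bimodal hit rate if detection occurs whenever either
    independent channel detects the stimulus (probability summation)."""
    return 1.0 - (1.0 - p_a) * (1.0 - p_v)

d_a, d_v = 1.2, 1.0    # hypothetical unimodal sensitivities
p_a, p_v = 0.60, 0.55  # hypothetical unimodal hit rates

print(f"Pythagorean benchmark d'_AV: {pythagorean_dprime(d_a, d_v):.2f}")
print(f"Probability summation P_AV : {probability_summation(p_a, p_v):.2f}")

# Observed performance above these benchmarks is often taken as evidence of
# early cross-modal convergence; the paper argues this inference can fail.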


Journal of Neurophysiology | 2017

Auditory adaptation improves tactile frequency perception

Lexi E. Crommett; Alexis Pérez-Bellido; Jeffrey M. Yau

Our ability to process temporal frequency information by touch underlies our capacity to perceive and discriminate surface textures. Auditory signals, which also provide extensive temporal frequency information, can systematically alter the perception of vibrations on the hand. How auditory signals shape tactile processing is unclear; perceptual interactions between contemporaneous sounds and vibrations are consistent with multiple neural mechanisms. Here we used a crossmodal adaptation paradigm, which separated auditory and tactile stimulation in time, to test the hypothesis that tactile frequency perception depends on neural circuits that also process auditory frequency. We reasoned that auditory adaptation effects would transfer to touch only if signals from both senses converge on common representations. We found that auditory adaptation can improve tactile frequency discrimination thresholds. This occurred only when adaptor and test frequencies overlapped. In contrast, auditory adaptation did not influence tactile intensity judgments. Thus auditory adaptation enhances touch in a frequency- and feature-specific manner. A simple network model in which tactile frequency information is decoded from sensory neurons that are susceptible to auditory adaptation recapitulates these behavioral results. Our results imply that the neural circuits supporting tactile frequency perception also process auditory signals. This finding is consistent with the notion of supramodal operators performing canonical operations, like temporal frequency processing, regardless of input modality.

New & Noteworthy: Auditory signals can influence the tactile perception of temporal frequency. Multiple neural mechanisms could account for the perceptual interactions between contemporaneous auditory and tactile signals. Using a crossmodal adaptation paradigm, we found that auditory adaptation causes frequency- and feature-specific improvements in tactile perception. This crossmodal transfer of aftereffects between audition and touch implies that tactile frequency perception relies on neural circuits that also process auditory frequency.
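
The abstract's "simple network model" can be gestured at with a toy sketch of the same model class: a bank of frequency-tuned units whose gain is reduced by prior (auditory) adaptation near the adaptor frequency, with tactile frequency read out from the population. The version below is illustrative only; all parameters are invented, and the paper's actual model differs in detail.

import numpy as np

preferred = np.linspace(50, 400, 40)  # preferred frequencies (Hz)
sigma = 60.0                          # tuning width (Hz)

def population_response(freq, adaptor=None, adapt_strength=0.5):
    gain = np.ones_like(preferred)
    if adaptor is not None:
        # Units tuned near the adaptor lose gain after adaptation.
        gain -= adapt_strength * np.exp(-0.5 * ((preferred - adaptor) / sigma) ** 2)
    return gain * np.exp(-0.5 * ((freq - preferred) / sigma) ** 2)

def decode(response):
    # Population-vector readout: response-weighted mean preferred frequency.
    return np.sum(response * preferred) / np.sum(response)

test = 200.0
print("decoded, unadapted:", round(decode(population_response(test)), 1))
print("decoded, adapted at 200 Hz:",
      round(decode(population_response(test, adaptor=200.0)), 1))

# Frequency-specific adaptation reshapes the population profile around the
# adaptor; in the full model, this reshaping is what improves discrimination
# for test frequencies that overlap the adaptor.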


Seeing and Perceiving | 2012

Scrutinizing integrative effects in a multi-stimuli detection task

Mario Pannunzi; Alexis Pérez-Bellido; Alexandre Pereda Baños; Joan López-Moliner; Gustavo Deco; Salvador Soto-Faraco

The level of processing at which different modalities interact to either facilitate or interfere with detection has been a matter of debate for more than half a century. This question has mainly been addressed by means of statistical models (Green, 1958) or by biologically plausible models (Schnupp et al., 2005). One of the most widely accepted statistical frameworks is signal detection theory (SDT; Green and Swets, 1966), because it provides a straightforward way to assess whether two sensory stimuli are judged independently of one another, namely by testing whether the detectability (d′) of the compound stimulus exceeds the Pythagorean sum of the d′ values of its components. Here, we question this logic and propose a different baseline for evaluating integrative effects in multi-stimulus detection tasks, based on probabilistic summation. To this aim, we show how a simple theoretical hypothesis based on probabilistic summation can explain putative multisensory enhancement in an audio-tactile detection task. In addition, we illustrate how to measure integrative effects from multiple stimuli in two experiments, one using a multisensory audio-tactile detection task (Experiment 1) and another using a unimodal double-stimulus auditory detection task (Experiment 2). Results from Experiment 1 replicate extant multisensory detection data and also reject the hypothesis that auditory and tactile stimuli are integrated into a single percept that would produce such enhancement. In Experiment 2, we further support the probabilistic summation model using a unimodal integration detection task.


Seeing and Perceiving | 2012

Sounds prevent selective monitoring of high spatial frequency channels in vision

Alexis Pérez-Bellido; Salvador Soto-Faraco; Joan López-Moliner

Prior knowledge about the spatial frequency (SF) of upcoming visual targets (Gabor patches) speeds up average reaction times and decreases their standard deviation. This has often been regarded as evidence for multichannel processing of SF in vision. Multisensory research, on the other hand, has often reported the existence of sensory interactions between auditory and visual signals. These interactions result in enhancements in visual processing, leading to lower sensory thresholds and/or more precise visual estimates. However, little is known about how multisensory interactions may affect the uncertainty regarding visual SF. We conducted a reaction time study in which we manipulated the uncertainty about the SF of visual targets (SF was blocked or interleaved across trials) and compared visual-only versus audio-visual presentations. Surprisingly, the analysis of reaction times and their standard deviation revealed an impairment of the selective monitoring of the SF channel in the presence of a concurrent sound. Moreover, this impairment was especially pronounced when the relevant channels were high SFs at high visual contrasts. We propose that an accessory sound automatically favours visual processing of low SFs through the magnocellular channels, thereby detracting from the potential benefit of tuning into high-SF psychophysical channels.


I-perception | 2011

What is Sensory about Multi-Sensory Enhancement of Vision by Sounds?

Alexis Pérez-Bellido; Salvador Soto-Faraco; Joan López-Moliner

Can auditory input influence the sensory processing of visual information? Many studies have reported cross-modal enhancement in visual tasks, but the nature of such gain is still unclear. Some authors argue for ‘high-order’ expectancy or attention effects, whereas others propose ‘low-order’ stimulus-driven multisensory integration. The present study applies a psychophysical analysis of reaction time distributions in order to disentangle sensory changes from other kinds of high-order (not sensory-specific) effects. Observers performed a speeded simple detection task on Gabor patches of different spatial frequencies and contrasts, with and without accompanying sounds. The data were fitted with chronometric functions in order to separate changes in sensory evidence from changes in decision or motor times. The results supported the existence of a stimulus-unspecific auditory-induced enhancement in RTs across all types of visual stimuli, probably mediated by higher-order effects (e.g., reduction of temporal uncertainty). Critically, we also singled out a sensory gain that was selective to low-spatial-frequency stimuli, highlighting the role of the magnocellular visual pathway in multisensory integration for fast detection. The present findings help clarify previous mixed findings in the area and introduce a novel way to evaluate cross-modal enhancement.
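
As a related illustration of how a reaction time distribution can be decomposed into a shift-like and a shape-like component, the sketch below fits an ex-Gaussian distribution to synthetic RTs. This is not the chronometric-function method used in the paper, just a common, simpler stand-in: a sound that only shifts mu across conditions would point to unspecific (decision/motor) facilitation, whereas contrast-dependent changes would point to sensory-level gain.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic RTs: a Gaussian component plus an exponential tail (seconds).
rts = rng.normal(0.35, 0.04, 2000) + rng.exponential(0.08, 2000)

# scipy parameterizes the ex-Gaussian (exponnorm) by shape K = tau / sigma.
K, loc, scale = stats.exponnorm.fit(rts)
mu, sigma, tau = loc, scale, K * scale
print(f"mu={mu:.3f}s sigma={sigma:.3f}s tau={tau:.3f}s")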


Current Biology | 2014

Multisensory integration and attention in developmental dyslexia.

Vanessa Harrar; Jonathan Tammam; Alexis Pérez-Bellido; Anna Pitt; John F. Stein; Charles Spence


Journal of Vision | 2015

Visual limitations shape audio-visual integration.

Alexis Pérez-Bellido; Marc O. Ernst; Salvador Soto-Faraco; Joan López-Moliner


Archive | 2015

Neural Representations of Sinusoidal Amplitude and Frequency Modulations in the Primary Auditory Cortex of Awake Primates

Thomas Lu; Xiaoqin Wang; Salvador Soto-Faraco; Mario Pannunzi; Alexis Pérez-Bellido; Alexandre Pereda-Baños; Joan López-Moliner; Brian J. Malone; Ralph E. Beitel; Maike Vollmer; Marc A. Heiser; Christoph E. Schreiner; Joshua D. Downer; Mamiko Niwa; Mitchell L. Sutter

Collaboration


Dive into Alexis Pérez-Bellido's collaborations.

Top Co-Authors

Gustavo Deco (Pompeu Fabra University)
Jeffrey M. Yau (Baylor College of Medicine)
Lexi E. Crommett (Baylor College of Medicine)
Maike Vollmer (University of California)