
Publication


Featured research published by Valeria Occelli.


Neuroscience & Biobehavioral Reviews | 2011

Audiotactile interactions in front and rear space.

Valeria Occelli; Charles Spence; Massimiliano Zampini

The last few years have seen a growing interest in the assessment of audiotactile interactions in information processing in peripersonal space. In particular, these studies have focused on investigating peri-hand space and, more recently, on the functional differences that have been demonstrated between the space close to the front and the back of the head (i.e., the peri-head space). In this review, we describe how audiotactile interactions vary as a function of the region of space in which stimuli are presented (i.e., front vs. rear, peripersonal vs. extrapersonal). We review evidence from both monkey and human studies. This evidence, providing insight into the differential attributes qualifying the frontal and rear regions of space, sheds light on a hitherto neglected research topic and may contribute to the formulation of new rehabilitative approaches to disorders of spatial representation. A tentative explanation of the evolutionary reasons underlying these patterns of results is also provided, together with suggestions for possible future developments.


Psychonomic Bulletin & Review | 2011

Audiotactile interactions in temporal perception

Valeria Occelli; Charles Spence; Massimiliano Zampini

In the present review, we focus on how commonalities in the ontogenetic development of the auditory and tactile sensory systems may inform the interplay between these signals in the temporal domain. In particular, we describe the results of behavioral studies that have investigated temporal resolution (in temporal order, synchrony/asynchrony, and simultaneity judgment tasks), as well as temporal numerosity perception, and similarities in the perception of frequency across touch and hearing. The evidence reviewed here highlights features of audiotactile temporal perception that are distinctive from those seen for other pairings of sensory modalities. For instance, audiotactile interactions are characterized in certain tasks (e.g., temporal numerosity judgments) by a more balanced reciprocal influence than are other modality pairings. Moreover, relative spatial position plays a different role in the temporal order and temporal recalibration processes for audiotactile stimulus pairings than for other modality pairings. The effect exerted by both the spatial arrangement of stimuli and attention on temporal order judgments is described. Moreover, a number of audiotactile interactions occurring during sensory-motor synchronization are highlighted. We also look at the audiotactile perception of rhythm and how it may be affected by musical training. The differences emerging from this body of research highlight the need for more extensive investigation into audiotactile temporal interactions. We conclude with a brief overview of some of the key issues deserving of further research in this area.


Neuroreport | 2009

Compatibility effects between sound frequency and tactile elevation.

Valeria Occelli; Charles Spence; Massimiliano Zampini

Participants made speeded discrimination responses to unimodal auditory stimuli (low- vs. high-frequency sounds) or vibrotactile stimuli (presented either to the index finger, in the upper location, or to the thumb, in the lower location). In the compatible blocks of trials, the implicitly related stimuli (i.e., higher-frequency sounds and upper tactile stimuli; lower-frequency sounds and lower tactile stimuli) were associated with the same response key; in the incompatible blocks, the weakly related stimuli (i.e., high-frequency sounds and lower tactile stimuli; low-frequency sounds and upper tactile stimuli) were associated with the same response key. Better performance was observed in the compatible (vs. incompatible) blocks, thus providing empirical support for a cross-modal association between the relative frequency of a sound and the relative elevation of a tactile stimulus.


Psychological Bulletin | 2013

Auditory, tactile, and audiotactile information processing following visual deprivation.

Valeria Occelli; Charles Spence; Massimiliano Zampini

We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing that visual deprivation affects the establishment of the spatial coordinate systems involved in the processing of auditory and tactile inputs within the peripersonal space around an individual. In blind individuals, the absence of a conjoint activation of external coordinate systems across modalities co-occurs with a higher capacity to direct auditory and tactile attentional resources to different spatial locations and to ignore irrelevant distractors. Both processes could thus contribute to the reduced spatial multisensory binding that has been observed in those who are blind. The interplay between auditory and tactile information in visually deprived individuals is modulated by attentional factors. Blind individuals typically outperform sighted people in those tasks where the target is presented in one sensory modality (and the other modality acts as a distractor). By contrast, they are less efficient in tasks explicitly requiring the combination of information across sensory modalities. The review highlights how these behavioral effects are subserved by extensive plastic changes at the neural level, with brain areas traditionally involved in visual functioning being recruited for the processing of stimuli in the intact residual senses. We also discuss the roles played by other intervening factors with regard to compensatory mechanisms, such as previous visual experience, age at onset of blindness, and learning effects.


Perception | 2013

The Takete–Maluma Phenomenon in Autism Spectrum Disorders

Valeria Occelli; Gianluca Esposito; Paola Venuti; Giuseppe Maurizio Arduino; Massimiliano Zampini

It has been reported that people tend to preferentially associate phonemes such as /m/, /l/, /n/ with curvilinear shapes and phonemes such as /t/, /z/, /r/, /k/ with rectilinear shapes. Here we evaluated the performance of children and adolescents with autism spectrum disorders (ASD) and neurotypical controls on this audiovisual congruency phenomenon. Pairs of visual patterns (curvilinear vs. rectilinear) were presented to a group of ASD participants (low- or high-functioning) and a group of age-matched neurotypical controls. Participants were asked to associate each item with non-meaningful phoneme clusters. ASD participants showed a lower proportion of expected association responses than the controls, and within the ASD group performance varied as a function of the severity of the symptomatology. These data suggest that children and adolescents with ASD show, albeit to degrees that depend on the severity of the disorder, weaker phonetic-iconic congruency response patterns than neurotypical controls, pointing to poorer multisensory integration capabilities.


Neuropsychologia | 2012

Audiovisual integration in low vision individuals.

Stefano Targher; Valeria Occelli; Massimiliano Zampini

Behavioral and neurophysiological studies have shown an enhancement of visual perception, in terms of both sensitivity and reaction times, under crossmodal audiovisual stimulation when the stimuli in the two sensory modalities are congruent in space and time. The purpose of the present work was to verify whether congruent visual and acoustic stimulation can improve the detection of visual stimuli in people affected by low vision. Participants were asked to detect the presence of a visual stimulus (yes/no task) either presented in isolation (i.e., unimodal visual stimulation) or simultaneously with auditory stimuli, which could occupy the same spatial position (i.e., crossmodal congruent conditions) or different spatial positions (i.e., crossmodal incongruent conditions). The results show, for the first time, audiovisual integration effects in low vision individuals. In particular, a significant visual detection benefit was observed in the crossmodal congruent condition as compared to the unimodal visual condition. This effect is selective for visual stimulation occurring in the impaired portion of the visual field, and disappears in the region of space in which vision is spared. Surprisingly, there is also a marginal crossmodal benefit when the sound is presented 16 degrees away from the visual stimulus. The observed crossmodal effect seems to be determined by the contribution of both senses under a model of optimal combination, in which the more reliable sense provides the greater contribution. These results, indicating a significant beneficial effect of synchronous and spatially congruent sounds in a visual detection task, seem very promising for the development of a rehabilitation approach for low vision based on the principles of multisensory integration.
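
The "model of optimal combination" invoked in this abstract is commonly formalized as maximum-likelihood, reliability-weighted cue integration. As a brief sketch of that textbook formulation (the specific equations are not spelled out in the paper itself), the bimodal estimate weights each unimodal estimate by its relative reliability:

\[
\hat{s}_{AV} = w_A\,\hat{s}_A + w_V\,\hat{s}_V,
\qquad
w_A = \frac{1/\sigma_A^{2}}{1/\sigma_A^{2} + 1/\sigma_V^{2}},
\qquad
w_V = 1 - w_A,
\]
\[
\sigma_{AV}^{2} = \frac{\sigma_A^{2}\,\sigma_V^{2}}{\sigma_A^{2} + \sigma_V^{2}} \;\le\; \min\!\left(\sigma_A^{2}, \sigma_V^{2}\right).
\]

On this account, as the visual noise \(\sigma_V^{2}\) grows in the impaired portion of the visual field, the auditory weight increases accordingly, which is consistent with the reported crossmodal benefit being confined to the impaired region.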


Quarterly Journal of Experimental Psychology | 2010

Assessing the effect of sound complexity on the audiotactile cross-modal dynamic capture task.

Valeria Occelli; Charles Spence; Massimiliano Zampini

Neurophysiological and behavioural evidence now shows that audiotactile interactions are more pronounced for complex auditory stimuli than for pure tones. In the present study, we examined the effect of varying the complexity of auditory stimuli (i.e., noise vs. pure tone) on participants’ performance in the audiotactile cross-modal dynamic capture task. Participants discriminated the direction of a target stream (tactile or auditory) while simultaneously trying to ignore the direction of a distracting apparent motion stream presented in the other sensory modality. The distractor stream could be either spatiotemporally congruent or incongruent with respect to the target stream on each trial. The results showed that sound complexity modulated performance, decreasing the accuracy of tactile direction judgements when the tactile targets were presented simultaneously with noise distractors, while facilitating judgements of the direction of the noise bursts (as compared to pure tones). Although auditory direction judgements were overall more accurate for noise (than for pure tone) targets, the complexity of the sound failed to modulate the tactile capture of auditory targets. These results provide the first demonstration of enhanced audiotactile interactions involving complex (vs. pure tone) auditory stimuli in the peripersonal space around the hands (previously these effects had only been reported in the space around the head).


Journal of Vision | 2018

Decoding auditory motion direction and location in hMT+/V5 and Planum Temporale of sighted and blind individuals

Ceren Battal; Mohamed Rezk; Stefania Mattioni; Roberto Bottini; Giorgia Bertonati; Valeria Occelli; Stefano Targher; Olivier Collignon

This research addresses the neural mechanisms of auditory motion processing and the impact of early visual deprivation on motion-responsive brain regions, using functional magnetic resonance imaging. Visual motion, and in particular direction selectivity, is one of the most investigated aspects of mammalian brain function. In comparison, little is known about how the brain processes moving sounds. More precisely, we have a poor understanding of how the human brain codes for the direction of auditory motion and how this process differs from auditory sound-source localization. In the first study, we characterized the neural representations of auditory motion within the Planum Temporale (PT), and how motion direction and sound-source location are represented within this auditory motion-responsive region. We further explored whether the distribution of orientation-responsive neurons (topographic representations) within the PT shares organizational features with the visual motion area MT/V5; if so, spatial representations would be more systematic for the axis of motion/space than for within-axis direction/location. Despite the shared representations between auditory spatial conditions, we show that motion directions and sound-source locations generate highly distinct patterns of activity. The second study focused on the impact of early visual deprivation on auditory motion processing. Studying visual deprivation-induced plasticity sheds light on how sensory experience alters the functional organization of motion-processing areas, and on how plasticity exploits the intrinsic computational biases implemented in cortical regions. In addition to enhanced auditory motion responses within hMT+/V5, we demonstrate that this region maintains direction-selective tuning but shifts its modality preference toward auditory input in the case of early blindness. Crucially, the enhanced computational role of hMT+/V5 is accompanied by a reduced role of the PT in processing both motion direction and sound-source location. These results suggest that early blindness triggers an interplay between visual and auditory motion areas, whose computational roles may be redistributed for the effective processing of auditory spatial tasks. Overall, our findings suggest (1) auditory motion-specific processing in the typically developed auditory cortex, and (2) an interplay between cross- and intramodal plasticity in computing auditory motion and space in early blind individuals.
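
The pattern "decoding" referred to above is typically implemented as a cross-validated linear classifier applied to voxel activity patterns. The following minimal Python sketch runs on synthetic data with illustrative dimensions (eight runs, 200 voxels, four motion directions; these are assumptions for illustration, not values from the study) and shows the general shape of such an analysis:

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_runs, trials_per_run, n_voxels = 8, 16, 200
n_trials = n_runs * trials_per_run

# Labels: four hypothetical motion directions (left/right/up/down).
y = rng.integers(0, 4, size=n_trials)
runs = np.repeat(np.arange(n_runs), trials_per_run)

# Synthetic voxel patterns: Gaussian noise plus a weak
# direction-specific template, standing in for real fMRI betas.
templates = rng.normal(size=(4, n_voxels))
X = rng.normal(size=(n_trials, n_voxels)) + 0.3 * templates[y]

# Leave-one-run-out cross-validation, as is standard in fMRI decoding:
# train on all runs but one, test on the held-out run.
clf = LinearSVC()
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.25)")

Above-chance held-out accuracy is what licenses statements such as "motion directions and sound-source locations generate highly distinct patterns of activity".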


Perception | 2017

The Role of Temporal Disparity on Audiovisual Integration in Low-Vision Individuals

Stefano Targher; Rocco Micciolo; Valeria Occelli; Massimiliano Zampini

Recent findings have shown that sounds improve visual detection in low vision individuals when the auditory and visual stimuli are presented simultaneously and from the same spatial position. The present study aims to investigate the temporal aspects of this previously reported audiovisual enhancement effect. Low vision participants were asked to detect the presence of a visual stimulus (yes/no task) presented either alone or together with an auditory stimulus at different stimulus onset asynchronies (SOAs). In the first experiment, the sound was presented either simultaneously with or before the visual stimulus (i.e., SOAs of 0, 100, 250, and 400 ms). The results show that the presence of a task-irrelevant auditory stimulus produced a significant visual detection enhancement in all of the conditions. In the second experiment, the sound was either synchronized with, or randomly preceded or lagged behind, the visual stimulus (i.e., SOAs of 0, ±250, and ±400 ms). Here the visual detection enhancement was reduced in magnitude and limited to the synchronous condition and to the condition in which the sound was presented 250 ms before the visual stimulus. Taken together, the evidence from the present study suggests that audiovisual interaction in low vision individuals is strongly modulated by top-down mechanisms.


Neuroscience & Biobehavioral Reviews | 2014

Corrigendum to "Audiotactile interactions in front and rear space" [Neurosci. Biobehav. Rev. 35 (3) (2011) 589-598]

Valeria Occelli; Charles Spence; Massimiliano Zampini

The authors regret to inform readers that a correction is required in the abstract of the article: the sentence “In particular, these studies have focused on investigating peri-head space and, more recently, on the functional differences that have been demonstrated between the space close to front and back of the head (i.e., the peri-head space).” should read “In particular, these studies have focused on investigating peri-hand space and, more recently, on the functional differences that have been demonstrated between the space close to front and back of the head (i.e., the peri-head space).”. The authors apologise for any inconvenience caused.

Collaboration


Dive into Valeria Occelli's collaborations.

Top Co-Authors

Gianluca Esposito

Nanyang Technological University
