Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Massimiliano Zampini is active.

Publications


Featured research published by Massimiliano Zampini.


Neuron | 2009

Category-Specific Organization in the Human Brain Does Not Require Visual Experience

Bradford Z. Mahon; Stefano Anzellotti; Jens Schwarzbach; Massimiliano Zampini; Alfonso Caramazza

Distinct regions within the ventral visual pathway show neural specialization for nonliving and living stimuli (e.g., tools, houses versus animals, faces). The causes of these category preferences are widely debated. Using functional magnetic resonance imaging, we find that the same regions of the ventral stream that show category preferences for nonliving stimuli and animals in sighted adults show the same category preferences in adults who are blind since birth. Both blind and sighted participants had larger blood oxygen-level dependent (BOLD) responses in the medial fusiform gyrus for nonliving stimuli compared to animal stimuli and differential BOLD responses in lateral occipital cortex for animal stimuli compared to nonliving stimuli. These findings demonstrate that the medial-to-lateral bias by conceptual domain in the ventral visual pathway does not require visual experience in order to develop and suggest the operation of innately determined domain-specific constraints on the organization of object knowledge.


Attention Perception & Psychophysics | 2005

Audio-visual simultaneity judgments

Massimiliano Zampini; Steve Guest; David I. Shore; Charles Spence

The relative spatiotemporal correspondence between sensory events affects multisensory integration across a variety of species; integration is maximal when stimuli in different sensory modalities are presented from approximately the same position at about the same time. In the present study, we investigated the influence of spatial and temporal factors on audio-visual simultaneity perception in humans. Participants made unspeeded simultaneous versus successive discrimination responses to pairs of auditory and visual stimuli presented at varying stimulus onset asynchronies from either the same or different spatial positions using either the method of constant stimuli (Experiments 1 and 2) or psychophysical staircases (Experiment 3). The participants in all three experiments were more likely to report the stimuli as being simultaneous when they originated from the same spatial position than when they came from different positions, demonstrating that the apparent perception of multisensory simultaneity is dependent on the relative spatial position from which stimuli are presented.
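As a rough illustration of how such simultaneity-judgment data collected with the method of constant stimuli are commonly analyzed (a minimal sketch with hypothetical SOAs and response proportions, not the authors' actual analysis pipeline), one can fit a Gaussian to the proportion of "simultaneous" responses as a function of SOA: the peak of the curve gives the point of subjective simultaneity (PSS), and its width indexes the simultaneity window.

```python
# Illustrative sketch (not from the paper): estimating the point of subjective
# simultaneity (PSS) from method-of-constant-stimuli data, using hypothetical
# proportions of "simultaneous" responses at each audio-visual SOA.
import numpy as np
from scipy.optimize import curve_fit

def simultaneity_curve(soa, amplitude, pss, sigma):
    """Gaussian model of p('simultaneous') as a function of SOA (ms).
    Positive SOA = visual stimulus leads the auditory stimulus."""
    return amplitude * np.exp(-(soa - pss) ** 2 / (2 * sigma ** 2))

# Hypothetical data: SOAs (ms) and proportion of "simultaneous" responses.
soas = np.array([-200, -100, -50, 0, 50, 100, 200], dtype=float)
p_simultaneous = np.array([0.15, 0.55, 0.80, 0.90, 0.85, 0.60, 0.20])

params, _ = curve_fit(simultaneity_curve, soas, p_simultaneous,
                      p0=[1.0, 0.0, 100.0])
amplitude, pss, sigma = params
print(f"PSS = {pss:.1f} ms, width (sigma) = {sigma:.1f} ms")
```

Comparing the fitted curves for same-position versus different-position trials would then show the spatial modulation described above as a difference in the height or width of the simultaneity window.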


Attention Perception & Psychophysics | 2003

Multisensory temporal order judgments: When two locations are better than one

Charles Spence; Roland Baddeley; Massimiliano Zampini; Robert James; David I. Shore

In Experiment 1, participants were presented with pairs of stimuli (one visual and the other tactile) from the left and/or right of fixation at varying stimulus onset asynchronies and were required to make unspeeded temporal order judgments (TOJs) regarding which modality was presented first. When the participants adopted an uncrossed-hands posture, just noticeable differences (JNDs) were lower (i.e., multisensory TOJs were more precise) when stimuli were presented from different positions, rather than from the same position. This spatial redundancy benefit was reduced when the participants adopted a crossed-hands posture, suggesting a failure to remap visuotactile space appropriately. In Experiment 2, JNDs were also lower when pairs of auditory and visual stimuli were presented from different positions, rather than from the same position. Taken together, these results demonstrate that people can use redundant spatial cues to facilitate their performance on multisensory TOJ tasks and suggest that previous studies may have systematically overestimated the precision with which people can make such judgments. These results highlight the intimate link between spatial and temporal factors in determining our perception of the multimodal objects and events in the world around us.
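For readers unfamiliar with how the JND is obtained in such tasks, here is a minimal, hypothetical sketch (not the study's actual analysis): fit a cumulative Gaussian to the proportion of "visual first" responses across SOAs; the JND is conventionally half the interval between the 25% and 75% points of that curve, so a smaller JND means more precise temporal order judgments.

```python
# Illustrative sketch (not the authors' analysis): deriving a just noticeable
# difference (JND) from temporal order judgment data by fitting a cumulative
# Gaussian to the proportion of "visual first" responses at each SOA.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def toj_curve(soa, pss, sigma):
    """Cumulative Gaussian: p('visual first') as a function of SOA (ms)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

# Hypothetical data: SOAs (ms; positive = visual leads) and p('visual first').
soas = np.array([-120, -60, -30, 0, 30, 60, 120], dtype=float)
p_visual_first = np.array([0.05, 0.20, 0.35, 0.50, 0.70, 0.85, 0.95])

(pss, sigma), _ = curve_fit(toj_curve, soas, p_visual_first, p0=[0.0, 60.0])

# The JND is half the distance between the 25% and 75% points, which for a
# cumulative Gaussian is about 0.6745 * sigma.
jnd = 0.6745 * sigma
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```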


Neuroscience Letters | 2005

Audiovisual prior entry

Massimiliano Zampini; David I. Shore; Charles Spence

It is almost one hundred years since Titchener [E.B. Titchener, Lectures on the Elementary Psychology of Feeling and Attention, Macmillan, New York, 1908] published his influential claim that attending to a particular sensory modality (or location) can speed up the relative time of arrival of stimuli presented in that modality (or location). However, the evidence supporting the existence of prior entry is, to date, mixed. In the present study, we used an audiovisual simultaneity judgment task in an attempt to circumvent the potential methodological confounds inherent in previous research in this area. Participants made simultaneous versus successive judgment responses regarding pairs of auditory and visual stimuli at varying stimulus onset asynchronies (SOAs) using the method of constant stimuli. In different blocks of trials, the participants were instructed to attend either to the auditory or to the visual modality, or else to divide their attention equally between the two modalities. The probability of trials containing intramodal stimulus pairs (e.g., vision-vision or audition-audition) was increased in the focused attention blocks to encourage participants to follow the instructions. The perception of simultaneity was modulated by this attentional manipulation: Visual stimuli had to lead auditory stimuli by a significantly smaller interval for simultaneity to be perceived when attention was directed to vision than when it was directed to audition. These results provide the first unequivocal evidence for the existence of audiovisual prior entry.


Journal of Cognitive Neuroscience | 2007

Temporal Order is Coded Temporally in the Brain: Early Event-related Potential Latency Shifts Underlying Prior Entry in a Cross-modal Temporal Order Judgment Task

Jonas Vibell; Corinna Klinge; Massimiliano Zampini; Charles Spence; Anna C. Nobre

The speeding-up of neural processing associated with attended events (i.e., the prior-entry effect) has long been proposed as a viable mechanism by which attention can prioritize our perception and action. In the brain, this has been thought to be regulated through a sensory gating mechanism, increasing the amplitudes of early evoked potentials while leaving their latencies unaffected. However, the majority of previous research has emphasized speeded responding and has failed to emphasize fine temporal discrimination, thereby potentially lacking the sensitivity to reveal putative modulations in the timing of neural processing. In the present study, we used a cross-modal temporal order judgment task while shifting attention between the visual and tactile modalities to investigate the mechanisms underlying selective attention electrophysiologically. Our results indicate that attention can indeed speed up neural processes during visual perception, thereby providing the first electrophysiological support for the existence of prior entry.


Perception | 2007

The Role of Hand Size in the Fake-Hand Illusion Paradigm

Francesco Pavani; Massimiliano Zampini

When a hand (either real or fake) is stimulated in synchrony with our own hand concealed from view, the felt position of our own hand can be biased toward the location of the seen hand. This intriguing phenomenon relies on the brain's ability to detect statistical correlations in the multisensory inputs (i.e., visual, tactile, and proprioceptive), but it is also modulated by the pre-existing representation of one's own body. Nonetheless, researchers appear to have accepted the assumption that the size of the seen hand does not matter for this illusion to occur. Here we used a real-time video image of the participant's own hand to elicit the illusion, but we varied the hand size in the video image so that the seen hand was either reduced, veridical, or enlarged in comparison to the participant's own hand. The results showed that visible-hand size modulated the illusion, which was present for veridical and enlarged images of the hand, but absent when the visible hand was reduced. These findings indicate that very specific aspects of our own body image (i.e., hand size) can constrain the multisensory modulation of the body schema highlighted by the fake-hand illusion paradigm. In addition, they suggest an asymmetric tendency to acknowledge enlarged (but not reduced) images of body parts within our body representation.


Neuropsychologia | 2007

Auditory-somatosensory multisensory interactions in front and rear space

Massimiliano Zampini; Diego Torresan; Charles Spence; Micah M. Murray

The information conveyed by our senses can be combined to facilitate perception and behaviour. One focus of recent research has been on the factors governing such facilitatory multisensory interactions. The spatial register of neuronal receptive fields (RFs) appears to be a prerequisite for multisensory enhancement. In terms of auditory-somatosensory (AS) interactions, facilitatory effects on simple reaction times and on brain responses have been demonstrated in caudo-medial auditory cortices, both when auditory and somatosensory stimuli are presented to the same spatial location and also when they are separated by 100 degrees in frontal space. One implication is that these brain regions contain large spatial RFs. The present study further investigated this possibility and, in particular, the question of whether AS interactions are restricted to frontal space, since recent research has revealed some fundamental differences between the sensory processing of stimuli in front and rear space. Twelve participants performed a simple reaction time task to auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. The participants placed one of their arms in front of them and the other behind their backs. Loudspeakers were placed close to each hand. Thus, there were a total of eight stimulus conditions - four unisensory and four multisensory - including all possible combinations of posture and loudspeaker location. A significant facilitation of reaction times (RTs), exceeding that predicted by probability summation, was obtained following multisensory stimulation, irrespective of whether the stimuli were in spatial register or not. These results are interpreted in terms of the likely RF organization of previously identified auditory-somatosensory brain regions.
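To make the "probability summation" benchmark concrete, the following is a rough, self-contained sketch with simulated reaction times (the values and seeds are assumptions of mine, not the authors' data or code): the multisensory cumulative RT distribution is compared against the facilitation an independent race between the two unisensory channels would already produce, and only facilitation exceeding that bound counts as evidence of a genuine multisensory interaction.

```python
# Illustrative sketch (not the published analysis): comparing multisensory
# reaction times against the facilitation predicted by probability summation,
# i.e. an independent race between the auditory and somatosensory channels.
import numpy as np

def empirical_cdf(rts, t_grid):
    """Proportion of reaction times at or below each time point in t_grid."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

# Hypothetical reaction times (ms) for unisensory and multisensory trials.
rng = np.random.default_rng(0)
rt_auditory = rng.normal(260, 40, 200)
rt_somatosensory = rng.normal(280, 45, 200)
rt_multisensory = rng.normal(230, 35, 200)

t_grid = np.arange(150, 451, 10)
p_a = empirical_cdf(rt_auditory, t_grid)
p_s = empirical_cdf(rt_somatosensory, t_grid)
p_as = empirical_cdf(rt_multisensory, t_grid)

# Probability-summation prediction for independent channels:
# P(either channel has responded by t) = P_A(t) + P_S(t) - P_A(t) * P_S(t).
prediction = p_a + p_s - p_a * p_s

# Points where the observed multisensory CDF exceeds the prediction indicate
# facilitation beyond what statistical facilitation alone can explain.
violation = np.maximum(p_as - prediction, 0.0)
print(f"Maximum violation of the probability-summation bound: {violation.max():.3f}")
```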


Neurocase | 2004

Changes in spatial position of hands modify tactile extinction but not disownership of contralesional hand in two right brain-damaged patients

Valentina Moro; Massimiliano Zampini; Salvatore Maria Aglioti

Somatic misperceptions and misrepresentations, like supernumerary phantom limb and denial of ownership of a given body part, have typically been reported following damage to the right side of the brain. These symptoms typically occur with personal or extrapersonal neglect and extinction of left-sided stimuli, suggesting that all these different symptoms may be linked to the same neural substrate. In the present research, we tested two right brain-damaged (RBD) patients to find out whether changing the position of the hands in space influences tactile extinction and denial of ownership to the same extent. Results showed that manipulation of the spatial position of the hands reduces tactile extinction but leaves denial of ownership of the left hand unaffected. Such a dissociation suggests that delusional misperceptions may be independent from somatic neglect and that representation of hands in space and attribution of ownership are dynamically mapped in at least partly separate neural substrates.


Neuropsychologia | 2009

Auditory-somatosensory multisensory interactions are spatially modulated by stimulated body surface and acoustic spectra

Ana Tajadura-Jiménez; Norimichi Kitagawa; Aleksander Väljamäe; Massimiliano Zampini; Micah M. Murray; Charles Spence

Previous research has provided inconsistent results regarding the spatial modulation of auditory-somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participant's head. The results demonstrated a spatial modulation of auditory-somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e., around the head) where the stimuli were presented. The results of Experiment 3 demonstrated that sounds that contain high-frequency components are particularly effective in eliciting this auditory-somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory-somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented.


Neuroscience & Biobehavioral Reviews | 2011

Audiotactile interactions in front and rear space.

Valeria Occelli; Charles Spence; Massimiliano Zampini

The last few years have seen a growing interest in the assessment of audiotactile interactions in information processing in peripersonal space. In particular, these studies have focused on investigating peri-hand space and, more recently, on the functional differences that have been demonstrated between the space close to the front and the back of the head (i.e., the peri-head space). In this review, the issue of how audiotactile interactions vary as a function of the region of space in which stimuli are presented (i.e., front vs. rear, peripersonal vs. extra-personal) will be described. We review evidence from both monkey and human studies. This evidence, providing insight into the differential attributes qualifying the frontal and the rear regions of space, sheds light on a research topic that has until now been neglected and may help to contribute to the formulation of new rehabilitative approaches to disorders of spatial representation. A tentative explanation of the evolutionary reasons underlying these particular patterns of results, as well as suggestions for possible future developments, are also provided.

Collaboration


Dive into Massimiliano Zampini's collaborations.

Top Co-Authors

Alberto Gallace

University of Milano-Bicocca
