Publication


Featured research published by Charles Spence.


Attention Perception & Psychophysics | 2011

Crossmodal correspondences: A tutorial review

Charles Spence

In many everyday situations, our senses are bombarded by many different unisensory signals at any given time. To gain the most veridical, and least variable, estimate of environmental stimuli/properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping those estimates belonging to different objects or events separate. How, though, does the brain “know” which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the roles that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide yet another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities. For example, people consistently match high-pitched sounds with small, bright objects that are located high up in space. The literature reviewed here supports the view that crossmodal correspondences need to be considered alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.


Psychological Science | 2000

Visual Capture of Touch: Out-of-the-Body Experiences With Rubber Gloves

Francesco Pavani; Charles Spence; Jon Driver

When the apparent visual location of a body part conflicts with its veridical location, vision can dominate proprioception and kinesthesia. In this article, we show that vision can capture tactile localization. Participants discriminated the location of vibrotactile stimuli (upper, at the index finger, vs. lower, at the thumb), while ignoring distractor lights that could independently be upper or lower. Such tactile discriminations were slowed when the distractor light was incongruent with the tactile target (e.g., an upper light during lower touch) rather than congruent, especially when the lights appeared near the stimulated hand. The hands were occluded under a table, with all distractor lights above the table. The effect of the distractor lights increased when rubber hands were placed on the table, “holding” the distractor lights, but only when the rubber hands were spatially aligned with the participants’ own hands. In this aligned situation, participants were more likely to report the illusion of feeling touch at the rubber hands. Such visual capture of touch appears cognitively impenetrable.


Attention Perception & Psychophysics | 1997

Audiovisual links in exogenous covert spatial orienting

Charles Spence; Jon Driver

Subjects judged the elevation (up vs. down, regardless of laterality) of peripheral auditory or visual targets, following uninformative cues on either side with an intermediate elevation. Judgments were better for targets in either modality when preceded by an uninformative auditory cue on the side of the target. Experiment 2 ruled out nonattentional accounts for these spatial cuing effects. Experiment 3 found that visual cues affected elevation judgments for visual but not auditory targets. Experiment 4 confirmed that the effect on visual targets was attentional. In Experiment 5, visual cues produced spatial cuing when targets were always auditory, but saccades toward the cue may have been responsible. No such visual-to-auditory cuing effects were found in Experiment 6 when saccades were prevented, though they were present when eye movements were not monitored. These results suggest a one-way cross-modal dependence in exogenous covert orienting whereby audition influences vision, but not vice versa. Possible reasons for this asymmetry are discussed in terms of the representation of space within the brain.


Current Biology | 2000

Multisensory perception: Beyond modularity and convergence

Jon Driver; Charles Spence

Recent research on multisensory perception suggests a number of general principles for crossmodal integration, and indicates that the standard model in the field (feedforward convergence of information) must be modified to include a role for feedback projections from multimodal to unimodal brain areas.


Journal of Experimental Psychology: General | 2001

Multisensory prior entry

Charles Spence; David I. Shore; Raymond M. Klein

Despite 2 centuries of research, the question of whether attending to a sensory modality speeds the perception of stimuli in that modality has yet to be resolved. The authors highlight weaknesses inherent in this previous research and report the results of 4 experiments in which a novel methodology was used to investigate the effects on temporal order judgments (TOJs) of attending to a particular sensory modality or spatial location. Participants were presented with pairs of visual and tactile stimuli from the left and/or right at varying stimulus onset asynchronies and were required to make unspeeded TOJs regarding which stimulus appeared first. The results provide the strongest evidence to date for the existence of multisensory prior entry and support previous claims for attentional biases toward the visual modality and toward the right side of space. These findings have important implications for studies in many areas of human and animal cognition.


Trends in Cognitive Sciences | 1998

Attention and the crossmodal construction of space

Jon Driver; Charles Spence

Traditional studies of spatial attention consider only a single sensory modality at a time (e.g. just vision, or just audition). In daily life, however, our spatial attention often has to be coordinated across several modalities. This is a non-trivial problem, given that each modality initially codes space in entirely different ways. In the last five years, there has been a spate of studies on crossmodal attention. These have demonstrated numerous crossmodal links in spatial attention, such that attending to a particular location in one modality tends to produce corresponding shifts of attention in other modalities. The spatial coordinates of these crossmodal links illustrate that the internal representation of external space depends on extensive crossmodal integration. Recent neuroscience studies are discussed that suggest possible brain mechanisms for the crossmodal links in spatial attention.


Cognitive Processing | 2004

The body schema and multisensory representation(s) of peripersonal space

Nicholas P. Holmes; Charles Spence

To guide the movement of the body through space, the brain must constantly monitor the position and movement of the body in relation to nearby objects. The effective piloting of the body to avoid or manipulate objects in pursuit of behavioural goals requires an integrated neural representation of the body (the ‘body schema’) and of the space around the body (‘peripersonal space’). In the review that follows, we describe and evaluate recent results from neurophysiology, neuropsychology, and psychophysics in both human and non-human primates that support the existence of an integrated representation of visual, somatosensory, and auditory peripersonal space. Such a representation involves primarily visual, somatosensory, and proprioceptive modalities, operates in body-part-centred reference frames, and demonstrates significant plasticity. Recent research shows that the use of tools, and the viewing of one’s body or body parts in mirrors or on video monitors, may also modulate the visuotactile representation of peripersonal space.


Attention Perception & Psychophysics | 2001

The cost of expecting events in the wrong sensory modality.

Charles Spence; Michael E. R. Nicholls; Jon Driver

We examined the effects of modality expectancy on human performance. Participants judged azimuth (left vs. right location) for an unpredictable sequence of auditory, visual, and tactile targets. In some blocks, equal numbers of targets were presented in each modality. In others, the majority (75%) of the targets were presented in just one expected modality. Reaction times (RTs) for targets in an unexpected modality were slower than when that modality was expected or when no expectancy applied. RT costs associated with shifting attention from the tactile modality were greater than those for shifts from either audition or vision. Any RT benefits for the most likely modality were due to priming from an event in the same modality on the previous trial, not to the expectancy per se. These results show that stimulus-driven and expectancy-driven effects must be distinguished in studies of attending to different sensory modalities.


Proceedings of the National Academy of Sciences of the United States of America | 2008

Psychologically induced cooling of a specific body part caused by the illusory ownership of an artificial counterpart

G. Lorimer Moseley; Nick Olthof; Annemeike Venema; Sanneke Don; Marijke Wijers; Alberto Gallace; Charles Spence

The sense of body ownership represents a fundamental aspect of our self-awareness, but is disrupted in many neurological, psychiatric, and psychological conditions that are also characterized by disruption of skin temperature regulation, sometimes in a single limb. We hypothesized that skin temperature in a specific limb could be disrupted by psychologically disrupting the sense of ownership of that limb. In six separate experiments, and by using an established protocol to induce the rubber hand illusion, we demonstrate that skin temperature of the real hand decreases when we take ownership of an artificial counterpart. The decrease in skin temperature is limb-specific: it does not occur in the unstimulated hand, nor in the ipsilateral foot. The effect is not evoked by tactile or visual input per se, nor by simultaneous tactile and visual input per se, nor by a shift in attention toward the experimental side or limb. In fact, taking ownership of an artificial hand slows tactile processing of information from the real hand, which is also observed in patients who demonstrate body disownership after stroke. These findings of psychologically induced limb-specific disruption of temperature regulation provide the first evidence that: taking ownership of an artificial body part has consequences for the real body part; that the awareness of our physical self and the physiological regulation of self are closely linked in a top-down manner; and that cognitive processes that disrupt the sense of body ownership may in turn disrupt temperature regulation in numerous states characterized by both.


Consciousness and Cognition | 2008

The multisensory perception of flavor.

Malika Auvray; Charles Spence

Following on from ecological theories of perception, such as the one proposed by Gibson (1966, The senses considered as perceptual systems, Boston: Houghton Mifflin), this paper reviews the literature on the multisensory interactions underlying the perception of flavor in order to determine the extent to which it is really appropriate to consider flavor perception as a distinct perceptual system. We propose that the multisensory perception of flavor may be indicative of the fact that the taxonomy currently used to define our senses is simply not appropriate. According to the view outlined here, the act of eating allows the different qualities of foodstuffs to be combined into unified percepts, and flavor can be used as a term to describe the combination of tastes, smells, trigeminal and tactile sensations, as well as the visual and auditory cues, that we perceive when tasting food.

Collaboration


Dive into Charles Spence's collaborations.

Top Co-Authors

Carlos Velasco (BI Norwegian Business School)
Betina Piqueras-Fiszman (Wageningen University and Research Centre)
Jon Driver (University College London)
Salvador Soto-Faraco (University of British Columbia)
Ophelia Deroy (School of Advanced Study)