Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Steffan Kennett is active.

Publication


Featured research published by Steffan Kennett.


Current Biology | 2001

Noninformative vision improves the spatial resolution of touch in humans

Steffan Kennett; Marisa Taylor-Clarke; Patrick Haggard

Research on sensory perception now often considers more than one sense at a time. This approach reflects real-world situations, such as when a visible object touches us. Indeed, vision and touch show great interdependence: the sight of a body part can reduce tactile target detection times [1], visual and tactile attentional systems are spatially linked [2], and the texture of surfaces that are actively touched with the fingertips is perceived using both vision and touch [3]. However, these previous findings might be mediated by spatial attention [1, 2] or by improved guidance of movement [3] via visually enhanced body position sense [4–6]. Here, we investigate the direct effects of viewing the body on passive touch. We measured tactile two-point discrimination thresholds [7] on the forearm while manipulating the visibility of the arm but holding gaze direction constant. The spatial resolution of touch was better when the arm was visible than when it was not. Tactile performance was further improved when the view of the arm was magnified. In contrast, performance was not improved by viewing a neutral object at the arm's location, ruling out improved spatial orienting as a possible account. Controls confirmed that no information about the tactile stimulation was provided by visibility of the arm. This visual enhancement of touch may point to online reorganization of tactile receptive fields.


Current Biology | 2002

Vision modulates somatosensory cortical processing

Marisa Taylor-Clarke; Steffan Kennett; Patrick Haggard

Over 150 years ago, E.H. Weber declared that experience showed that tactile acuity was not affected by viewing the stimulated body part. However, more recent investigations suggest that cross-modal links do exist between the senses. Viewing the stimulated body site improves performance on tactile discrimination and detection tasks and enhances tactile acuity. Here, we show that vision modulates somatosensory cortex activity, as measured by somatosensory event-related potentials (ERPs). This modulation is greatest when tactile stimulation is task relevant. Visual modulation is not present in the P50 component reflecting the primary afferent input to the cortex but appears in the subsequent N80 component, which has also been localized to SI, the primary somatosensory cortex. Furthermore, we replicate previous findings that noninformative vision improves spatial acuity. These results are consistent with a hypothesis that vision modulates cortical processing of tactile stimuli via back projections from multimodal cortical areas. Several neurophysiological studies suggest that primary and secondary somatosensory cortex (SI and SII, respectively) activity can be modulated by spatial and tactile attention and by visual cues. To our knowledge, this is the first demonstration of direct modulation of somatosensory cortex activity by a noninformative view of the stimulated body site with concomitant enhancement of tactile acuity in normal subjects.


Cognition | 2002

Tool-use changes multimodal spatial interactions between vision and touch in normal humans

Angelo Maravita; Charles Spence; Steffan Kennett; Jon Driver

In a visual-tactile interference paradigm, subjects judged whether tactile vibrations arose on a finger or thumb (upper vs. lower locations), while ignoring distant visual distractor lights that also appeared in upper or lower locations. Incongruent visual distractors (e.g. a lower light combined with upper touch) disrupt such tactile judgements, particularly when appearing near the tactile stimulus (e.g. on the same side of space as the stimulated hand). Here we show that actively wielding tools can change this pattern of crossmodal interference. When such tools were held in crossed positions (connecting the left hand to the right visual field, and vice-versa), the spatial constraints on crossmodal interference reversed, so that visual distractors in the other visual field now disrupted tactile judgements most for a particular hand. This phenomenon depended on active tool-use, developing with increased experience in using the tool. We relate these results to recent physiological and neuropsychological findings.


Journal of Cognitive Neuroscience | 2001

Tactile–Visual Links in Exogenous Spatial Attention under Different Postures: Convergent Evidence from Psychophysics and ERPs

Steffan Kennett; Martin Eimer; Charles Spence; Jon Driver

Tactile-visual links in spatial attention were examined by presenting spatially nonpredictive tactile cues to the left or right hand, shortly prior to visual targets in the left or right hemifield. To examine the spatial coordinates of any cross-modal links, different postures were examined. The hands were either uncrossed, or crossed so that the left hand lay in the right visual field and vice versa. Visual judgments were better on the side where the stimulated hand lay, though this effect was somewhat smaller with longer intervals between cue and target, and with crossed hands. Event-related brain potentials (ERPs) showed a similar pattern. Larger amplitude occipital N1 components were obtained for visual events on the same side as the preceding tactile cue, at ipsilateral electrode sites. Negativities in the Nd2 interval at midline and lateral central sites, and in the Nd1 interval at electrode Pz, were also enhanced for the cued side. As in the psychophysical results, ERP cueing effects during the crossed posture were determined by the side of space in which the stimulated hand lay, not by the anatomical side of the initial hemispheric projection for the tactile cue. These results demonstrate that crossmodal links in spatial attention can influence sensory brain responses as early as the N1, and that these links operate in a spatial frame-of-reference that can remap between the modalities across changes in posture.


Attention Perception & Psychophysics | 2002

Visuo-tactile links in covert exogenous spatial attention remap across changes in unseen hand posture.

Steffan Kennett; Charles Spence; Jon Driver

We investigated the effect of unseen hand posture on cross-modal, visuo-tactile links in covert spatial attention. In Experiment 1, a spatially nonpredictive visual cue was presented to the left or right hemifield shortly before a tactile target on either hand. To examine the spatial coordinates of any crossmodal cuing, the unseen hands were either uncrossed or crossed so that the left hand lay to the right and vice versa. Tactile up/down (i.e., index finger/thumb) judgments were better on the same side of external space as the visual cue, for both crossed and uncrossed postures. Thus, which hand was advantaged by a visual cue in a particular hemifield reversed across the different unseen postures. In Experiment 2, nonpredictive tactile cues now preceded visual targets. Up/down judgments for the latter were better on the same side of external space as the tactile cue, again for both postures. These results demonstrate cross-modal links between vision and touch in exogenous covert spatial attention that remap across changes in unseen hand posture, suggesting a modulatory role for proprioception.


Experimental Brain Research | 2004

Visual enhancement of touch in spatial body representation.

Clare Press; Marisa Taylor-Clarke; Steffan Kennett; Patrick Haggard

Perception of our own bodies is based on integration of visual and tactile inputs, notably by neurons in the brain's parietal lobes. Here we report a behavioural consequence of this integration process. Simply viewing the arm can speed up reactions to an invisible tactile stimulus on the arm. We observed this visual enhancement effect only when a tactile task required spatial computation within a topographic map of the body surface and the judgements made were close to the limits of performance. This effect of viewing the body surface was absent or reversed in tasks that either did not require a spatial computation or in which judgements were well above performance limits. We consider possible mechanisms by which vision may influence tactile processing.


Current Biology | 2003

Tactile perception, cortical representation and the bodily self

Patrick Haggard; Marisa Taylor-Clarke; Steffan Kennett

The human sense of touch relies on a specialised neural system for transmission of afferent input. Afferent inputs to primary somatosensory cortex result in tactile perceptions, characterised by their phenomenological vividness. Epistemologists have found these perceptions intriguing because they appear to be private: another person cannot know what I am feeling, because only ‘I’ am connected to my tactile receptors. However, epistemologists have confused the correct close connection of touch with the bodily self, with the incorrect idea of tactile perception as a raw sense datum. Neuroscience can clarify how tactile perception contributes to a conscious sense of the bodily self, by describing the neural mechanisms underlying the mutual and interactive relation between primary tactile perception and higher cortical representations of the body. Even highly abstract cognitive representations, such as ‘self’, may be understood in terms of their sensorimotor bases.


Neuroscience Letters | 2004

Persistence of visual–tactile enhancement in humans

Marisa Taylor-Clarke; Steffan Kennett; Patrick Haggard

We report two experiments in which non-informative vision of the finger enhanced tactile acuity on the fingertip. The right index finger was passively lifted to contact a grating. Twelve participants judged orientations of tactile gratings while viewing either the fingertip or a neutral object presented via a mirror at the fingertip's location. In Experiment 1, tactile orientation discrimination for near-threshold gratings was improved when viewing the fingertip, compared to viewing the neutral object. Experiment 2 examined the temporal persistence of this effect, and found significant visual–tactile enhancement when a dark interval of up to 10 s intervened between viewing the finger and tactile stimulation. These results suggest that viewing the body modulates the neural circuitry of primary somatosensory cortex, outlasting visual inputs.


Journal of Neurophysiology | 2012

Attentional selection of location and modality in vision and touch modulates low-frequency activity in associated sensory cortices

Markus Bauer; Steffan Kennett; Jon Driver

Selective attention allows us to focus on particular sensory modalities and locations. Relatively little is known about how attention to a sensory modality may relate to selection of other features, such as spatial location, in terms of brain oscillations, although it has been proposed that low-frequency modulation (α- and β-bands) may be key. Here, we investigated how attention to space (left or right) and attention to modality (vision or touch) affect ongoing low-frequency oscillatory brain activity over human sensory cortex. Magnetoencephalography was recorded while participants performed a visual or tactile task. In different blocks, touch or vision was task-relevant, whereas spatial attention was cued to the left or right on each trial. Attending to one or other modality suppressed α-oscillations over the corresponding sensory cortex. Spatial attention led to reduced α-oscillations over both sensorimotor and occipital cortex contralateral to the attended location in the cue-target interval, when either modality was task-relevant. Even modality-selective sensors also showed spatial-attention effects for both modalities. The visual and sensorimotor results were generally highly convergent, yet, although attention effects in occipital cortex were dominant in the α-band, in sensorimotor cortex, these were also clearly present in the β-band. These results extend previous findings that spatial attention can operate in a multimodal fashion and indicate that attention to space and modality both rely on similar mechanisms that modulate low-frequency oscillations.


Perception | 2004

Facilitated Processing of Visual Stimuli Associated with the Body

Louise Whiteley; Steffan Kennett; Marisa Taylor-Clarke; Patrick Haggard

Recent work on tactile perception has revealed enhanced tactile acuity and speeded spatial-choice reaction times (RTs) when viewing the stimulated body site as opposed to viewing a neutral object. Here we examine whether this body-view enhancement effect extends to visual targets. Participants performed a speeded spatial discrimination between two lights attached either to their own left index finger or to a wooden finger-shaped object, making a simple distal–proximal decision. We filmed either the finger-mounted or the object-mounted lights in separate experimental blocks and the live scene was projected onto a screen in front of the participants. Thus, participants responded to identical visual targets varying only in their context: on the body or not. Results revealed a large performance advantage for the finger-mounted stimuli: reaction times were substantially reduced, while discrimination accuracy was unaffected. With this finding we address concerns associated with previous work on the processing of stimuli attributed to the self and extend the finding of a performance advantage for such stimuli to vision.

Collaboration


Dive into Steffan Kennett's collaborations.

Top Co-Authors


Jon Driver

University College London

Patrick Haggard

University College London


Chris Rorden

University of South Carolina


Markus Bauer

University of Nottingham
