
Publication


Featured research published by Nicholas P. Holmes.


The Journal of Neuroscience | 2005

Touching a Rubber Hand: Feeling of Body Ownership Is Associated with Activity in Multisensory Brain Areas

H. Henrik Ehrsson; Nicholas P. Holmes; Richard E. Passingham

In the “rubber-hand illusion,” the sight of brushing of a rubber hand at the same time as brushing of the person's own hidden hand is sufficient to produce a feeling of ownership of the fake hand. We have shown previously that this illusion is associated with activity in multisensory areas, most notably the ventral premotor cortex (Ehrsson et al., 2004). However, it remains to be demonstrated that this illusion does not simply reflect the dominant role of vision, and that the premotor activity does not reflect a visual representation of an object near the hand. To address these issues, we introduce a somatic rubber-hand illusion. The experimenter moved the blindfolded participant's left index finger so that it touched the fake hand and simultaneously touched the participant's real right hand, synchronizing the touches as perfectly as possible. After ∼9.7 s, this stimulation elicited an illusion that one was touching one's own hand. We scanned brain activity during this illusion and two control conditions using functional magnetic resonance imaging. Activity in the ventral premotor cortices, intraparietal cortices, and the cerebellum was associated with the illusion of touching one's own hand. Furthermore, the rated strength of the illusion correlated with the degree of premotor and cerebellar activity. This finding suggests that activity in these areas reflects the detection of congruent multisensory signals from one's own body, rather than visual representations. We propose that this could be the mechanism for the feeling of body ownership.


Cognitive Processing | 2004

The body schema and multisensory representation(s) of peripersonal space

Nicholas P. Holmes; Charles Spence

To guide the movement of the body through space, the brain must constantly monitor the position and movement of the body in relation to nearby objects. The effective piloting of the body to avoid or manipulate objects in pursuit of behavioural goals requires an integrated neural representation of the body (the ‘body schema’) and of the space around the body (‘peripersonal space’). In the review that follows, we describe and evaluate recent results from neurophysiology, neuropsychology, and psychophysics in both human and non-human primates that support the existence of an integrated representation of visual, somatosensory, and auditory peripersonal space. Such a representation involves primarily visual, somatosensory, and proprioceptive modalities, operates in body-part-centred reference frames, and demonstrates significant plasticity. Recent research shows that the use of tools and the viewing of one's body or body parts in mirrors and video monitors may also modulate the visuotactile representation of peripersonal space.


The Journal of Neuroscience | 2007

Is That Near My Hand? Multisensory Representation of Peripersonal Space in Human Intraparietal Sulcus

Tamar R. Makin; Nicholas P. Holmes; Ehud Zohary

Our ability to interact with the immediate surroundings depends not only on an adequate representation of external space but also on our ability to represent the location of objects with respect to our own body, and especially to our hands. Indeed, electrophysiological studies in monkeys revealed multimodal neurons with spatially corresponding tactile and visual receptive fields in a number of brain areas, suggesting a representation of visual peripersonal space with respect to the body. In this functional magnetic resonance imaging study, we localized areas in human intraparietal sulcus (IPS) and lateral occipital complex (LOC) that represent nearby visual space with respect to the hands (perihand space), by contrasting the response to a ball moving near to versus far from the hands. Furthermore, by independently manipulating sensory information about the hand, in the visual (using a dummy hand) and proprioceptive domains (by changing the unseen hand position), we determined the sensory contributions to the representation of hand-centered space. In the posterior IPS, the visual contribution was dominant, overriding proprioceptive information. Surprisingly, regions within LOC also displayed visually dominant, hand-related activation. In contrast, the anterior IPS was characterized by a proprioceptive representation of the hand, as well as showing tactile hand-specific activation, suggesting a homology with monkey parietal hand-centered areas. We therefore suggest that, whereas cortical regions within the posterior IPS and LOC represent hand-centered space in a predominantly visual manner, the anterior IPS uses multisensory information in representing perihand space.


Attention, Perception, & Psychophysics | 2006

Reaching with alien limbs: Visual exposure to prosthetic hands in a mirror biases proprioception without accompanying illusions of ownership

Nicholas P. Holmes; Hendrikus J. Snijders; Charles Spence

In five experiments, we investigated the effects of visual exposure to a real hand, a rubber hand, or a wooden block on reaching movements made with the unseen left hand behind a parasagittal mirror. Participants reached from one of four starting positions, corresponding to four levels of conflict between the proprioceptively and visually specified positions of the reaching hand. Reaching movements were affected most by exposure to the real hand, intermediately by the rubber hand, and least of all by the wooden block. When the posture and/or movement of the visible hand was incompatible with that of the reaching hand, the effect on reaching was reduced. A “rubber hand illusion” questionnaire revealed that illusions of ownership of the rubber hand were not strongly correlated with reaching performance. This research suggests that proprioception is recalibrated following visual exposure to prosthetic hands and that this recalibration is independent of the rubber hand illusion.


Current Biology | 2005

Multisensory integration: space, time and superadditivity.

Nicholas P. Holmes; Charles Spence

The superior colliculus generates and controls eye and head movements based on signals from different senses. The latest research on this structure enhances our understanding of the mechanisms of multisensory integration in the brain.


Cognitive, Affective, & Behavioral Neuroscience | 2004

When mirrors lie: "visual capture" of arm position impairs reaching performance.

Nicholas P. Holmes; Gemma Crozier; Charles Spence

If we stand at a mirror’s edge, we can see one half of our body reflected in the mirror, as if it were the other half of our body, seen “through” the mirror. We used this mirror illusion to examine the effect of conflicts between visually and proprioceptively specified arm positions on subsequent reaching movements made with the unseen right arm. When participants viewed their static left arm in the mirror (i.e., as if it were their right arm), subsequent right-arm reaching movements were affected significantly more when there was conflict between the apparent visual and the proprioceptively specified right-arm positions than when there was no conflict. This result demonstrates that visual capture of arm position can occur when individual body parts are viewed in the mirror and that this capture has a measurable effect on subsequent reaching movements made with an unseen arm. The result has implications for how the brain represents the body across different sensory modalities.


The Journal of Neuroscience | 2009

Coding of Visual Space during Motor Preparation: Approaching Objects Rapidly Modulate Corticospinal Excitability in Hand-Centered Coordinates

Tamar R. Makin; Nicholas P. Holmes; Claudio Brozzoli; Yves Rossetti; Alessandro Farnè

Defensive behaviors, such as withdrawing your hand to avoid potentially harmful approaching objects, rely on rapid sensorimotor transformations between visual and motor coordinates. We examined the reference frame for coding visual information about objects approaching the hand during motor preparation. Subjects performed a simple visuomanual task while a task-irrelevant distractor ball rapidly approached a location either near to or far from their hand. After the appearance of the distractor ball, single pulses of transcranial magnetic stimulation were delivered over the subjects' primary motor cortex, eliciting motor-evoked potentials (MEPs) in their responding hand. MEP amplitude was reduced when the ball approached near the responding hand, both when the hand was to the left and to the right of the midline. Strikingly, this suppression occurred very early, at 70–80 ms after ball appearance, and was not modified by visual fixation location. Furthermore, it was selective for approaching balls, since static visual distractors did not modulate MEP amplitude. Together with additional behavioral measurements, we provide converging evidence for automatic hand-centered coding of visual space in the human brain.


Neuropsychologia | 2014

Effects of action observation on corticospinal excitability: Muscle specificity, direction, and timing of the mirror response.

Katherine R. Naish; Carmel Houston-Price; Andrew J. Bremner; Nicholas P. Holmes

Many human behaviours and pathologies have been attributed to the putative mirror neuron system, a neural system that is active during both the observation and execution of actions. While there are now a very large number of papers on the mirror neuron system, variations in the methods and analyses employed by researchers mean that the basic characteristics of the mirror response are not clear. This review focuses on three important aspects of the mirror response, as measured by modulations in corticospinal excitability: (1) muscle specificity; (2) direction; and (3) timing of modulation. We focus mainly on electromyographic (EMG) data gathered following single-pulse transcranial magnetic stimulation (TMS), because this method provides precise information regarding these three aspects of the response. Data from paired-pulse TMS paradigms and peripheral nerve stimulation (PNS) are also considered when we discuss the possible mechanisms underlying the mirror response. In this systematic review of the literature, we examine the findings of 85 TMS and PNS studies of the human mirror response, and consider the limitations and advantages of the different methodological approaches these have adopted in relation to discrepancies between their findings. We conclude by proposing a testable model of how action observation modulates corticospinal excitability in humans. Specifically, we propose that action observation elicits an early, non-specific facilitation of corticospinal excitability (at around 90 ms from action onset), followed by a later modulation of activity specific to the muscles involved in the observed action (from around 200 ms). Testing this model will greatly advance our understanding of the mirror mechanism and provide a more stable grounding on which to base inferences about its role in human behaviour.
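The proposed two-stage time course can be sketched as a toy function. This is an illustration of the qualitative claims in the abstract only, not the authors' quantitative model; the effect magnitudes are arbitrary placeholders:

```python
def mep_modulation(t_ms: float, muscle_involved: bool) -> float:
    """Toy sketch of the proposed time course of corticospinal excitability
    during action observation: relative MEP amplitude change (arbitrary
    units) at t_ms milliseconds after observed action onset."""
    effect = 0.0
    if t_ms >= 90:
        # Early facilitation, common to all muscles (non-specific stage).
        effect += 0.1
    if t_ms >= 200 and muscle_involved:
        # Later modulation specific to muscles involved in the observed action.
        effect += 0.2
    return effect

# Before ~90 ms there is no modulation; between ~90 and ~200 ms all muscles
# are facilitated equally; from ~200 ms only the involved muscle differs.
print(mep_modulation(50, True))                 # no modulation yet
print(mep_modulation(120, False))               # non-specific facilitation
print(round(mep_modulation(250, True), 2))      # specific + non-specific
```

Under this sketch, the model is testable by comparing MEPs from involved versus uninvolved muscles at probe times before and after the ~200 ms boundary.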


Neuropsychologia | 2007

The law of inverse effectiveness in neurons and behaviour: Multisensory integration versus normal variability

Nicholas P. Holmes

Multisensory research is often interpreted according to three rules: the spatial rule, the temporal rule, and the law of inverse effectiveness. The spatial and temporal rules state that multisensory stimuli are integrated when their environmental sources occur at similar locations and times, respectively. The law of inverse effectiveness states that multisensory stimuli are integrated in inverse proportion to the effectiveness of the best unisensory response. Neurally, these rules are grounded in anatomical and physiological mechanisms. By contrast, behavioural evidence often contradicts these rules, and direct links between multisensory neurons and multisensory behaviour remain unclear. This note discusses evidence supporting the law of inverse effectiveness, and reports a simulation of a behavioural experiment recently published in Neuropsychologia. The simulation reveals an alternative, statistical explanation for the data. I conclude that the law of inverse effectiveness only sometimes applies, and that the choice of statistical analysis can have profound effects on whether the data abide by the law.
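The statistical alternative at issue, regression to the mean arising from normal trial-to-trial variability, can be illustrated with a small simulation. This is a sketch of the general artifact, not a reproduction of the published simulation; all numbers are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 10_000

# Fixed true response levels (arbitrary units). The multisensory response is
# simply larger than the unisensory one by a constant amount, so there is no
# genuine inverse effectiveness built into the data.
true_uni, true_multi, noise_sd = 10.0, 20.0, 3.0

uni = true_uni + rng.normal(0.0, noise_sd, n_trials)
multi = true_multi + rng.normal(0.0, noise_sd, n_trials)

# Per-trial "multisensory enhancement", then a split of trials by the
# measured (noisy) unisensory effectiveness, as many analyses do.
enhancement = multi - uni
order = np.argsort(uni)
weak = enhancement[order[: n_trials // 2]].mean()    # least effective half
strong = enhancement[order[n_trials // 2 :]].mean()  # most effective half

# Because the noisy unisensory measurement appears on both sides of the
# split, enhancement looks larger on weak trials: spurious inverse
# effectiveness from noise alone.
print(weak > strong)  # → True
```

Sorting on a noisy variable and then computing a difference that contains that same variable guarantees the negative relationship, which is why the choice of analysis matters for whether data appear to obey the law.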


Experimental Brain Research | 2007

Tool use changes multisensory interactions in seconds: evidence from the crossmodal congruency task

Nicholas P. Holmes; Gemma A. Calvert; Charles Spence

Active tool use in human and non-human primates has been claimed to alter the neural representations of multisensory peripersonal space. To date, most studies suggest that a short period of tool use leads to an expansion or elongation of these spatial representations, which lasts several minutes after the last tool use action. However, the possibility that multisensory interactions also change on a much shorter time scale following or preceding individual tool use movements has not yet been investigated. We measured crossmodal (visual-tactile) congruency effects as an index of multisensory integration during two tool use tasks. In the regular tool use task, the participants used one of two tools in a spatiotemporally predictable sequence after every fourth crossmodal congruency trial. In the random tool use task, the required timing and spatial location of the tool use task varied unpredictably. Multisensory integration effects increased as a function of the number of trials since tool use in the regular tool use group, but remained relatively constant in the random tool use group. The spatial distribution of these multisensory effects, however, was unaffected by tool use predictability, with significant spatial interactions found only near the hands and at the tips of the tools. These data suggest that endogenously preparing to use a tool enhances visual-tactile interactions near the tools. Such enhancements are likely due to the increased behavioural relevance of visual stimuli as each tool use action is prepared before execution.

Collaboration


Top co-authors of Nicholas P. Holmes:

Tamar R. Makin (Hebrew University of Jerusalem)
Flavie Waters (University of Western Australia)