Publication


Featured research published by Neil W. Roach.


Proceedings of the Royal Society of London B: Biological Sciences | 2006

Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration

Neil W. Roach; James Heron; Paul V. McGraw

In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.
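The maximum-likelihood weighting scheme the abstract refers to has a standard form: each cue is weighted by its reliability (inverse variance), so the more reliable modality dominates the combined percept. A minimal numerical sketch, with hypothetical rate estimates and variances chosen purely for illustration:

```python
# Illustrative sketch of reliability-weighted (maximum-likelihood) cue
# combination. The numbers below are hypothetical, not values from the study.

def mle_combine(estimates, variances):
    """Combine sensory estimates, weighting each by its reliability
    (inverse variance). Returns the fused estimate and its variance."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * e for w, e in zip(weights, estimates))
    fused_variance = 1.0 / total  # fused estimate beats either cue alone
    return fused, fused_variance

# Example: an auditory rate estimate (more reliable) and a visual one
rate, var = mle_combine(estimates=[4.0, 5.0], variances=[0.5, 1.0])
```

Note that the fused variance is always smaller than the smallest single-cue variance, which is the benefit of integration; the abstract's point is that this averaging is not mandatory, but is gated by the prior probability that the two signals share a source.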


Perception | 2004

The tale is in the tail: An alternative hypothesis for psychophysical performance variability in dyslexia

Neil W. Roach; Veronica T. Edwards; John H. Hogben

Dyslexic groups have been reported to display poorer mean performance than groups of normal readers on a variety of psychophysical tasks. However, inspection of the distribution of individual scores for each group typically reveals that the majority of dyslexic observers actually perform within the normal range. Differences between group means often reflect the influence of a small number of dyslexic individuals who perform very poorly. While such findings are typically interpreted as evidence for specific perceptual deficiencies in dyslexia, caution in this approach is necessary. In this study we examined how general difficulties with task completion might manifest themselves in group psychophysical studies. Simulations of the effect of errant or inattentive trials on performance produced patterns of variability similar to those seen in dyslexic groups. Additionally, predicted relationships between the relative variability in dyslexic and control groups, and the magnitude of group differences bore close resemblance to the outcomes of a meta-analysis of empirical studies. These results suggest that general, nonsensory difficulties may underlie the poor performance of dyslexic groups on many psychophysical tasks. Implications and recommendations for future research are discussed.
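The core of the simulation logic described above can be sketched with a per-trial model in which an observer occasionally lapses and guesses at chance. The lapse rates, group sizes, and trial counts below are illustrative assumptions, not parameters from the study:

```python
import random

def simulated_percent_correct(n_trials, p_true=0.9, lapse_rate=0.0, rng=random):
    """Per-trial simulation of a two-alternative task: on a lapse the
    observer guesses (50% correct); otherwise performance follows p_true."""
    correct = 0
    for _ in range(n_trials):
        if rng.random() < lapse_rate:
            correct += rng.random() < 0.5
        else:
            correct += rng.random() < p_true
    return 100.0 * correct / n_trials

rng = random.Random(0)
# Most simulated observers lapse rarely; a small minority lapse often,
# producing the long poor-performance tail described in the abstract.
lapse_rates = [0.02] * 18 + [0.4, 0.5]
group = [simulated_percent_correct(200, lapse_rate=lr, rng=rng) for lr in lapse_rates]
controls = [simulated_percent_correct(200, lapse_rate=0.02, rng=rng) for _ in range(20)]
```

Even though the two groups share the same underlying sensitivity (`p_true`), the frequent-lapse observers drag the group minimum and mean down, mimicking a group-level "deficit" without any perceptual difference.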


Proceedings of the Royal Society of London B: Biological Sciences | 2011

Asynchrony adaptation reveals neural population code for audio-visual timing

Neil W. Roach; James Heron; David Whitaker; Paul V. McGraw

The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible—adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
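The three model ingredients listed above can be sketched in miniature: a handful of Gaussian channels tuned to different audio-visual delays, a centroid readout, and adaptation modelled as a gain reduction in the adapted channel. Channel count, tuning width, and gain values below are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def channel_response(delay_ms, preferred_ms, sigma_ms=80.0, gain=1.0):
    """Gaussian-tuned channel responding to an audio-visual delay."""
    return gain * math.exp(-0.5 * ((delay_ms - preferred_ms) / sigma_ms) ** 2)

PREFERRED = [-200, -100, 0, 100, 200]  # small set of delay-tuned channels (ms)

def decode(delay_ms, gains=None):
    """Read out perceived delay as the response-weighted average of
    channel preferences (a simple centroid decoder)."""
    gains = gains or {p: 1.0 for p in PREFERRED}
    responses = [channel_response(delay_ms, p, gain=gains[p]) for p in PREFERRED]
    return sum(r * p for r, p in zip(responses, PREFERRED)) / sum(responses)

# Adaptation to a +100 ms asynchrony is modelled as reduced gain in the
# channel tuned to that delay; decoding then shifts away from it.
adapted_gains = {p: (0.6 if p == 100 else 1.0) for p in PREFERRED}
baseline = decode(0.0)                        # physically simultaneous test
after_adaptation = decode(0.0, adapted_gains)
```

With equal gains the readout of a simultaneous stimulus is unbiased; after the gain reduction, the same physical stimulus is decoded as shifted away from the adapted delay, the repulsive after-effect pattern the abstract describes.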


Experimental Brain Research | 2012

Audiovisual time perception is spatially specific

James Heron; Neil W. Roach; James Vincent Michael Hanson; Paul V. McGraw; David Whitaker

Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.


Journal of Vision | 2008

Centrifugal propagation of motion adaptation effects across visual space

Paul V. McGraw; Neil W. Roach

Perceptual distortions induced by adaptation (aftereffects) arise through selective changes in the response properties of discrete subpopulations of neurons tuned to particular image features at the adapted spatial location. The systematic and well-documented increase of cortical receptive field sizes with eccentricity dictates that visual aftereffects ought to become less tightly tuned for location as stimuli are moved away from fixation. Here, we demonstrate that while this pattern holds for archetypal orientation and spatial frequency aftereffects, the effects of motion adaptation are characterized by precisely the opposite relationship. Surprisingly, adaptation to translational motion close to fixation induces distortions of perceived position and dynamic motion aftereffects that propagate centrifugally across visual space, resulting in a lack of location specificity. In contrast, motion adaptation in more peripheral locations produces aftereffects that are largely limited to the adapted spatial region. These findings suggest that central motion adaptation has the unique capacity to influence the response state of spatially distant neural populations that do not themselves encode the adapting stimulus.


European Journal of Neuroscience | 2010

Attention regulates the plasticity of multisensory timing

James Heron; Neil W. Roach; David Whitaker; James Vincent Michael Hanson

Evidence suggests that human time perception is likely to reflect an ensemble of recent temporal experience. For example, prolonged exposure to consistent temporal patterns can adaptively realign the perception of event order, both within and between sensory modalities (e.g. Fujisaki et al., 2004 Nat. Neurosci., 7, 773–778). In addition, the observation that 'a watched pot never boils' serves to illustrate the fact that dynamic shifts in our attentional state can also produce marked distortions in our temporal estimates. In the current study we provide evidence for a hitherto unknown link between adaptation, temporal perception and our attentional state. We show that our ability to use recent sensory history as a perceptual baseline for ongoing temporal judgments is subject to striking top-down modulation via shifts in the observer's selective attention. Specifically, attending to the temporal structure of asynchronous auditory and visual adapting stimuli generates a substantial increase in the temporal recalibration induced by these stimuli. We propose a conceptual framework accounting for our findings whereby attention modulates the perceived salience of temporal patterns. This heightened salience allows the formation of audiovisual perceptual 'objects', defined solely by their temporal structure. Repeated exposure to these objects induces high-level pattern adaptation effects, akin to those found in visual and auditory domains (e.g. Leopold & Bondar (2005) Fitting the Mind to the World: Adaptation and Aftereffects in High-Level Vision. Oxford University Press, Oxford, 189–211; Schweinberger et al. (2008) Curr. Biol., 18, 684–688).


Current Biology | 2011

Visual Motion Induces a Forward Prediction of Spatial Pattern

Neil W. Roach; Paul V. McGraw; Alan P Johnston

Cortical motion analysis continuously encodes image velocity but might also be used to predict future patterns of sensory input along the motion path. We asked whether this predictive aspect of motion is exploited by the human visual system. Targets can be more easily detected at the leading as compared to the trailing edge of motion [1], but this effect has been attributed to a nonspecific boost in contrast gain at the leading edge, linked to motion-induced shifts in spatial position [1–4]. Here we show that the detectability of a local sinusoidal target presented at the ends of a region containing motion is phase dependent at the leading edge, but not at the trailing edge. These two observations rule out a simple gain control mechanism that modulates contrast energy and passive filtering explanations, respectively. By manipulating the relative orientation of the moving pattern and target, we demonstrate that the resulting spatial variation in detection threshold along the edge closely resembles the superposition of sensory input and an internally generated predicted signal. These findings show that motion induces a forward prediction of spatial pattern that combines with the cortical representation of the future stimulus.


Cognitive Neuropsychology | 2006

Psychophysical indices of perceptual functioning in dyslexia: A psychometric analysis

Steve M. Heath; Dorothy V. M. Bishop; John H. Hogben; Neil W. Roach

An influential causal theory attributes dyslexia to visual and/or auditory perceptual deficits. This theory derives from group differences between individuals with dyslexia and controls on a range of psychophysical tasks, but there is substantial variation, both between individuals within a group and from task to task. We addressed two questions. First, do psychophysical measures have sufficient reliability to assess perceptual deficits in individuals? Second, do different psychophysical tasks measure a common underlying construct? We studied 104 adults with a wide range of reading ability and two comparison groups of 49 dyslexic adults and 41 adults with normal reading, measuring performance on four auditory and two visual tasks. We observed moderate to high test–retest reliability for most tasks. While people with dyslexia were more likely to display poor task performance, we were unable to demonstrate either construct validity for any of the current theories of perceptual deficits or predictive validity for reading ability. We suggest that deficient perceptual task performance in dyslexia may be an associated (and inconsistent) marker of underlying neurological abnormality, rather than being causally implicated in reading difficulties.


Journal of Neurophysiology | 2009

Dynamics of Spatial Distortions Reveal Multiple Time Scales of Motion Adaptation

Neil W. Roach; Paul V. McGraw

Prolonged exposure to consistent visual motion can significantly alter the perceived direction and speed of subsequently viewed objects. These perceptual aftereffects have provided invaluable tools with which to study the mechanisms of motion adaptation and draw inferences about the properties of underlying neural populations. Behavioral studies of the time course of motion aftereffects typically reveal a gradual process of adaptation spanning a period of multiple seconds. In contrast, neurophysiological studies have documented multiple motion adaptation effects operating over similar, or substantially faster (i.e., sub-second) time scales. Here we investigated motion adaptation by measuring time-dependent changes in the ability of moving stimuli to distort the perceived position of briefly presented static objects. The temporal dynamics of these motion-induced spatial distortions reveal the operation of two dissociable mechanisms of motion adaptation with differing properties. The first is rapid (subsecond), acts to limit the distortions induced by continuing motion, but is not sufficient to produce an aftereffect once the motion signal disappears. The second gradually accumulates over a period of seconds, does not modulate the size of distortions produced by continuing motion, and produces repulsive aftereffects after motion offset. These results provide new psychophysical evidence for the operation of multiple mechanisms of motion adaptation operating over distinct time scales.


Experimental Brain Research | 2008

Distortions of perceived auditory and visual space following adaptation to motion

Ross W. Deas; Neil W. Roach; Paul V. McGraw

Adaptation to visual motion can induce marked distortions of the perceived spatial location of subsequently viewed stationary objects. These positional shifts are direction specific and exhibit tuning for the speed of the adapting stimulus. In this study, we sought to establish whether comparable motion-induced distortions of space can be induced in the auditory domain. Using individually measured head related transfer functions (HRTFs) we created auditory stimuli that moved either leftward or rightward in the horizontal plane. Participants adapted to unidirectional auditory motion presented at a range of speeds and then judged the spatial location of a brief stationary test stimulus. All participants displayed direction-dependent and speed-tuned shifts in perceived auditory position relative to a ‘no adaptation’ baseline measure. To permit direct comparison between effects in different sensory domains, measurements of visual motion-induced distortions of perceived position were also made using stimuli equated in positional sensitivity for each participant. Both the overall magnitude of the observed positional shifts, and the nature of their tuning with respect to adaptor speed were similar in each case. A third experiment was carried out where participants adapted to visual motion prior to making auditory position judgements. Similar to the previous experiments, shifts in the direction opposite to that of the adapting motion were observed. These results add to a growing body of evidence suggesting that the neural mechanisms that encode visual and auditory motion are more similar than previously thought.

Collaboration


Neil W. Roach's research collaborations.

Top Co-Authors

Paul V. McGraw
University of Nottingham

Ben S. Webb
University of Nottingham

James Heron
University of Bradford

John H. Hogben
University of Western Australia

Chris Scholes
University of Nottingham