Publications


Featured research published by Laurence R. Harris.


Experimental Brain Research | 2000

Visual and non-visual cues in the perception of linear self-motion.

Laurence R. Harris; Michael Jenkin; Daniel C. Zikovitz

Surprisingly little is known of the perceptual consequences of visual or vestibular stimulation in updating our perceived position in space as we move around. We assessed the roles of visual and vestibular cues in determining the perceived distance of passive, linear self-motion. Subjects were given cues to constant-acceleration motion: either optic flow presented in a virtual-reality display, physical motion in the dark, or combinations of visual and physical motions. Subjects indicated when they perceived they had traversed a distance that had previously been given to them either visually or physically. The perceived distance of motion evoked by optic flow was accurate relative to a previously presented visual target but was perceptually equivalent to about half the physical motion. The perceived distance of physical motion in the dark was accurate relative to a previously presented physical motion but was perceptually equivalent to a much longer visually presented distance. The perceived distance of self-motion when both visual and physical cues were present was more closely perceptually equivalent to the physical motion experienced than to the simultaneous visual motion, even when the target was presented visually. We discuss this dominance of the physical cues in determining the perceived distance of self-motion in terms of capture by non-visual cues. These findings are related to emerging studies that show the importance of vestibular input to neural mechanisms that process self-motion.


The Journal of Physiology | 1980

The superior colliculus and movements of the head and eyes in cats

Laurence R. Harris

The superior colliculus was studied in alert, restrained cats whose head and eye movements were monitored.


IEEE Virtual Reality Conference | 2001

Tolerance of temporal delay in virtual environments

Robert S. Allison; Laurence R. Harris; Michael Jenkin; Urszula Jasiobedzka; James E. Zacher

To enhance presence, facilitate sensorimotor performance, and avoid disorientation or nausea, virtual-reality applications require the perception of a stable environment. End-to-end tracking latency (display lag) degrades this illusion of stability and has been identified as a major fault of existing virtual-environment systems. Oscillopsia refers to the perception that the visual world appears to swim about or oscillate in space and is a manifestation of this loss of perceptual stability of the environment. The effects of end-to-end latency and head velocity on perceptual stability in a virtual environment were investigated psychophysically. Subjects became significantly more likely to report oscillopsia during head movements when end-to-end latency or head velocity was increased. It is concluded that perceptual instability of the world arises with increased head motion and increased display lag. Oscillopsia is expected to be more apparent in tasks requiring real locomotion or rapid head movement.
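
A back-of-the-envelope way to see why both factors matter (an illustration of ours, not the paper's analysis): with an end-to-end lag, the scene is drawn for a head pose that is slightly stale, so a world-fixed object is misplaced on the display by roughly the head velocity multiplied by the lag. The Python sketch below tabulates that angular error for hypothetical lags and head velocities.

    # Illustrative sketch only: approximate scene slip caused by rendering with a
    # stale head pose. The error grows with both head velocity and end-to-end latency,
    # consistent with the reported increase in oscillopsia with either factor.
    def angular_error_deg(head_velocity_deg_s: float, latency_s: float) -> float:
        """Approximate angular displacement (deg) of a world-fixed object on the display."""
        return head_velocity_deg_s * latency_s

    for latency_ms in (50, 100, 200):          # hypothetical end-to-end lags
        for velocity in (30, 90, 180):         # hypothetical head velocities (deg/s)
            err = angular_error_deg(velocity, latency_ms / 1000.0)
            print(f"lag {latency_ms:3d} ms, head {velocity:3d} deg/s -> ~{err:5.1f} deg of scene slip")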


Archive | 2001

Vision and Attention

Michael Jenkin; Laurence R. Harris

The term “visual attention” embraces many aspects of vision. It refers to processes that find, pull out, and may even help to define features in the visual environment. All these processes take the form of interactions between the observer and the environment: attention is drawn by some aspects of the visual scene, but the observer is critical in defining which aspects are selected.


Experimental Brain Research | 2006

The subjective visual vertical and the perceptual upright

Richard T. Dyde; Michael R. Jenkin; Laurence R. Harris

The direction of ‘up’ has traditionally been measured by setting a line (luminous if necessary) to the apparent vertical, a direction known as the ‘subjective visual vertical’ (SVV); however, for optimum performance in visual skills including reading and facial recognition, an object must be seen the ‘right way up’—a separate direction which we have called the ‘perceptual upright’ (PU). In order to measure the PU, we exploited the fact that some symbols rely upon their orientation for recognition. Observers indicated whether the symbol ‘ ’ presented in various orientations was identified as either the letter ‘p’ or the letter ‘d’. The average of the transitions between ‘p-to-d’ and ‘d-to-p’ interpretations was taken as the PU. We have labelled this new experimental technique the Oriented CHAracter Recognition Test (OCHART). The SVV was measured by estimating whether a line was rotated clockwise or counter-clockwise relative to gravity. We measured the PU and SVV while manipulating the orientation of the visual background in different observer postures: upright, right side down and (for the PU) supine. When the body, gravity and the visual background were aligned, the SVV and the PU were similar, but as the orientations of the background and the observer's posture diverged, the two measures varied markedly. The SVV was closely aligned with the direction of gravity whereas the PU was closely aligned with the body axis. Both probes showed influences of all three cues (body orientation, vision and gravity) and these influences could be predicted from a weighted vectorial sum of the directions indicated by these cues. For the SVV, the ratio was 0.2:0.1:1.0 for the body, visual and gravity cues, respectively. For the PU, the ratio was 2.6:1.2:1.0. In the case of the PU, these same weighting values were also predicted by a measure of the reliability of each cue; however, reliability did not predict the weightings for the SVV. This is the first time that maximum likelihood estimation has been demonstrated in combining information between different reference frames. The OCHART technique provides a new, simple and readily applicable method for investigating the PU which complements the SVV. Our findings suggest that OCHART is particularly suitable for investigating the functioning of visual and non-visual systems and their contributions to the perceived upright in novel environments such as high- and low-g conditions, and in patient and ageing populations, as well as in normal observers.
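
To make the weighted vectorial-sum model concrete, here is a minimal Python sketch that combines the three cue directions using the weight ratios quoted above; the cue angles in the example (an observer lying right side down viewing a tilted background) and the angular convention are our assumptions, not the paper's.

    # Minimal sketch of the weighted vectorial-sum model described in the abstract.
    # Each cue (body axis, visual background, gravity) contributes a unit vector pointing
    # at the "up" it indicates; the predicted SVV or PU is the direction of the weighted sum.
    # Weights are the published ratios (body : visual : gravity); cue angles are hypothetical.
    import math

    def predicted_upright_deg(cue_angles_deg, weights):
        """Direction (deg) of the weighted sum of unit vectors, one per cue."""
        x = sum(w * math.cos(math.radians(a)) for a, w in zip(cue_angles_deg, weights))
        y = sum(w * math.sin(math.radians(a)) for a, w in zip(cue_angles_deg, weights))
        return math.degrees(math.atan2(y, x))

    # Hypothetical scene: gravity defines 0 deg, the visual background indicates 45 deg,
    # and the observer lies right side down so the body axis indicates -90 deg.
    cues = (-90.0, 45.0, 0.0)                            # body, visual, gravity
    svv = predicted_upright_deg(cues, (0.2, 0.1, 1.0))   # SVV weights from the abstract
    pu = predicted_upright_deg(cues, (2.6, 1.2, 1.0))    # PU weights from the abstract
    print(f"predicted SVV ~ {svv:.1f} deg (stays close to gravity)")
    print(f"predicted PU  ~ {pu:.1f} deg (pulled toward the body axis)")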


Vision Research | 2001

Humans can use optic flow to estimate distance of travel

Fara Redlick; Michael Jenkin; Laurence R. Harris

We demonstrate that humans can use optic flow to estimate distance travelled when appropriate scaling information is provided. Eleven subjects were presented with visual targets in a virtual corridor. They were then provided with optic flow compatible with movement along the corridor and asked to indicate when they had reached the previously presented target position. Performance depended on the movement profile: for accelerations above 0.1 m/s² performance was accurate. Slower optic-flow acceleration resulted in an overestimation of motion, which was most pronounced for constant-velocity motion, where the overestimation reached 170%. The results are discussed in terms of the usual synergy between multiple sensory cues to motion and the factors that might contribute to such a pronounced miscalibration between optic flow and the resulting perception of motion.
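
One way to read that figure, offered purely as illustrative arithmetic and not as the paper's own analysis: if perceived travel distance is a gain g times the distance actually simulated by the optic flow, a subject asked to stop at a target D metres away should respond after roughly D / g metres of simulated travel, so a gain of about 1.7 means responding well short of the target. A hypothetical Python sketch:

    # Illustrative arithmetic only (our reading of the 170% figure, not the paper's analysis):
    # if perceived distance = gain * simulated distance, the subject reports reaching a
    # target D after about D / gain of simulated travel.
    def simulated_distance_at_response(target_m: float, perceptual_gain: float) -> float:
        return target_m / perceptual_gain

    target = 8.0  # hypothetical target distance in metres
    for gain, label in [(1.0, "acceleration > 0.1 m/s^2 (accurate)"),
                        (1.7, "constant velocity (~170% overestimate)")]:
        d = simulated_distance_at_response(target, gain)
        print(f"{label}: responds after ~{d:.1f} m of an {target:.0f} m target")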


Advances in psychology | 1984

Small Saccades to Double-Stepped Targets Moving in Two Dimensions

John M. Findlay; Laurence R. Harris

We studied saccadic eye tracking, concentrating on the situation in which a target makes a further jump while a saccade to a previous jump is being prepared. Such a perturbation affects the saccadic system in several ways. The spatial characteristics of the movement, amplitude and direction alike, can be modified on the basis of new information arriving up to 80 ms before the initiation of the saccade. The perturbation rarely produces any substantial effect on the trajectory of the movement itself, but the occasional exceptions reveal a goal-seeking feedback mechanism underlying saccade production.


Experimental Brain Research | 1987

Vestibular and optokinetic eye movements evoked in the cat by rotation about a tilted axis

Laurence R. Harris

Horizontal and vertical eye movements were recorded from cats in response to either (a) off-vertical axis rotation (OVAR) at a range of velocities (5–72 deg/s) and a range of tilts (0–60 deg) or (b) horizontal (with respect to the cat) optokinetic stimulation (10–80 deg/s), also around a range of tilted axes (0–60 deg). The responses to stopping either of these stimuli were also measured: post-rotatory nystagmus (PRN) following actual rotation, and optokinetic after-nystagmus (OKAN) following optokinetic stimulation. The response found during OVAR was a nystagmus with a bias slow-phase velocity that was sinusoidally modulated. The bias was dependent on the tilt and reached 50% of its maximum velocity (maximum was 73±23% of the table velocity) at a tilt of 16 deg. The phase of modulation in horizontal eye velocity bore no consistent relation to the angular rotation. The amplitude of this modulation was roughly correlated with the bias, with a slope of 0.13 (deg/s) modulation/(deg/s) bias velocity. There was also a low-velocity vertical bias with the slow phases upwardly directed. The vertical bias was also modulated and the amplitude depended on the bias velocity (0.27 (deg/s) modulation/(deg/s) bias velocity). When separated from the canal-dependent response, the build-up of the OVAR response had a time constant of 5.0±0.8 s. Following OVAR there was no decline in the time constant of PRN, which remained at the value measured during earth-vertical axis rotation (EVAR) (6.3±2 s). The peak amplitude of PRN was reduced, dependent on the tilt, reaching only 20% of its EVAR value for a tilt of 20 deg. When a measurable PRN was found, it was accompanied by a slowly emerging vertical component (time constant 5.4±2 s), the effect of which was to vector the PRN accurately onto the earth horizontal. OKN measured about a tilted axis showed no differences in magnitude or direction from EVAR OKN even for tilts as large as 60 deg. OKAN following optokinetic stimulation around a tilted axis appeared normal in the horizontal plane (with respect to the animal) but was accompanied by a slowly emerging (time constant 4.1±2 s) vertical component, the effect of which was to vector the overall OKAN response onto the earth horizontal for tilts less than 20 deg. These results are compared with data from monkey and man and discussed in terms of the involvement of the velocity storage mechanism.
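
As a purely illustrative aside (an assumption of ours, not analysis from the paper), time constants of this kind are conventionally read as single-exponential build-up or decay; the small Python sketch below shows what the quoted values imply for how quickly the OVAR bias builds up and PRN dies away.

    # Sketch only: interpret the quoted time constants as single exponentials,
    # bias(t) = B * (1 - exp(-t / tau)) for the OVAR bias build-up and
    # prn(t) = P0 * exp(-t / tau) for post-rotatory nystagmus decay.
    import math

    def fraction_built_up(t_s: float, tau_s: float = 5.0) -> float:
        """Fraction of the steady-state OVAR bias reached after t seconds (tau from the abstract)."""
        return 1.0 - math.exp(-t_s / tau_s)

    def fraction_remaining(t_s: float, tau_s: float = 6.3) -> float:
        """Fraction of the initial PRN slow-phase velocity left after t seconds (tau from the abstract)."""
        return math.exp(-t_s / tau_s)

    for t in (2, 5, 10, 20):
        print(f"t = {t:2d} s: OVAR bias at {100 * fraction_built_up(t):3.0f}% of steady state, "
              f"PRN at {100 * fraction_remaining(t):3.0f}% of its initial velocity")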


Nature | 1976

Visual contrast sensitivity of a 6-month-old infant measured by the evoked potential

Laurence R. Harris; Janette Atkinson; Oliver Braddick

The visual contrast sensitivity of the human infant has previously been measured using behavioural methods, in which psychophysical thresholds have been inferred from visual preferences [1]. In this paper we report a preliminary investigation of contrast sensitivity, as measured by the amplitude of the evoked cortical response, in a 6-month-old infant. These results are compared with behavioural measurements on the same infant, and with adult data obtained psychophysically and from evoked cortical responses.


Experimental Brain Research | 2005

Simultaneity constancy: detecting events with touch and vision.

Vanessa Harrar; Laurence R. Harris

What are the consequences of visual and tactile neural processing time differences when combining multisensory information about an event on the body’s surface? Visual information about such events reaches the brain at a time that is independent of the location of the event. However, tactile information about such events takes different amounts of time to be processed depending on the distance between the stimulated surface and the brain. To investigate the consequences of these differences, we measured reaction times to touches and lights on different parts of the body and the point of subjective simultaneity (PSS) for various combinations. The PSSs for pairs of stimuli were predicted by the differences in reaction times. When lights and touches were on different body parts (i.e. the hand and foot) a trend towards compensation for any processing time differences was found, such that simultaneity was veridically perceived. When stimuli were both on the foot, subjects perceived simultaneity when the light came on significantly earlier than the touch, despite similar processing times for these stimuli. When the stimuli were both on the hand, however, there was complete compensation for the significant processing time differences between the light and touch such that simultaneity was correctly perceived, a form of simultaneity constancy. To identify whether there was a single simultaneity constancy mechanism or multiple parallel mechanisms, we altered the PSS of an auditory-visual stimulus pair and looked for effects on the PSS of a visual-touch pair. After repeated exposure to a light/sound pair with a fixed time lag between them, there was no effect on the PSS of a touch-light pair, suggesting multiple parallel simultaneity constancy mechanisms.
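
The core quantitative prediction being tested is that, if the PSS is set by processing-time differences, the stimulus with the longer reaction time must be presented earlier by roughly that difference for the pair to feel simultaneous. A hypothetical Python sketch (the reaction times below are invented for illustration):

    # Sketch of the reaction-time prediction described in the abstract: the PSS should
    # equal the difference in processing times, so the slower-processed stimulus must
    # lead by that amount to be judged simultaneous. Values below are hypothetical.
    def predicted_pss_ms(rt_light_ms: float, rt_touch_ms: float) -> float:
        """Positive result: the light must lead the touch by this many ms to feel simultaneous."""
        return rt_light_ms - rt_touch_ms

    rt_light, rt_touch = 250.0, 220.0   # hypothetical reaction times (ms)
    lead = predicted_pss_ms(rt_light, rt_touch)
    print(f"Predicted: present the light about {lead:.0f} ms before the touch")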
