Publication


Featured research published by Leland S. Stone.


Vision Research | 1992

Human speed perception is contrast dependent

Leland S. Stone; Peter Thompson

When two parallel gratings moving at the same speed are presented simultaneously, the lower-contrast grating appears slower. This misperception is evident across a wide range of contrasts (2.5-50%) and does not appear to saturate (e.g. a 50% contrast grating appears slower than a 70% contrast grating moving at the same speed). On average, a 70% contrast grating must be slowed by 35% to match a 10% contrast grating moving at 2 degrees/sec (N = 6). Furthermore, the effect is largely independent of the absolute contrast level and is a quasi-linear function of log contrast ratio. A preliminary parametric study shows that, although spatial frequency has little effect, relative orientation is important. Finally, the misperception of relative speed appears lessened when the stimuli to be matched are presented sequentially.
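The reported quasi-linear dependence on log contrast ratio can be illustrated with a single-gain matching rule. This is a back-of-the-envelope sketch, not the paper's fitted model; the gain `k = 0.41` is a hypothetical value chosen only so that a 70% grating must be slowed by roughly the reported 35% to match a 10% grating:

```python
import math

def matched_speed_ratio(c_test, c_ref, k=0.41):
    """Illustrative quasi-linear rule: the factor by which the
    higher-contrast grating (contrast c_test) must be slowed to match
    the perceived speed of a lower-contrast grating (contrast c_ref).
    k is a hypothetical gain, not a parameter fitted in the paper."""
    return 1.0 - k * math.log10(c_test / c_ref)
```

Because only the contrast ratio enters, the rule is independent of absolute contrast level: `matched_speed_ratio(0.35, 0.05)` equals `matched_speed_ratio(0.70, 0.10)` (both ratios are 7), mirroring the abstract's observation.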


Vision Research | 1994

A model of self-motion estimation within primate extrastriate visual cortex

John A. Perrone; Leland S. Stone

Perrone [(1992) Journal of the Optical Society of America A, 9, 177-194] recently proposed a template-based model of self-motion estimation which uses direction- and speed-tuned input sensors similar to neurons in area MT of primate visual cortex. Such an approach would generally require an unrealistically large number of templates (five continuous dimensions). However, because primates, including humans, have a number of oculomotor mechanisms which stabilize gaze during locomotion, we can greatly reduce the number of templates required (two continuous dimensions and one compressed and bounded dimension). We therefore refined the model to deal with the gaze-stabilization case and extended it to extract heading and relative depth simultaneously. The new model is consistent with previous human psychophysics and has the emergent property that its output detectors have similar response properties to neurons in area MST.
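The gaze-stabilization simplification can be sketched in a few lines: for a pinhole camera undergoing pure translation, the direction of flow at each image point depends only on heading, not on depth, so candidate-heading templates can be scored by direction agreement alone. This toy matcher is only in the spirit of the model; the cosine scoring and grid of candidates are illustrative stand-ins for the model's actual MT-like sensor pooling:

```python
import numpy as np

def estimate_heading(points, flow, candidates):
    """Toy template matcher for the translation-only (gaze-stabilized)
    case.  points: (N,2) image coordinates; flow: (N,2) observed flow
    vectors; candidates: (M,3) unit translation directions (Tx,Ty,Tz).
    For translation, flow at (x,y) is ((x*Tz-Tx)/Z, (y*Tz-Ty)/Z), so
    its direction is depth-independent and each candidate can be scored
    by summed cosine similarity between predicted and observed flow."""
    best, best_score = None, -np.inf
    for T in candidates:
        pred = np.column_stack([points[:, 0] * T[2] - T[0],
                                points[:, 1] * T[2] - T[1]])
        num = (pred * flow).sum(axis=1)
        den = np.linalg.norm(pred, axis=1) * np.linalg.norm(flow, axis=1) + 1e-12
        score = (num / den).sum()
        if score > best_score:
            best, best_score = T, score
    return best
```

Removing the rotation dimensions in this way is exactly why stabilized gaze shrinks the template space: only the two heading dimensions remain to be searched.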


Trends in Neurosciences | 1999

Tracking with the mind’s eye

Richard J. Krauzlis; Leland S. Stone

The two components of voluntary tracking eye movements in primates, pursuit and saccades, are generally viewed as relatively independent oculomotor subsystems that move the eyes in different ways using independent visual information. Although saccades have long been known to be guided by visual processes related to perception and cognition, only recently have psychophysical and physiological studies provided compelling evidence that pursuit is also guided by such higher-order visual processes, rather than by the raw retinal stimulus. Pursuit and saccades also do not appear to be entirely independent anatomical systems, but involve overlapping neural mechanisms that might be important for coordinating these two types of eye movement during the tracking of a selected visual object. Given that the recovery of objects from real-world images is inherently ambiguous, guiding both pursuit and saccades with perception could represent an explicit strategy for ensuring that these two motor actions are driven by a single visual interpretation.


Vision Research | 1990

Effect of contrast on the perceived direction of a moving plaid

Leland S. Stone; Andrew B. Watson; Jeffrey B. Mulligan

We performed a series of experiments examining the effect of contrast on the perception of moving plaids. This was done to test the hypothesis put forth by Adelson and Movshon (1982) that the human visual system determines the direction of a moving plaid in a two-stage process: decomposition into component motion followed by application of the intersection of constraints rule. Although there is recent evidence that the first tenet of their hypothesis is correct, i.e. that plaid motion is initially decomposed into the motion of the individual grating components (Movshon, Adelson, Gizzi & Newsome, 1986; Welch, 1989), the nature of the second-stage combination rule has not as yet been established. We found that when the gratings within the plaid are of different contrast, the perceived direction is not predicted by the intersection of constraints rule. There is a strong (up to 20 deg) bias in the direction of the higher-contrast grating. A revised model, which incorporates a contrast-dependent weighting of perceived grating speed as observed for 1-D patterns (Thompson, 1982), can quantitatively predict most of our results. We discuss our results in the context of various models of human visual motion processing and of physiological responses of neurons in the primate visual system.
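The intersection-of-constraints rule itself is just a 2x2 linear solve: each grating constrains the pattern velocity v to satisfy v·n = s along its unit normal n. In this minimal sketch, the contrast-dependent bias is mimicked simply by shrinking the perceived normal speed of the lower-contrast component, an illustrative stand-in for the paper's weighting scheme:

```python
import numpy as np

def ioc_velocity(theta1, s1, theta2, s2):
    """Intersection of constraints: the unique 2-D pattern velocity v
    with v . n1 = s1 and v . n2 = s2, where n_i is the unit normal of
    grating i (angle theta_i in radians) and s_i its normal speed."""
    n1 = np.array([np.cos(theta1), np.sin(theta1)])
    n2 = np.array([np.cos(theta2), np.sin(theta2)])
    return np.linalg.solve(np.vstack([n1, n2]), np.array([s1, s2]))

# Symmetric plaid, component normals at +/-60 deg, equal speeds:
v = ioc_velocity(np.pi / 3, 1.0, -np.pi / 3, 1.0)         # -> [2, 0], rightward
# Shrink the perceived speed of component 1 (as if it had lower
# contrast): the IOC solution tilts toward the other component's normal.
v_biased = ioc_velocity(np.pi / 3, 0.8, -np.pi / 3, 1.0)  # v_biased[1] < 0
```

The second call shows the qualitative effect the paper measured: unequal perceived component speeds pull the IOC direction toward the higher-contrast grating.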


Journal of Vision | 2003

Shared motion signals for human perceptual decisions and oculomotor actions

Leland S. Stone; Richard J. Krauzlis

A fundamental question in primate neurobiology is to understand to what extent motor behaviors are driven by shared neural signals that also support conscious perception or by independent subconscious neural signals dedicated to motor control. Although it has clearly been established that cortical areas involved in processing visual motion support both perception and smooth pursuit eye movements, it remains unknown whether the same or different sets of neurons within these structures perform these two functions. Examination of the trial-by-trial variation in human perceptual and pursuit responses during a simultaneous psychophysical and oculomotor task reveals that the direction signals for pursuit and perception are not only similar on average but also co-vary on a trial-by-trial basis, even when performance is at or near chance and the decisions are determined largely by neural noise. We conclude that the neural signal encoding the direction of target motion that drives steady-state pursuit and supports concurrent perceptual judgments emanates from a shared ensemble of cortical neurons.
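The trial-by-trial covariation described here can be quantified with a point-biserial correlation between the continuous pursuit measure and the binary perceptual choice. This generic analysis sketch assumes per-trial pursuit directions and 0/1 choices have already been extracted; it is not the paper's exact statistical procedure:

```python
import numpy as np

def choice_pursuit_covariation(pursuit_dirs, choices):
    """Point-biserial correlation between the per-trial pursuit
    direction (continuous) and the perceptual choice (0/1).  A positive
    value means pursuit and perception co-vary trial by trial, as they
    would if both were driven by a shared noisy motion signal."""
    pursuit = np.asarray(pursuit_dirs, float)
    c = np.asarray(choices, float)
    return np.corrcoef(pursuit, c)[0, 1]
```

The key diagnostic in the abstract is that this correlation stays positive even for a fixed stimulus at chance performance, where any covariation must come from shared neural noise rather than the stimulus.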


Vision Research | 1997

Human heading estimation during visually simulated curvilinear motion

Leland S. Stone; John A. Perrone

Recent studies have suggested that humans cannot estimate their direction of forward translation (heading) from the resulting retinal motion (flow field) alone when rotation rates are higher than approximately 1 deg/sec. It has been argued that either oculomotor or static depth cues are necessary to disambiguate the rotational and translational components of the flow field and, thus, to support accurate heading estimation. We have re-examined this issue using visually simulated motion along a curved path towards a layout of random points as the stimulus. Our data show that, in this curvilinear motion paradigm, five of six observers could estimate their heading relatively accurately and precisely (error and uncertainty < approximately 4 deg), even for rotation rates as high as 16 deg/sec, without the benefit of either oculomotor or static depth cues signaling rotation rate. Such performance is inconsistent with models of human self-motion estimation that require rotation information from sources other than the flow field to cancel the rotational flow.


Vision Research | 1998

Human motion perception and smooth eye movements show similar directional biases for elongated apertures

Brent R. Beutter; Leland S. Stone

Although numerous studies have examined the relationship between smooth-pursuit eye movements and motion perception, it remains unresolved whether a common motion-processing system subserves both perception and pursuit. To address this question, we simultaneously recorded perceptual direction judgments and the concomitant smooth eye-movement response to a plaid stimulus that we have previously shown generates systematic perceptual errors. We measured the perceptual direction biases psychophysically and the smooth eye-movement direction biases using two methods (standard averaging and oculometric analysis). We found that the perceptual and oculomotor biases were nearly identical, suggesting that pursuit and perception share a critical motion processing stage, perhaps in area MT or MST of extrastriate visual cortex.
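The "oculometric analysis" treats each trial's eye-movement direction like a psychophysical judgment, so a directional bias and threshold can be read off a probit fit exactly as for a psychometric function. A self-contained sketch, assuming per-trial stimulus and measured eye directions; the probit-regression shortcut and the clipping constants are illustrative choices, not the paper's exact procedure:

```python
import numpy as np
from statistics import NormalDist

def oculometric_curve(stim_dirs, eye_dirs, reference=0.0):
    """Build an 'oculometric function': for each stimulus direction,
    the fraction of trials whose measured eye direction fell on one
    side of the reference.  A linear fit in probit (z-score) space
    then yields the bias (direction judged 'straight') and threshold
    (1 s.d. of the fitted cumulative Gaussian), mirroring a
    psychometric fit."""
    stim_dirs = np.asarray(stim_dirs, float)
    eye_dirs = np.asarray(eye_dirs, float)
    levels = np.unique(stim_dirs)
    p = np.array([(eye_dirs[stim_dirs == s] > reference).mean() for s in levels])
    p = np.clip(p, 0.01, 0.99)                 # keep the probit transform finite
    z = np.array([NormalDist().inv_cdf(q) for q in p])
    slope, intercept = np.polyfit(levels, z, 1)
    bias = -intercept / slope
    threshold = 1.0 / slope
    return bias, threshold
```

Running the same fit on the perceptual judgments gives the psychometric bias and threshold, so the two systems can be compared in identical units.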


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2003

Saccadic and perceptual performance in visual search tasks. I. Contrast detection and discrimination.

Brent R. Beutter; Miguel P. Eckstein; Leland S. Stone

Humans use saccadic eye movements when they search for visual targets. We investigated the relationship between the visual processing used by saccades and perception during search by comparing saccadic and perceptual decisions under conditions in which each had access to equal visual information. We measured the accuracy of perceptual judgments and of the first search saccade over a wide range of target saliences [signal-to-noise ratios (SNRs)] in both a contrast-detection and a contrast-discrimination task. We found that saccadic and perceptual performances (1) were similar across SNRs, (2) showed similar task-dependent differences, and (3) were well described by a model based on signal detection theory that explicitly includes observer uncertainty [M. P. Eckstein et al., J. Opt. Soc. Am. A 14, 2406 (1997)]. Our results demonstrate that the accuracy of the first saccade provides much information about the observer's perceptual state at the time of the saccadic decision and provide evidence that saccades and perception use similar visual processing mechanisms for contrast detection and discrimination.
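The core signal-detection ingredient here can be sketched with a Monte-Carlo simulation of the max decision rule over m candidate locations: the observer picks the location with the largest noisy response, and accuracy falls out of the target's signal-to-noise ratio (d'). This sketch deliberately omits the intrinsic-uncertainty term of the Eckstein et al. model:

```python
import numpy as np

def percent_correct(dprime, m, n_trials=200000, seed=0):
    """Monte-Carlo proportion correct for an m-alternative detection
    task under the signal-detection max rule: unit-variance Gaussian
    responses at m locations, the target location shifted by dprime;
    a trial is correct when the target response is the largest."""
    rng = np.random.default_rng(seed)
    responses = rng.standard_normal((n_trials, m))
    responses[:, 0] += dprime          # location 0 holds the target
    return np.mean(responses.argmax(axis=1) == 0)
```

Plotting `percent_correct` against d' for the tasks' m values yields the kind of accuracy-vs-SNR curve against which both saccadic and perceptual data can be compared.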


Visual Neuroscience | 2000

Motion coherence affects human perception and pursuit similarly

Brent R. Beutter; Leland S. Stone

Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelograms' sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. Our results support the view that pursuit and perception share a common motion-integration stage, perhaps within areas MT or MST.


Perception | 2000

Visual Motion Integration for Perception and Pursuit

Leland S. Stone; Brent R. Beutter; Jean Lorenceau

To examine the relationship between visual motion processing for perception and pursuit, we measured the pursuit eye-movement and perceptual responses to the same complex-motion stimuli. We show that humans can both perceive and pursue the motion of line-figure objects, even when partial occlusion makes the resulting image motion vastly different from the underlying object motion. Our results show that both perception and pursuit can perform largely accurate motion integration, i.e. the selective combination of local motion signals across the visual field to derive global object motion. Furthermore, because we manipulated perceived motion while keeping image motion identical, the observed parallel changes in perception and pursuit show that the motion signals driving steady-state pursuit and perception are linked. These findings disprove current pursuit models whose control strategy is to minimize retinal image motion, and suggest a new framework for the interplay between visual cortex and cerebellum in visuomotor control.

Collaboration


Dive into Leland S. Stone's collaborations.

Top Co-Authors

Li Li

University of Hong Kong
