
Publication

Featured research published by Randolph D. Easton.


Journal of Experimental Psychology: Learning, Memory, and Cognition | 1995

Object-array structure, frames of reference, and retrieval of spatial knowledge.

Randolph D. Easton; M. J. Sholl

Experiments are reported that assessed the ability of people, without vision, to locate the positions of objects from imagined points of observation that are related to their actual position by rotational or translational components. Theoretical issues addressed were whether spatial relations stored in an object-to-object system are directly retrieved or whether retrieval is mediated by a body-centered coordinate system, and whether body-centered access involves a process of imagined updating of self-position. The results, together with those of Rieser (1989), indicate that in the case of regularly structured object arrays, interobject relations are directly retrieved for the translation task, but for the rotation task, retrieval occurs by means of a body-centered coordinate system, requiring imagined body rotation. For irregularly structured arrays, access of interobject spatial structure occurs by means of a body-centered coordinate system for both translation and rotation tasks, requiring imagined body translation or rotation. Array regularity affected retrieval of spatial structure in terms of both the global shape of interobject relations and local object position within that global shape.


Journal of Experimental Psychology: Learning, Memory, and Cognition | 1997

Do vision and haptics share common representations? Implicit and explicit memory within and between modalities.

Randolph D. Easton; Kavitha Srinivas; Anthony J. Greene

Previous assessments of verbal cross-modal priming have typically been conducted with the visual and auditory modalities. Within-modal priming is always found to be substantially larger than cross-modal priming, a finding that could reflect modality modularity, or alternatively, differences between the coding of visual and auditory verbal information (i.e., geometric vs. phonological). The present experiments assessed implicit and explicit memory within and between vision and haptics, where verbal information could be coded in geometric terms. Because haptic perception of words is sequential or letter-by-letter, experiments were also conducted to isolate the effects of simultaneous versus sequential processing from the manipulation of modality. Together, the results reveal no effects of modality change on implicit or explicit tests. The authors discuss representational similarities between vision and haptics as well as image mediation as possible explanations for the results.


Psychonomic Bulletin & Review | 1997

Transfer between vision and haptics: Memory for 2-D patterns and 3-D objects

Randolph D. Easton; Anthony J. Greene; Kavitha Srinivas

Explicit memory tests such as recognition typically access semantic, modality-independent representations, while perceptual implicit memory tests typically access presemantic, modality-specific representations. By demonstrating comparable cross- and within-modal priming using vision and haptics with verbal materials (Easton, Srinivas, & Greene, 1997), we recently questioned whether the representations underlying perceptual implicit tests were modality specific. Unlike vision and audition, with vision and haptics verbal information can be presented in geometric terms to both modalities. The present experiments extend this line of research by assessing implicit and explicit memory within and between vision and haptics in the nonverbal domain, using both 2-D patterns and 3-D objects. Implicit test results revealed robust cross-modal priming for both 2-D patterns and 3-D objects, indicating that vision and haptics shared abstract representations of object shape and structure. Explicit test results for 3-D objects revealed modality specificity, indicating that the recognition system keeps track of the modality through which an object is experienced.


Attention, Perception, & Psychophysics | 1996

Haptic cues for orientation and postural control in sighted and blind individuals

John J. Jeka; Randolph D. Easton; Billie Louise Bentzen; James R. Lackner

Haptic cues from fingertip contact with a stable surface attenuate body sway in subjects even when the contact forces are too small to provide physical support of the body. We investigated how haptic cues derived from contact of a cane with a stationary surface at low force levels aid postural control in sighted and congenitally blind individuals. Five sighted (eyes closed) and five congenitally blind subjects maintained a tandem Romberg stance in five conditions: (1) no cane; (2, 3) touch contact (<2 N of applied force) while holding the cane in a vertical or slanted orientation; and (4, 5) force contact (as much force as desired) in the vertical and slanted orientations. Touch contact of a cane at force levels below those necessary to provide significant physical stabilization was as effective as force contact in reducing postural sway in all subjects, compared with the no-cane condition. A slanted cane was far more effective in reducing postural sway than was a vertical cane. Cane use also decreased head displacement of sighted subjects far more than that of blind subjects. These results suggest that head movement control is linked to postural control through gaze stabilization reflexes in sighted subjects; such reflexes are absent in congenitally blind individuals and may account for their higher levels of head displacement.


Experimental Brain Research | 1998

Auditory cues for orientation and postural control in sighted and congenitally blind people

Randolph D. Easton; Anthony J. Greene; Paul DiZio; James R. Lackner

This study assessed whether stationary auditory information could affect body and head sway (as does visual and haptic information) in sighted and congenitally blind people. Two speakers, one placed adjacent to each ear, significantly stabilized center-of-foot-pressure sway in a tandem Romberg stance, while neither a single speaker in front of subjects nor a head-mounted sonar device reduced center-of-pressure sway. Center-of-pressure sway was reduced to the same level in the two-speaker condition for sighted and blind subjects. Both groups also evidenced reduced head sway in the two-speaker condition, although blind subjects’ head sway was significantly larger than that of sighted subjects. The advantage of the two-speaker condition was probably attributable to the nature of distance as compared with directional auditory information. The results rule out a deficit model of spatial hearing in blind people and are consistent with one version of a compensation model. Analysis of maximum cross-correlations between center-of-pressure and head sway, and of the associated time lags, suggests that blind and sighted people may use different sensorimotor strategies to achieve stability.


Attention, Perception, & Psychophysics | 1982

Perceptual dominance during lipreading

Randolph D. Easton; Marylu Basala

Two experiments were performed under visual-only and visual-auditory discrepancy conditions (dubs) to assess observers’ abilities to read speech information on a face. In the first experiment, identification and multiple-choice testing were used. In addition, the relation between visual and auditory phonetic information was manipulated and related to perceptual bias. In the second experiment, the “compellingness” of the visual-auditory discrepancy as a single speech event was manipulated. Subjects also rated the confidence they had that their perception of the lipped word was accurate. Results indicated that competing visual information exerted little effect on auditory speech recognition, but visual speech recognition was substantially interfered with when discrepant auditory information was present. The extent of auditory bias was found to be related to the abilities of observers to read speech under nondiscrepancy conditions, the magnitude of the visual-auditory discrepancy, and the compellingness of the visual-auditory discrepancy as a single event. Auditory bias during speech was found to be a moderately compelling conscious experience, and not simply a case of confused responding or guessing. Results were discussed in terms of current models of perceptual dominance and related to results from modality discordance during space perception.


International Journal of Clinical and Experimental Hypnosis | 1983

Spontaneous recovery of memory during posthypnotic amnesia.

John F. Kihlstrom; Randolph D. Easton; Ronald E. Shor

Repeated testing of posthypnotic amnesia indicates that some Ss, initially responsive to the suggestion, show appreciable recovery of memory before the pre-arranged signal is given to cancel the amnesia. Comparison of Ss who received 2 successive memory tests during amnesia with others who received only a single test preceded by a distracting activity indicated that the recovery effect was attributable to the passage of time rather than to prior testing. There were wide individual differences in the extent of recovery, with some Ss maintaining a fairly dense amnesia on the second test. Those Ss who maintained amnesia were more hypnotizable, and showed a denser initial amnesia, than those who breached it. An analysis of subjective reports lent credence to the notion of partial response among some hypnotizable Ss who fail to meet a standard criterion of complete amnesia, and pseudoamnesia among some insusceptible Ss who appear to pass it. Some Ss reported voluntarily engaging in cognitive activity ...


Consciousness and Cognition | 2001

Visual–Auditory Events: Cross-Modal Perceptual Priming and Recognition Memory

Anthony J. Greene; Randolph D. Easton; Lisa S.R. LaShell

Modality specificity in priming is taken as evidence for independent perceptual systems. However, Easton, Greene, and Srinivas (1997) showed that visual and haptic cross-modal priming is comparable in magnitude to within-modal priming. Where appropriate, perceptual systems might share like information. To test this, we assessed priming and recognition for visual and auditory events, within and across modalities. On the visual test, auditory study resulted in no priming. On the auditory priming test, visual study resulted in priming that was only marginally less than within-modal priming. The priming results show that visual study facilitates identification on both visual and auditory tests, but auditory study only facilitates performance on the auditory test. For both recognition tests, within-modal recognition exceeded cross-modal recognition. The results have two novel implications for the understanding of perceptual priming: First, we introduce visual and auditory priming for spatio-temporal events as a new priming paradigm chosen for its ecological validity and potential for information exchange. Second, we propose that the asymmetry of the cross-modal priming observed here may reflect the capacity of these perceptual modalities to provide cross-modal constraints on ambiguity. We argue that visual perception might inform and constrain auditory processing, while auditory perception corresponds to too many potential visual events to usefully inform and constrain visual perception.


Journal of General Psychology | 1978

A quantitative confirmation of visual capture of curvature.

Randolph D. Easton; Peter W. Moran

An investigation was performed to quantify the experience of contour curvature formed under sensory discrepancy inspection conditions. N = 80 male and female undergraduates were used. Experimental Ss finger-tracked a horizontal straight edge while viewing limb movements through a curve-inducing lens. Control Ss inspected the edge unimodally, either through vision (distorted) or proprioception. Rather than relying on verbal reports, all Ss were required to match their impressions of contour shape with an adjustable metal curve. Findings indicated that (a) the mean impression of contour shape derived from discrepant visual and proprioceptive information was curved and did not differ from that derived from distorted visual information, (b) proprioceptive inspection alone resulted in accurate judgements of physical straightness, (c) response modality effects did not emerge, and (d) a manipulation designed to direct Ss’ attention to the felt contour shape during sensory discrepancy inspection did not affect mean curvature matches but created sizeable variability among matches. A sensory organization versus a selective processing interpretation of visual capture was discussed.


Journal of Experimental Psychology: Human Perception and Performance | 1975

Information Processing Analysis of the Chevreul Pendulum Illusion.

Randolph D. Easton; Ronald E. Shor

An information processing investigation was performed to quantify the Chevreul pendulum effect: the tendency of a small pendulum, when suspended from the hand and imaginatively concentrated on, to oscillate seemingly of its own accord. Using a time exposure photographic measurement technique, electronically automated visual and auditory imaginal prompts were presented to the subject during imaginal processing tasks. It was found that the pendulum effect was enhanced when vision of actual pendulum oscillations was permitted and visual or auditory spatially oscillating stimuli were present. Visual spatially oscillating stimuli were superior to their auditory counterparts. Results were discussed in terms of ideomotor and visual capture interpretations of signal and imaginal processing.

Collaboration


Dive into Randolph D. Easton's collaborations.

Top Co-Authors

Ronald E. Shor, University of New Hampshire

Donna D. Pistole, University of New Hampshire