Publication


Featured research published by Andrew C. Beall.


Psychological Inquiry | 2002

TARGET ARTICLE: Immersive Virtual Environment Technology as a Methodological Tool for Social Psychology

Jim Blascovich; Jack M. Loomis; Andrew C. Beall; Kimberly R. Swinth; Crystal L. Hoyt; Jeremy N. Bailenson

Historically, at least 3 methodological problems have dogged experimental social psychology: the experimental control-mundane realism trade-off, lack of replication, and unrepresentative sampling. We argue that immersive virtual environment technology (IVET) can help ameliorate, if not solve, these methodological problems and, thus, holds promise as a new social psychological research tool. In this article, we first present an overview of IVET and review IVET-based research within psychology and other fields. Next, we propose a general model of social influence within immersive virtual environments and present some preliminary findings regarding its utility for social psychology. Finally, we present a new paradigm for experimental social psychology that may enable researchers to unravel the very fabric of social interaction.


Behavior Research Methods, Instruments, & Computers | 1999

Immersive virtual environment technology as a basic research tool in psychology

Jack M. Loomis; Jim Blascovich; Andrew C. Beall

Immersive virtual environment (IVE) technology has great promise as a tool for basic experimental research in psychology. IVE technology gives participants the experience of being surrounded by the computer-synthesized environment. We begin with a discussion of the various devices needed to implement immersive virtual environments, including object manipulation and social interaction. We review the benefits and drawbacks associated with virtual environment technology, in comparison with more conventional ways of doing basic experimental research. We then consider a variety of examples of research using IVE technology in the areas of perception, spatial cognition, and social interaction.


Personality and Social Psychology Bulletin | 2003

Interpersonal Distance in Immersive Virtual Environments

Jeremy N. Bailenson; Jim Blascovich; Andrew C. Beall; Jack M. Loomis

Digital immersive virtual environment technology (IVET) enables behavioral scientists to conduct ecologically realistic experiments with near-perfect experimental control. The authors employed IVET to study the interpersonal distance maintained between participants and virtual humans. In Study 1, participants traversed a three-dimensional virtual room in which a virtual human stood. In Study 2, a virtual human approached participants. In both studies, participant gender, virtual human gender, virtual human gaze behavior, and whether virtual humans were allegedly controlled by humans (i.e., avatars) or computers (i.e., agents) were varied. Results indicated that participants maintained greater distance from virtual humans when approaching their fronts compared to their backs. In addition, participants gave more personal space to virtual agents who engaged them in mutual gaze. Moreover, when virtual humans invaded their personal space, participants moved farthest from virtual human agents. The advantages and disadvantages of IVET for the study of human behavior are discussed.
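As a rough illustration of the distance measure used in studies like this, the sketch below computes the minimum horizontal distance a participant keeps from a stationary virtual human, given tracked head positions. The coordinate frame and the sample tracker readings are hypothetical, not data from the study.

```python
import math

def minimum_interpersonal_distance(participant_track, agent_position):
    """Smallest horizontal distance (m) between a participant's tracked head
    positions and a stationary virtual human, ignoring height."""
    ax, az = agent_position
    return min(math.hypot(x - ax, z - az) for x, z in participant_track)

# Hypothetical tracker samples (x, z) in metres; the agent stands at the origin.
track = [(2.0, 0.0), (1.4, 0.3), (0.9, 0.5), (1.1, 0.8)]
print(round(minimum_interpersonal_distance(track, (0.0, 0.0)), 2))  # ~1.03
```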


Psychological Science | 1998

Spatial Updating of Self-Position and Orientation During Real, Imagined, and Virtual Locomotion

Roberta L. Klatzky; Jack M. Loomis; Andrew C. Beall; Sarah S. Chance; Reginald G. Golledge

Two studies investigated updating of self-position and heading during real, imagined, and simulated locomotion. Subjects were exposed to a two-segment path with a turn between segments; they responded by turning to face the origin as they would if they had walked the path and were at the end of the second segment. The conditions of pathway exposure included physical walking, imagined walking from a verbal description, watching another person walk, and experiencing optic flow that simulated walking, with or without a physical turn between the path segments. If subjects failed to update an internal representation of heading, but did encode the pathway trajectory, they should have overturned by the magnitude of the turn between the path segments. Such systematic overturning was found in the description and watching conditions, but not with physical walking. Simulated optic flow was not by itself sufficient to induce spatial updating that supported correct turn responses.
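The turn-to-face-origin response has a simple path-integration geometry. The sketch below computes the correct response for a two-segment path; the segment lengths and turn angle are hypothetical examples, not the study's actual pathway.

```python
import math

def turn_to_face_origin(seg1, turn_deg, seg2):
    """Signed turn (deg) needed to face the path origin after walking seg1 m
    along the initial heading (0 deg), turning by turn_deg, and walking seg2 m."""
    heading = math.radians(turn_deg)            # heading after the path's turn
    x = seg1 + seg2 * math.cos(heading)         # endpoint, with the start at (0, 0)
    y = seg2 * math.sin(heading)
    bearing = math.degrees(math.atan2(-y, -x))  # direction from endpoint back to origin
    turn = bearing - turn_deg                   # rotation from current heading to that bearing
    return (turn + 180.0) % 360.0 - 180.0       # wrap into [-180, 180)

# Hypothetical 3 m / 90-degree / 2 m path.
print(round(turn_to_face_origin(3.0, 90.0, 2.0), 1))  # ~123.7 degrees
# Per the abstract's prediction, a responder who encodes the trajectory but
# never updates heading should err by roughly the 90-degree turn magnitude.
```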


Presence: Teleoperators & Virtual Environments | 1998

Locomotion Mode Affects the Updating of Objects Encountered During Travel: The Contribution of Vestibular and Proprioceptive Inputs to Path Integration

Sarah S. Chance; Florence Gaunet; Andrew C. Beall; Jack M. Loomis

In two experiments, subjects traveled through virtual mazes, encountering target objects along the way. Their task was to indicate the direction to these target objects from a terminal location in the maze (from which the objects could no longer be seen). Subjects controlled their motion through the mazes using three locomotion modes. In the Walk mode, subjects walked normally in the experimental room. For each subject, body position and heading were tracked, and the tracking information was used to continuously update the visual imagery presented to the subjects on a head-mounted display. This process created the impression of immersion in the experimental maze. In the Visual Turn mode, subjects moved through the environment using a joystick to control their turning. The only sensory information subjects received about rotation and translation was that provided by the computer-generated imagery. The Real Turn mode was midway between the other two modes, in that subjects physically turned in place to steer while translating in the virtual maze; thus, translation through the maze was signaled only by the computer-generated imagery, whereas rotations were signaled by the imagery as well as by proprioceptive and vestibular information. The dependent measure in the experiment was the absolute error of the subjects' directional estimate to each target from the terminal location. Performance in the Walk mode was significantly better than in the Visual Turn mode, but other trends were not significant. A secondary finding was that the degree of motion sickness depended upon locomotion mode, with the lowest incidence occurring in the Walk mode. Both findings suggest the advisability of having subjects explore virtual environments using real rotations and translations in tasks involving spatial orientation.
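A minimal sketch of how the three locomotion modes differ in what drives the rendered viewpoint, assuming a simple (x, z, yaw) pose; the tracker and joystick interfaces and the speed constants are hypothetical simplifications, not the system described in the paper.

```python
import math

def advance_viewpoint(mode, pose, tracker, joystick, dt,
                      move_speed=1.5, turn_speed=60.0):
    """One frame of viewpoint update for the three locomotion modes.
    pose = (x, z, yaw_deg); tracker = (x, z, yaw_deg) from head tracking;
    joystick = (forward, turn), each in [-1, 1]."""
    x, z, yaw = pose
    if mode == "walk":
        # Position and heading both come straight from the head tracker.
        return tracker
    if mode == "real_turn":
        # Physical turning in place sets heading; translation is joystick-driven.
        yaw = tracker[2]
    else:  # "visual_turn": the joystick drives both rotation and translation
        yaw += joystick[1] * turn_speed * dt
    step = joystick[0] * move_speed * dt
    x += step * math.sin(math.radians(yaw))
    z += step * math.cos(math.radians(yaw))
    return (x, z, yaw)

# Hypothetical 60 Hz frame: stick fully forward in Visual Turn mode.
print(advance_viewpoint("visual_turn", (0.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                        (1.0, 0.0), dt=1 / 60))
```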


Presence: Teleoperators & Virtual Environments | 2004

Does the quality of the computer graphics matter when judging distances in visually immersive environments?

William B. Thompson; Peter Willemsen; Amy Ashurst Gooch; Sarah H. Creem-Regehr; Jack M. Loomis; Andrew C. Beall

In the real world, people are quite accurate in judging distances to locations in the environment, at least for targets resting on the ground plane and distances out to about 20 m. Distance judgments in visually immersive environments are much less accurate. Several studies have now shown that in visually immersive environments, the world appears significantly smaller than intended. This study investigates whether or not the compression in apparent distances is the result of the low-quality computer graphics utilized in previous investigations. Visually directed triangulated walking was used to assess distance judgments in the real world and in three virtual environments with graphical renderings of varying quality.


Presence: Teleoperators & Virtual Environments | 2001

Equilibrium Theory Revisited: Mutual Gaze and Personal Space in Virtual Environments

Jeremy N. Bailenson; Jim Blascovich; Andrew C. Beall; Jack M. Loomis

During the last half of the twentieth century, psychologists and anthropologists have studied proxemics, or spacing behavior, among people in many contexts. As we enter the twenty-first century, immersive virtual environment technology promises new experimental venues in which researchers can study proxemics. Immersive virtual environments provide realistic and compelling experimental settings without sacrificing experimental control. The experiment reported here tested Argyle and Dean's (1965) equilibrium theory's specification of an inverse relationship between mutual gaze, a nonverbal cue signaling intimacy, and interpersonal distance. Participants were immersed in a three-dimensional virtual room in which a virtual human representation (that is, an embodied agent) stood. Under the guise of a memory task, participants walked towards and around the agent. Distance between the participant and agent was tracked automatically via our immersive virtual environment system. All participants maintained more space around agents than they did around similarly sized and shaped but nonhuman-like objects. Female participants maintained more interpersonal distance between themselves and agents who engaged them in eye contact (that is, mutual gaze behavior) than between themselves and agents who did not engage them in eye contact, whereas male participants did not. Implications are discussed for the study of proxemics via immersive virtual environment technology, as well as the design of virtual environments and virtual humans.


The Journal of the Learning Sciences | 2008

The Use of Immersive Virtual Reality in the Learning Sciences: Digital Transformations of Teachers, Students, and Social Context

Jeremy N. Bailenson; Nick Yee; Jim Blascovich; Andrew C. Beall; Nicole Lundblad; Michael Jin

This article illustrates the utility of using virtual environments to transform social interaction via behavior and context, with the goal of improving learning in digital environments. We first describe the technology and theories behind virtual environments and then report data from 4 empirical studies. In Experiment 1, we demonstrated that teachers with augmented social perception (i.e., receiving visual warnings alerting them to students not receiving enough teacher eye gaze) were able to spread their attention more equally among students than teachers without augmented perception. In Experiments 2 and 3, we demonstrated that by breaking the rules of spatial proximity that exist in physical space, students can learn more by being in the center of the teacher's field of view (compared to the periphery) and by being closer to the teacher (compared to farther away). In Experiment 4, we demonstrated that inserting virtual co-learners who were either model students or distracting students changed the learning abilities of experiment participants who conformed to the virtual co-learners. Results suggest that virtual environments will have a unique ability to alter the social dynamics of learning environments via transformed social interaction.
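A minimal sketch of the kind of bookkeeping behind the augmented social perception described for Experiment 1: accumulate teacher gaze time per student and flag anyone falling well below an equal share. The logging format, threshold, and student names are hypothetical, not the authors' implementation.

```python
def gaze_warnings(gaze_log, students, min_share=0.5):
    """Flag students whose share of teacher gaze falls below min_share of an
    equal split.  gaze_log is a list of (student_id, seconds) gaze intervals."""
    totals = {s: 0.0 for s in students}
    for student, seconds in gaze_log:
        totals[student] += seconds
    total = sum(totals.values()) or 1.0
    fair_share = 1.0 / len(students)
    return [s for s, t in totals.items() if t / total < min_share * fair_share]

# Hypothetical session: three students, one of whom received little gaze.
log = [("ann", 40.0), ("bob", 35.0), ("cam", 5.0)]
print(gaze_warnings(log, ["ann", "bob", "cam"]))  # ['cam']
```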


Presence: Teleoperators & Virtual Environments | 2004

Transformed Social Interaction: Decoupling Representation from Behavior and Form in Collaborative Virtual Environments

Jeremy N. Bailenson; Andrew C. Beall; Jack M. Loomis; Jim Blascovich; Matthew Turk

Computer-mediated communication systems known as collaborative virtual environments (CVEs) allow geographically separated individuals to interact verbally and nonverbally in a shared virtual space in real time. We discuss a CVE-based research paradigm that transforms (i.e., filters and modifies) nonverbal behaviors during social interaction. Because the technology underlying CVEs allows a strategic decoupling of rendered behavior from the actual behavior of the interactants, conceptual and perceptual constraints inherent in face-to-face interaction need not apply. Decoupling algorithms can enhance or degrade facets of nonverbal behavior within CVEs, such that interactants can reap the benefits of nonverbal enhancement or suffer nonverbal degradation. Concepts underlying transformed social interaction (TSI), the ethics and implications of such a research paradigm, and data from a pilot study examining TSI are discussed.
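One concrete instance of decoupling rendered behavior from actual behavior is gaze augmentation, in which every participant can be shown the speaker's avatar gazing at them at once. The sketch below is a hypothetical illustration of that idea under a toy per-viewer rendering model; it is not the authors' system or algorithm.

```python
def rendered_gaze_targets(actual_target, participants, augment=True):
    """Decide, per viewer, whom the speaker's avatar appears to look at.
    With augmentation on, each viewer is rendered as the gaze target;
    with it off, everyone sees the speaker's actual gaze target."""
    if not augment:
        return {viewer: actual_target for viewer in participants}
    return {viewer: viewer for viewer in participants}

# Hypothetical three-person CVE: the speaker is actually looking at "bob".
print(rendered_gaze_targets("bob", ["ann", "bob", "cam"]))
# {'ann': 'ann', 'bob': 'bob', 'cam': 'cam'}
```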


Attention Perception & Psychophysics | 1997

Visually perceived location is an invariant in the control of action

John W. Philbeck; Jack M. Loomis; Andrew C. Beall

We provide experimental evidence that perceived location is an invariant in the control of action, by showing that different actions are directed toward a single visually specified location in space (corresponding to the putative perceived location) and that this single location, although specified by a fixed physical target, varies with the availability of information about the distance of that target. Observers in two conditions varying in the availability of egocentric distance cues viewed targets at 1.5, 3.1, or 6.0 m and then attempted to walk to the target with eyes closed using one of three paths; the path was not specified until after vision was occluded. The observers stopped at about the same location regardless of the path taken, providing evidence that action was being controlled by some invariant, ostensibly visually perceived location. That it was indeed perceived location was indicated by the manipulation of information about target distance—the trajectories in the full-cues condition converged near the physical target locations, whereas those in the reduced-cues condition converged at locations consistent with the usual perceptual errors found when distance cues are impoverished.
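The convergence of the walking trajectories can be summarized as the point nearest, in a least-squares sense, to the lines along which observers walked after vision was occluded. The sketch below illustrates that computation with hypothetical start points and directions; it is not the analysis reported in the paper.

```python
import numpy as np

def converge_point(starts, directions):
    """Least-squares point nearest to a set of 2-D walking paths, each treated
    as a line through a start point along a given direction."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for a, d in zip(np.asarray(starts, float), np.asarray(directions, float)):
        d = d / np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)      # projector onto the line's normal space
        A += P
        b += P @ a
    return np.linalg.solve(A, b)

# Two hypothetical post-occlusion paths aimed at roughly the same spot.
starts = [(0.0, 0.0), (2.0, 0.0)]
dirs = [(1.0, 3.0), (-1.0, 3.0)]
print(converge_point(starts, dirs))         # ~[1.0, 3.0]
```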

Collaboration


Dive into Andrew C. Beall's collaborations.

Top Co-Authors

Jack M. Loomis, University of California
Jim Blascovich, University of California
Charles M. Oman, Massachusetts Institute of Technology
Matthew Turk, University of California
John W. Philbeck, George Washington University
Alan Natapoff, Massachusetts Institute of Technology