Publication


Featured research published by Linda E. Sibert.


Human Factors in Computing Systems | 2000

Evaluation of eye gaze interaction

Linda E. Sibert; Robert J. K. Jacob

Eye gaze interaction can provide a convenient and natural addition to user-computer dialogues. We have previously reported on our interaction techniques using eye gaze [10]. While our techniques seemed useful in demonstration, we now investigate their strengths and weaknesses in a controlled setting. In this paper, we present two experiments that compare an interaction technique we developed for object selection based on where a person is looking with the most commonly used selection method using a mouse. We find that our eye gaze interaction technique is faster than selection with a mouse. The results show that our algorithm, which makes use of knowledge about how the eyes behave, preserves the natural quickness of the eye. Eye gaze interaction is a reasonable addition to computer interaction and is convenient in situations where it is important to use the hands for other tasks. It is particularly beneficial for the larger screen workspaces and virtual environments of the future, and it will become increasingly practical as eye tracker technology matures.
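The abstract does not spell out the selection algorithm. As a rough, hypothetical illustration of the general idea behind gaze-based selection (a dispersion-threshold fixation detector with a minimum dwell time, which is in the spirit of the approach but not the authors' exact method; all parameter names and thresholds here are invented for the sketch):

```python
def detect_fixation(samples, max_dispersion=30.0, min_duration=0.1):
    """Scan gaze samples as (t, x, y) tuples and return the first fixation
    (t_start, t_end, cx, cy) whose spatial dispersion stays within
    max_dispersion pixels for at least min_duration seconds, else None.

    Raw eye-tracker output is noisy, so selection is driven by detected
    fixations rather than individual gaze samples.
    """
    n = len(samples)
    for i in range(n):
        j = i
        # Grow the window while the samples stay within the dispersion limit.
        while j + 1 < n:
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        t_start, t_end = samples[i][0], samples[j][0]
        if j > i and t_end - t_start >= min_duration:
            window = samples[i:j + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            return (t_start, t_end, cx, cy)  # centroid drives hit-testing
    return None
```

A selection loop would hit-test the fixation centroid `(cx, cy)` against on-screen objects; rapid saccades between distant points produce no fixation and therefore no selection.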


ACM Transactions on Computer-Human Interaction | 1994

Integrality and separability of input devices

Robert J. K. Jacob; Linda E. Sibert; Daniel C. McFarlane; M. Preston Mullen Jr.

Current input device taxonomies and other frameworks typically emphasize the mechanical structure of input devices. We suggest that selecting an appropriate input device for an interactive task requires looking beyond the physical structure of devices to the deeper perceptual structure of the task, the device, and the interrelationship between the perceptual structure of the task and the control properties of the device. We affirm that perception is key to understanding performance of multidimensional input devices on multidimensional tasks. We have therefore extended the theory of processing of perceptual structure to graphical interactive tasks and to the control structure of input devices. This allows us to predict task and device combinations that lead to better performance and hypothesize that performance is improved when the perceptual structure of the task matches the control structure of the device. We conducted an experiment in which subjects performed two tasks with different perceptual structures, using two input devices with correspondingly different control structures, a three-dimensional tracker and a mouse. We analyzed both speed and accuracy, as well as the trajectories generated by subjects as they used the unconstrained three-dimensional tracker to perform each task. The results support our hypothesis and confirm the importance of matching the perceptual structure of the task and the control structure of the input device.


Presence: Teleoperators & Virtual Environments | 1999

Virtual Locomotion: Walking in Place through Virtual Environments

James N. Templeman; Patricia S. Denbrook; Linda E. Sibert

This paper presents both an analysis of requirements for user control over simulated locomotion and a new control technique designed to meet these requirements. The goal is to allow the user to move through virtual environments in as similar a manner as possible to walking through the real world. We approach this problem by examining the interrelationships between motion control and the other actions people use to act, sense, and react to their environment. If the interactions between control actions and sensory feedback can be made comparable to those of actions in the real world, then there is hope for constructing an effective new technique. Candidate solutions are reviewed once the analysis is developed. This analysis leads to a promising new design for a sensor-based virtual locomotion control called Gaiter. The new control allows users to direct their movement through virtual environments by stepping in place. The movement of a person's legs is sensed, and in-place walking is treated as a gesture indicating the user intends to take a virtual step. More specifically, the movement of the user's legs determines the direction, extent, and timing of their movement through virtual environments. Tying virtual locomotion to leg motion allows a person to step in any direction and control the stride length and cadence of his virtual steps. The user can walk straight, turn in place, and turn while advancing. Motion is expressed in a body-centric coordinate system similar to that of actual stepping. The system can discriminate between gestural and actual steps, so both types of steps can be intermixed.


Human Factors in Computing Systems | 1992

The perceptual structure of multidimensional input device selection

Robert J. K. Jacob; Linda E. Sibert

Concepts such as the logical device, taxonomies, and other descriptive frameworks have improved understanding of input devices but ignored or else treated informally their pragmatic qualities, which are fundamental to selection of input devices for tasks. We seek the greater leverage of a predictive theoretical framework by basing our investigation of three-dimensional vs. two-dimensional input devices on Garner's theory of processing of perceptual structure in multidimensional tasks. Two three-dimensional tasks may seem equivalent, but if they involve different types of perceptual spaces, they should be assigned correspondingly different input devices. Our experiment supports this hypothesis and thus both indicates when to use three-dimensional input devices and gives credence to our theoretical basis for this indication.


IEEE Virtual Reality Conference | 1997

Virtual environments for shipboard firefighting training

David L. Tate; Linda E. Sibert; Tony King

A virtual environment (VE) of portions of the ex-USS Shadwell, the Navy's full-scale fire research and test ship, has been developed to study the feasibility of using immersive VE as a tool for shipboard firefighting training and mission rehearsal. The VE system uses a head-mounted display and 3D joystick to allow users to navigate through and interact with the environment. Fire and smoke effects are added to simulate actual firefighting conditions. This paper describes the feasibility tests that were performed aboard the Shadwell and presents promising results of the benefits of VE training over conventional training methods.


IEEE Computer Graphics and Applications | 1997

Using virtual environments to train firefighters

David L. Tate; Linda E. Sibert; Tony King

Using virtual environments for training and mission rehearsal gives US Navy firefighters an edge in fighting real fires. A test run on the ex-USS Shadwell measured the improvement. The results suggest that virtual environments serve effectively for training and mission rehearsal for shipboard firefighting. VE training provides a flexible environment where a firefighter can not only learn an unfamiliar part of the ship, but also practice tactics and procedures for fighting a fire by interacting with simulated smoke and fire without risking lives or property. These tests proved a successful first step in developing a new training technology for shipboard firefighting based on immersive virtual environments. The tests also indicated potential areas for improvement, requiring additional research. User interaction techniques for manipulating objects in VEs need further study, along with usability studies to determine their effectiveness and utility. Other areas that could enhance VE training systems include more natural and intuitive I/O devices such as 3D sound, speech and natural language input, integrated multimedia and hypermedia instruction, and multi-user interaction.


IEEE Virtual Reality Conference | 2007

Pointman - A New Control for Simulating Tactical Infantry Movements

James N. Templeman; Linda E. Sibert; Robert C. Page; Patricia S. Denbrook

Pointman™ is a new virtual locomotion control that uses a conventional dual joystick gamepad combined with a tracked head-mounted display and sliding foot pedals. Unlike conventional gamepad control mappings for first person shooter games, Pointman allows users to independently specify their avatar's direction of movement and the heading of the upper body. The motivation is to develop inexpensive virtual infantry training simulators that allow users to execute realistic tactical infantry movements.


Symposium on 3D User Interfaces | 2007

Pointman - A Device-Based Control for Realistic Tactical Movement

James N. Templeman; Linda E. Sibert; Robert C. Page; Patricia S. Denbrook

Pointman™ is a new virtual locomotion control that uses a conventional dual joystick gamepad in combination with a tracked head-mounted display and sliding foot pedals. Unlike the control mappings of a conventional gamepad, Pointman allows users to specify their direction of movement independently from the heading of the upper body. The motivation for this work is to develop a virtual infantry training simulator that is inexpensive, portable, and allows the user to execute realistic tactical infantry movements. Tactical movements rely heavily on the ability to scan while moving along a path, which requires the ability to independently coordinate course and heading. Conventional gamepad control mappings confound course and heading, and facilitate moving sideways and spiraling toward or away from targets. Pointman was derived from an analysis of how people move and coordinate actions in the real and virtual worlds.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2008

Comparison of Locomotion Interfaces for Immersive Training Systems

Linda E. Sibert; James N. Templeman; Roy Stripling; Joseph Coyne; Robert C. Page; Zina La Budde; Daniel Afergan

The ability to stay oriented in an environment is an important skill for urban combat. Warriors must systematically clear areas of responsibility while moving tactically, scanning for potential targets, and engaging threats. A locomotion interface for an immersive virtual environment urban combat training system should enable the development of navigational skills. One aspect of navigation is path integration, in which a person estimates current position and orientation along a pathway from velocity and acceleration. Path integration allows people to stay oriented in low visibility or the dark, which is important in many tactical situations. This study compared the performance of three locomotion interfaces on three path integration tasks. Analysis revealed that type of locomotion interface has a significant impact on performance.
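Path integration, as described above, is essentially dead reckoning: accumulating position and orientation from self-motion cues over time. The study's tasks and measurement model are not given here; the following is only a minimal first-order sketch of the underlying idea, with all names and units chosen for illustration:

```python
import math

def integrate_path(steps, x=0.0, y=0.0, heading=0.0):
    """Dead-reckon a 2D pose from (speed, turn_rate, dt) samples.

    speed is forward speed in m/s, turn_rate is in rad/s, dt in seconds.
    Each step first updates heading, then advances along the new heading,
    mirroring how a walker turns and then translates.
    """
    for speed, turn_rate, dt in steps:
        heading += turn_rate * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading
```

For example, two one-second steps straight ahead at 1 m/s end 2 m from the origin; a quarter turn followed by one step ends 1 m to the side. A person relying on path integration in the dark is, in effect, running this accumulation mentally, which is why errors grow with path length and why the choice of locomotion interface can matter.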


IEEE Computer Graphics and Applications | 1996

Shipboard VR: from damage control to design

Lawrence J. Rosenblum; Jim Durbin; Upul Obeysekare; Linda E. Sibert; David L. Tate; James N. Templeman; Jyoti Agrawal; Daniel Fasulo; Thomas Meyer; Greg Newton; Amit Shalev; Tony King

Collaboration


Dive into Linda E. Sibert's collaborations.

Top Co-Authors

James N. Templeman (United States Naval Research Laboratory)
Robert C. Page (United States Naval Research Laboratory)
David L. Tate (United States Naval Research Laboratory)
Tony King (United States Naval Research Laboratory)
Amit Shalev (United States Naval Research Laboratory)
Daniel Fasulo (United States Naval Research Laboratory)
Greg Newton (United States Naval Research Laboratory)
Joseph Coyne (United States Naval Research Laboratory)