Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Andrew S. Forsberg is active.

Publication


Featured research published by Andrew S. Forsberg.


Interactive 3D Graphics and Games | 1997

Image plane interaction techniques in 3D immersive environments

Jeffrey S. Pierce; Andrew S. Forsberg; Matthew Conway; Seung Hong; Robert C. Zeleznik; Mark R. Mine

This paper presents a set of interaction techniques for use in head-tracked immersive virtual environments. With these techniques, the user interacts with the 2D projections that 3D objects in the scene make on the image plane. The desktop analog is the use of a mouse to interact with objects in a 3D scene based on their projections on the monitor screen. Participants in an immersive environment can use the techniques we discuss for object selection, object manipulation, and user navigation in virtual environments. CR Categories and Subject Descriptors: I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction Techniques; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality. Additional Keywords: virtual worlds, virtual environments, navigation, selection, manipulation.
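The core idea of image-plane interaction can be sketched as a pick ray cast from the tracked eye position through the tracked hand (the "sticky finger" style of technique). The following Python sketch is illustrative only; the function name, the sphere-only scene representation, and the tuple-based API are assumptions, not the paper's implementation.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def image_plane_pick(eye, hand, spheres):
    """Cast a ray from the eye through the tracked hand position and
    return the nearest sphere (center, radius) it hits, or None.
    An illustrative sketch of image-plane selection, not the paper's code."""
    d = normalize(tuple(h - e for h, e in zip(hand, eye)))
    best, best_t = None, float("inf")
    for center, radius in spheres:
        oc = tuple(e - c for e, c in zip(eye, center))
        # Quadratic coefficients for the ray-sphere intersection test.
        b = 2.0 * sum(di * oi for di, oi in zip(d, oc))
        c = sum(x * x for x in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0:
            continue  # ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2.0
        if 0 < t < best_t:
            best, best_t = (center, radius), t
    return best
```

Because the ray passes through the hand's 2D projection on the image plane, the user can select distant objects simply by visually covering them with a finger.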


Interactive 3D Graphics and Games | 1997

Two pointer input for 3D interaction

Robert C. Zeleznik; Andrew S. Forsberg; Paul S. Strauss

We explore a range of techniques that use two hands to control two independent cursors to perform operations in 3D desktop applications. Based on research results in 2D applications, we believe that two-handed input provides the potential for creating more efficient and more fluid interfaces, especially for tasks that are context-sensitive or that have many degrees of freedom. These tasks appear frequently in 3D applications and are commonly broken down into a series of sequential operations, each controlling fewer degrees of freedom, even though this may dramatically change the character of the task. However, two-handed interaction, in theory, makes it possible to perform the same tasks using half the number of sequential steps, since two previously sequential operations can be performed simultaneously. In addition, many forms of two-handed interaction may be simpler to use and to understand since they correspond to common interactions in the physical world. It is significant when tasks that would otherwise need to be broken down into two sequential single-cursor steps can be performed as a single fluid operation using two cursors. CR Categories and Subject Descriptors: I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction Techniques; I.3.1 [Computer Graphics]: Hardware Architecture - Input Devices.


User Interface Software and Technology | 1996

Aperture based selection for immersive virtual environments

Andrew S. Forsberg; Kenneth P. Herndon; Robert C. Zeleznik

We present two novel techniques for effectively selecting objects in immersive virtual environments using a single 6 DOF magnetic tracker. These techniques advance the state of the art in that they exploit the participant’s visual frame of reference and fully utilize the position and orientation data from the tracker to improve accuracy of the selection task. Preliminary results from pilot usability studies validate our designs. Finally, the two techniques combine to compensate for each other’s weaknesses.
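One way to model selection with a single 6-DOF tracker is a cone test: an object is selectable when the angle between the pointing axis and the direction to the object falls within an aperture half-angle. The sketch below is an illustrative model of this idea; the function name, the point-cloud scene, and the fixed half-angle are assumptions, not the paper's exact formulation.

```python
import math

def cone_select(eye, direction, half_angle_deg, points):
    """Return the points inside a selection cone whose apex is at the eye
    and whose axis follows the tracker's pointing direction.
    An illustrative sketch of aperture-based selection."""
    def norm(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    axis = norm(direction)
    cos_limit = math.cos(math.radians(half_angle_deg))
    hits = []
    for p in points:
        to_p = norm(tuple(pc - ec for pc, ec in zip(p, eye)))
        # Inside the cone iff the angle to the axis is within the aperture.
        if sum(a * b for a, b in zip(axis, to_p)) >= cos_limit:
            hits.append(p)
    return hits
```

Widening or narrowing the half-angle trades selection precision against the effort needed to aim, which is why the tracker's full orientation data matters for accuracy.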


User Interface Software and Technology | 1998

The music notepad

Andrew S. Forsberg; Mark Dieterich; Robert C. Zeleznik

We present a system for entering common music notation based on 2D gestural input. The key feature of the system is the look-and-feel of the interface which approximates sketching music with paper and pencil. A probability-based interpreter integrates sequences of gestural input to perform the most common notation and editing operations. In this paper, we present the user’s model of the system, the components of the high-level recognition system, and a discussion of the evolution of the system including user feedback.
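A probability-based interpreter of the kind described above can be reduced to scoring each candidate notation operation by how likely it makes the observed stroke sequence, then taking the best one. This is a toy stand-in, not the Music Notepad's recognizer; the function name, the per-stroke probability tables, and the smoothing constant are all assumptions.

```python
from math import prod

def most_likely_interpretation(strokes, models):
    """Score each candidate operation by the product of the per-stroke
    probabilities its model assigns, and return the highest-scoring one.
    A toy stand-in for a probability-based gesture interpreter."""
    scores = {
        name: prod(model.get(s, 1e-6) for s in strokes)  # tiny floor avoids zeroing out
        for name, model in models.items()
    }
    return max(scores, key=scores.get)
```

Integrating over a stroke *sequence* rather than classifying strokes one at a time is what lets ambiguous individual gestures (a dot that could start a note or an articulation mark) resolve correctly in context.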


Interactive 3D Graphics and Games | 1999

UniCam—2D gestural camera controls for 3D environments

Robert C. Zeleznik; Andrew S. Forsberg

We present a novel approach to controlling a virtual 3D camera with a 2D mouse or stylus input device that is based on gestural interaction. Our approach to 3D camera manipulation, UniCam, requires only a single-button stylus or mouse to directly invoke specific camera operations within a single 3D view. No 2D widgets or keyboard modifiers are necessary. By gesturally invoking all camera functionality, UniCam reduces the modal nature of typical desktop 3D graphics applications, since remaining mouse or stylus buttons can be used for other application functionality. In addition, the particular choice and refinement of UniCam controls efficiently enables a wide range of camera manipulation tasks, including translation in three dimensions, orbiting about a point, animated focusing on an object surface, animated navigation around an object, zooming in upon a region, and saving and restoring viewpoints. UniCam's efficiency derives primarily from improved transitions between techniques and to a lesser degree from adaptations of existing techniques. Although we have conducted no formal user evaluations, UniCam is the result of years of iterative development with approximately one hundred users. During this time more conventional modifier-key and menu-based implementations were implemented and either refined or rejected. Informally, users appear to require a few hours of training to become proficient with UniCam controls, but then typically choose them almost exclusively even when widget-based controls, such as those provided by an Inventor viewer, are simultaneously available.
CR Categories: H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces—Graphics user interfaces (GUI); H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces—Input devices and strategies; H.5.1 [Information Interfaces and Presentation (e.g., HCI)]: Multimedia Information Systems—Artificial, augmented, and virtual realities; I.3.6 [Computer Graphics]: Methodology and Techniques—Interaction techniques;
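The heart of a single-button gestural camera control is a dispatcher that maps the first few samples of a drag to a camera mode. The sketch below shows one plausible dispatch rule; the region test, thresholds, and mode names are illustrative assumptions, not UniCam's actual rule set.

```python
def classify_drag(in_border, dx, dy):
    """Map the initial motion of a single-button drag to a camera mode.
    A drag starting in the view's border region orbits; over the scene,
    a mostly-horizontal stroke pans and a mostly-vertical stroke zooms.
    Illustrative thresholds and modes only, not UniCam's exact rules."""
    if in_border:
        return "orbit"
    return "pan" if abs(dx) >= abs(dy) else "zoom"
```

The design point is that the gesture itself selects the operation, so every remaining mouse or stylus button stays free for the application rather than for camera modes.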


IEEE Transactions on Visualization and Computer Graphics | 2008

A Comparative Study of Desktop, Fishtank, and Cave Systems for the Exploration of Volume Rendered Confocal Data Sets

Prabhat; Andrew S. Forsberg; Michael Katzourin; Kristi A. Wharton; Mel Slater

We present a participant study that compares biological data exploration tasks using volume renderings of laser confocal microscopy data across three environments that vary in level of immersion: a desktop, fishtank, and cave system. For the tasks, data, and visualization approach used in our study, we found that subjects qualitatively preferred and quantitatively performed better in the cave compared with the fishtank and desktop. Subjects performed real-world biological data analysis tasks that emphasized understanding spatial relationships, including characterizing the general features in a volume, identifying colocated features, and reporting geometric relationships such as whether clusters of cells were coplanar. After analyzing data in each environment, subjects were asked to choose which environment they wanted to analyze additional data sets in; subjects uniformly selected the cave environment.


IEEE Computer Graphics and Applications | 1997

Seamless interaction in virtual reality

Andrew S. Forsberg; Joseph J. LaViola; Lee Markosian; Robert C. Zeleznik

Jot is a novel research interface for virtual reality modeling. This system seamlessly integrates and applies a variety of virtual and physical tools, each customized for specific tasks. The Jot interface not only moves smoothly from one tool to another but also physically and cognitively matches individual tools to the tasks they perform. In particular, we exploit the notion that gestural interaction is more direct, in many cases, than traditional widget-based interaction. We also respect the time-tested observation that some operations, even conceptually three-dimensional ones, are better performed with 1D or 2D input devices, whereas other operations are more naturally performed using stereoscopic views, higher-DOF input devices, or both. Ultimately we strive for a 3D modeling system with an interface as transparent as the interaction afforded by a pencil and a sheet of paper. For example, the system should facilitate the tasks of drawing and erasing and provide an easy transition between the two. Jot emerged from our previous work on a mouse-based system, called Sketch, for gesturally creating imprecise 3D models. Jot extends Sketch's functionality to a wider spectrum of modeling, from concept design to detailed feature-based parametric parts. Jot also extends the interaction in Sketch to better support individual modeling tasks. We extended Sketch's gestural framework to integrate interface components ranging from traditional desktop interface widgets to context-sensitive gestures to direct manipulation techniques originally designed for immersive VR.


IEEE Visualization | 2000

Immersive virtual reality for visualizing flow through an artery

Andrew S. Forsberg; David H. Laidlaw; Andries van Dam; Robert M. Kirby; George Em Karniadakis; Jonathan L. Elion

We present an immersive system for exploring numerically simulated flow data through a model of a coronary artery graft. This tightly-coupled interdisciplinary project is aimed at understanding how to reduce the failure rate of these grafts. The visualization system provides a mechanism for exploring the effect of changes to the geometry, to the flow, and for exploring potential sources of future lesions. The system uses gestural and voice interactions exclusively, moving away from more traditional windows/icons/menus/point-and-click (WIMP) interfaces. We present an example session using the system and discuss our experiences developing, testing, and using it. We describe some of the interaction and rendering techniques that we experimented with and describe their level of success. Our experience suggests that systems like this are exciting to clinical researchers, but conclusive evidence of their value is not yet available.


Symposium on 3D User Interfaces | 2008

Poster: Effects of Head Tracking and Stereo on Non-Isomorphic 3D Rotation

Joseph J. LaViola; Andrew S. Forsberg; John Huffman; Andrew Bragdon

We present an experimental study that explores how head tracking and stereo affect user performance when rotating 3D virtual objects using isomorphic and non-isomorphic rotation techniques. Our experiment compares isomorphic with non-isomorphic rotation utilizing four different display modes (no head tracking/no stereo, head tracking/no stereo, no head tracking/stereo, and head tracking/stereo) and two different angular error thresholds for task completion. Our results indicate that rotation error is significantly reduced when subjects perform the task using non-isomorphic 3D rotation with head tracking/stereo than with no head tracking/no stereo. In addition, subjects performed the rotation task with significantly less error with head tracking/stereo and no head tracking/stereo than with no head tracking/no stereo, regardless of rotation technique. Subjects also highly rated the importance of stereo and non-isomorphic amplification in the 3D rotation task.
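Non-isomorphic rotation amplifies the user's physical rotation by a gain factor: the rotation axis is kept, but the angle is scaled. One common formulation, shown below as an illustrative sketch (the function name and the (w, x, y, z) quaternion convention are assumptions), extracts the angle-axis form of a unit quaternion and rebuilds it with the scaled angle.

```python
import math

def amplify_rotation(q, k):
    """Scale a unit quaternion's rotation angle by gain k about the same
    axis: an illustrative formulation of non-isomorphic amplification.
    q is (w, x, y, z); k = 1 reduces to isomorphic (one-to-one) rotation."""
    w, x, y, z = q
    angle = 2.0 * math.acos(max(-1.0, min(1.0, w)))  # clamp guards rounding
    s = math.sin(angle / 2.0)
    if s < 1e-9:
        return (1.0, 0.0, 0.0, 0.0)  # near-identity: axis is undefined
    axis = (x / s, y / s, z / s)
    half = k * angle / 2.0
    sw = math.sin(half)
    return (math.cos(half), axis[0] * sw, axis[1] * sw, axis[2] * sw)
```

With k > 1 a small wrist movement produces a large virtual rotation, which is what makes the amplified techniques compared in the study feel less physically demanding.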


Sketch-Based Interfaces and Modeling | 2008

An empirical study in pen-centric user interfaces: diagramming

Andrew S. Forsberg; Andrew Bragdon; Joseph J. LaViola; Sashi Raghupathy; Robert C. Zeleznik

We present a user study aimed at helping understand the applicability of pen computing in desktop environments. The study applied three mouse-and-keyboard-based and three pen-based interaction techniques to six variations of a diagramming task. We ran 18 subjects from a general population, and the key finding was that while the mouse-and-keyboard techniques were generally comparable to or faster than the pen techniques, subjects ranked the pen techniques higher and enjoyed them more. Our contribution is the results from a formal user study suggesting that pen user interfaces have broader applicability, and stronger subjective preference, than the niche PDA and mobile market they currently serve.

Collaboration


Dive into Andrew S. Forsberg's collaborations.

Top Co-Authors

Joseph J. LaViola

University of Central Florida
