Publications


Featured research published by Matthew Conway.


Human Factors in Computing Systems | 1995

Virtual reality on a WIM: interactive worlds in miniature

Richard W. Stoakley; Matthew Conway; Randy Pausch

This paper explores a user interface technique which augments an immersive head tracked display with a hand-held miniature copy of the virtual environment. We call this interface technique the Worlds in Miniature (WIM) metaphor. By establishing a direct relationship between life-size objects in the virtual world and miniature objects in the WIM, we can use the WIM as a tool for manipulating objects in the virtual environment. In addition to describing object manipulation, this paper explores ways in which Worlds in Miniature can act as a single unifying metaphor for such application independent interaction techniques as object selection, navigation, path planning, and visualization. The WIM metaphor naturally offers multiple points of view and multiple scales at which the user can operate, all without requiring explicit modes or commands. Informal user observation indicates that users adapt to the Worlds in Miniature metaphor quickly and that physical props are helpful in manipulating the WIM and other objects in the environment.
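
The key mechanism here is the coupling between each life-size object and its hand-held miniature: a drag applied to the miniature is scaled back up and applied to the world object. A rough sketch of that coupling, using hypothetical `WorldObject` and `WIMProxy` classes (the names and scale factor are illustrative, not taken from the paper):

```python
# Minimal sketch of the Worlds in Miniature (WIM) mapping: moving a
# miniature proxy moves the corresponding life-size object, scaled up
# by the inverse of the WIM's scale factor. Names are illustrative.

class WorldObject:
    def __init__(self, name, position):
        self.name = name
        self.position = list(position)  # x, y, z in world units


class WIMProxy:
    """A miniature copy of a world object, held in the user's hand."""

    def __init__(self, target, wim_scale=0.05):
        self.target = target            # the life-size object it stands for
        self.wim_scale = wim_scale      # WIM units per world unit
        self.position = [c * wim_scale for c in target.position]

    def drag(self, dx, dy, dz):
        """User drags the miniature; propagate the motion to the world."""
        self.position = [self.position[0] + dx,
                         self.position[1] + dy,
                         self.position[2] + dz]
        # Scale the miniature displacement back up to life size.
        self.target.position = [c / self.wim_scale for c in self.position]


chair = WorldObject("chair", position=(2.0, 0.0, -3.0))
proxy = WIMProxy(chair, wim_scale=0.05)
proxy.drag(0.05, 0.0, 0.0)   # a 5 cm drag in the WIM ...
print(chair.position)        # ... moves the chair 1 m in the world
```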


Interactive 3D Graphics and Games | 1997

Image plane interaction techniques in 3D immersive environments

Jeffrey S. Pierce; Andrew S. Forsberg; Matthew Conway; Seung Hong; Robert C. Zeleznik; Mark R. Mine

This paper presents a set of interaction techniques for use in head-tracked immersive virtual environments. With these techniques, the user interacts with the 2D projections that 3D objects in the scene make on his image plane. The desktop analog is the use of a mouse to interact with objects in a 3D scene based on their projections on the monitor screen. Participants in an immersive environment can use the techniques we discuss for object selection, object manipulation, and user navigation in virtual environments. CR Categories and Subject Descriptors: I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction Techniques; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism - Virtual Reality. Additional Keywords: virtual worlds, virtual environments, navigation, selection, manipulation.
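
A minimal way to illustrate image-plane selection is to project each object's position onto the image plane and pick whichever projection lies closest to the 2D cursor; the sketch below does only that, while a real system would cast a ray against full geometry. The names, scene data, and pinhole model are assumptions for illustration:

```python
# Rough sketch of image-plane selection: each 3D object is reduced to the
# 2D point it projects to on the user's image plane, and the object whose
# projection lies closest to the 2D cursor is selected. A real system
# would ray-cast against full geometry; this only compares object centers.
import math

def project(point, focal_length=1.0):
    """Pinhole projection of an eye-space point (objects in front have z < 0)."""
    x, y, z = point
    return (focal_length * x / -z, focal_length * y / -z)

def pick(objects, cursor, focal_length=1.0):
    """Return the name of the object whose projection is nearest the cursor."""
    best_name, best_dist = None, float("inf")
    for name, position in objects.items():
        u, v = project(position, focal_length)
        dist = math.hypot(u - cursor[0], v - cursor[1])
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

scene = {"lamp": (0.5, 0.2, -2.0), "table": (-1.0, 0.0, -4.0)}
print(pick(scene, cursor=(0.24, 0.1)))   # -> "lamp"
```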


User Interface Software and Technology | 1996

3D magic lenses

John Viega; Matthew Conway; George H. Williams; Randy Pausch

This work extends the metaphor of a see-through interface embodied in Magic Lenses™ to 3D environments. We present two new see-through visualization techniques: flat lenses in 3D and volumetric lenses. We discuss implementation concerns for platforms that have programmer-accessible hardware clipping planes and show several examples of each visualization technique. We also examine composition of multiple lenses in 3D environments, which strengthens the flat lens metaphor, but may have no meaningful semantics in the case of volumetric lenses.
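
A volumetric lens can be pictured as the intersection of six clipping half-spaces, with geometry inside the region drawn in an alternate style. The sketch below shows just that containment test, independent of any graphics API; the plane representation and the "x-ray" style are illustrative assumptions, not the paper's implementation:

```python
# Small sketch of the clipping-plane view of 3D magic lenses: a lens
# region is the intersection of half-spaces (a*x + b*y + c*z + d >= 0),
# and geometry falling inside the region is drawn with an alternate
# (e.g. wireframe or x-ray) representation. No graphics API is used here.

def inside(point, planes):
    """True if the point satisfies every half-space a*x + b*y + c*z + d >= 0."""
    x, y, z = point
    return all(a * x + b * y + c * z + d >= 0 for a, b, c, d in planes)

# An axis-aligned volumetric lens: the unit cube centred at the origin,
# expressed as six clipping planes.
volumetric_lens = [
    ( 1, 0, 0, 0.5), (-1, 0, 0, 0.5),   # -0.5 <= x <= 0.5
    ( 0, 1, 0, 0.5), ( 0,-1, 0, 0.5),   # -0.5 <= y <= 0.5
    ( 0, 0, 1, 0.5), ( 0, 0,-1, 0.5),   # -0.5 <= z <= 0.5
]

def style_for(point):
    return "x-ray" if inside(point, volumetric_lens) else "normal"

print(style_for((0.1, 0.0, 0.2)))   # -> "x-ray"  (inside the lens)
print(style_for((2.0, 0.0, 0.0)))   # -> "normal" (outside the lens)
```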


Human Factors in Computing Systems | 2000

Alice: lessons learned from building a 3D system for novices

Matthew Conway; Steve Audia; Tommy Burnette; Dennis Cosgrove; Kevin Christiansen

We present lessons learned from developing Alice, a 3D graphics programming environment designed for undergraduates with no 3D graphics or programming experience. Alice is a Windows 95/NT tool for describing the time-based and interactive behavior of 3D objects, not a CAD tool for creating object geometry. Our observations and conclusions come from formal and informal observations of hundreds of users. Primary results include the use of LOGO-style egocentric coordinate systems, the use of arbitrary objects as lightweight coordinate systems, the launching of implicit threads of execution, extensive function overloading for a small set of commands, the careful choice of command names, and the ubiquitous use of animation and undo.
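
The egocentric coordinate result is the easiest to illustrate: "forward" means forward relative to the object itself, so the same command produces different world-space motion depending on the object's heading. The toy class below sketches only that idea; it is not the actual Alice API, and the command names and axis conventions are illustrative:

```python
# Toy sketch of LOGO-style egocentric commands: "forward" is interpreted
# in the object's own coordinate system, so the same command means
# different world-space motion depending on which way the object faces.
# This is an illustration, not the actual Alice API.
import math

FORWARD, LEFT = "forward", "left"

class Thing:
    def __init__(self, name):
        self.name = name
        self.x, self.z = 0.0, 0.0
        self.heading = 0.0            # degrees; 0 faces down the +z axis

    def turn(self, direction, revolutions):
        sign = 1 if direction == LEFT else -1
        self.heading += sign * revolutions * 360.0

    def move(self, direction, meters):
        # Only "forward" is sketched here; Alice overloaded a small set of
        # commands like move() for every direction.
        assert direction == FORWARD
        rad = math.radians(self.heading)
        self.x += meters * math.sin(rad)
        self.z += meters * math.cos(rad)

bunny = Thing("bunny")
bunny.move(FORWARD, 1)        # heads down +z
bunny.turn(LEFT, 0.25)        # quarter turn
bunny.move(FORWARD, 1)        # now heads down +x
print(round(bunny.x, 2), round(bunny.z, 2))   # -> 1.0 1.0
```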


Presence: Teleoperators & Virtual Environments | 1992

A literature survey for virtual environments: Military flight simulator visual systems and simulator sickness

Randy Pausch; Thomas Crea; Matthew Conway

Researchers in the field of virtual environments (VE), or virtual reality, surround a participant with synthetic stimuli. The flight simulator community, primarily in the U.S. military, has a great deal of experience with aircraft simulations, and VE researchers should be aware of the major results in this field. In this survey of the literature, we have especially focused on military literature that may be hard for traditional academics to locate via the standard journals. One of the authors of this paper is a military helicopter pilot himself, which was quite useful in obtaining access to many of our references. We concentrate on research that produces specific, measured results that apply to VE research. We assume no background other than basic knowledge of computer graphics, and explain simulator terms and concepts as necessary. This paper ends with an annotated bibliography of some harder-to-find research results in the field of flight simulators:

• The effects of display parameters, including field-of-view and scene complexity;
• The effect of lag in system response;
• The effect of refresh rate in graphics update;
• The existing theories on causes of simulator sickness; and
• The after-effects of simulator use.

Many of the results we cite are contradictory. Our global observation is that with flight simulator research, like most human-computer interaction research, there are very few correct answers. Almost always, the answer to a specific question depends on the task the user was attempting to perform with the simulator.


User Interface Software and Technology | 1999

The VideoMouse: a camera-based multi-degree-of-freedom input device

Ken Hinckley; Michael J. Sinclair; Erik Hanson; Richard Szeliski; Matthew Conway

The VideoMouse is a mouse that uses a camera as its input sensor. A real-time vision algorithm determines the six degree-of-freedom mouse posture, consisting of 2D motion, tilt in the forward/back and left/right axes, rotation of the mouse about its vertical axis, and some limited height sensing. Thus, a familiar 2D device can be extended for three-dimensional manipulation, while remaining suitable for standard 2D GUI tasks. We describe techniques for mouse functionality, 3D manipulation, navigating large 2D spaces, and using the camera for lightweight scanning tasks.
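
One way to picture the six reported degrees of freedom is as a posture record whose fields each drive one degree of freedom of the manipulated object. The sketch below is such a mapping under invented field names and gains; it is not the paper's vision algorithm or driver API:

```python
# Sketch of how a six-degree-of-freedom mouse posture (as a VideoMouse-style
# sensor might report it) could drive 3D manipulation: planar motion
# translates the object, tilt and twist rotate it, and the limited height
# signal lifts it. Field names and gains are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MousePosture:
    dx: float = 0.0        # 2D motion on the desk
    dy: float = 0.0
    tilt_fb: float = 0.0   # forward/back tilt, degrees
    tilt_lr: float = 0.0   # left/right tilt, degrees
    twist: float = 0.0     # rotation about the vertical axis, degrees
    height: float = 0.0    # limited lift off the desk, millimetres

@dataclass
class Object3D:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0
    yaw: float = 0.0

def apply(posture, obj, gain=1.0):
    """Map each degree of freedom of the mouse onto the manipulated object."""
    obj.x += gain * posture.dx
    obj.y += gain * posture.dy
    obj.z += gain * posture.height
    obj.pitch += posture.tilt_fb
    obj.roll += posture.tilt_lr
    obj.yaw += posture.twist
    return obj

print(apply(MousePosture(dx=0.1, twist=15.0, height=2.0), Object3D()))
```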


Interactive 3D Graphics and Games | 1999

Toolspaces and glances: storing, accessing, and retrieving objects in 3D desktop applications

Jeffrey S. Pierce; Matthew Conway; Maarten van Dantzich; George G. Robertson

Users of 3D desktop applications perform tasks that require accessing data storage, moving objects, and navigation. These operations are typically performed using 2D GUI elements or 3D widgets. We wish to focus on interaction with 3D widgets directly in the 3D world, rather than forcing our users to repeatedly switch contexts between 2D and 3D. However, the use of 3D widgets requires a mechanism for storing, accessing, and retrieving these widgets. In this paper we present toolspaces and glances to provide this capability for 3D widgets and other objects in interactive 3D worlds. Toolspaces are storage spaces attached to the user’s virtual body; objects placed in these spaces are always accessible yet out of the user’s view until needed. Users access these toolspaces to store and retrieve objects through a type of lightweight and ephemeral navigation we call glances.
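
The essential property of a toolspace is that its location is defined in the user's body frame, so stored widgets travel with the user yet stay out of view, and a glance is a brief, automatically reversed look toward one of them. The sketch below illustrates only that bookkeeping; the offsets, space names, and `User` class are invented for the example:

```python
# Sketch of the toolspace-and-glance idea: storage locations are fixed
# relative to the user's virtual body (e.g. over a shoulder or at the
# belt), so they travel with the user but stay out of view, and a "glance"
# is a brief, automatically reversed look toward one of them.

TOOLSPACE_OFFSETS = {
    "left_shoulder":  (-0.3, 0.2, -0.2),   # offsets in the body's frame
    "right_shoulder": ( 0.3, 0.2, -0.2),
    "belt":           ( 0.0, -0.4, 0.1),
}

class User:
    def __init__(self):
        self.position = (0.0, 1.7, 0.0)      # head position in the world
        self.toolspaces = {name: [] for name in TOOLSPACE_OFFSETS}
        self.view_target = None              # None means "looking at the scene"

    def store(self, space, widget):
        self.toolspaces[space].append(widget)

    def glance(self, space):
        """Briefly look into a toolspace, return its contents, then snap back."""
        ox, oy, oz = TOOLSPACE_OFFSETS[space]
        px, py, pz = self.position
        self.view_target = (px + ox, py + oy, pz + oz)   # look toward the space
        contents = list(self.toolspaces[space])
        self.view_target = None                          # glance ends: snap back
        return contents

user = User()
user.store("belt", "color-picker widget")
print(user.glance("belt"))   # -> ['color-picker widget']
```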


Human Factors in Computing Systems | 1994

Alice: a rapid prototyping system for building virtual environments

Matthew Conway; Randy Pausch; Rich Gossweiler; Tommy Burnette

Alice is a rapid prototyping system used to create three-dimensional graphics simulations like those seen in virtual reality applications. Alice uses an interpreted language called Python to implement the semantics of user actions. This interactive development environment allows users to explore many more design options than is possible in a compiled language environment.
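
The rapid-prototyping claim rests on behavior being ordinary interpreted code: a handler attached to a user action can be swapped while the simulation runs, with no compile step. The sketch below illustrates that style in plain Python; it is not the historical Alice API, and the event names are invented:

```python
# Illustrative sketch (not the historical Alice API) of why an interpreted
# language suits rapid prototyping: the behavior attached to a user action
# is an ordinary Python function, so it can be replaced while the
# simulation is running instead of recompiling the application.

class Simulation:
    def __init__(self):
        self.handlers = {}          # event name -> callable

    def on(self, event, handler):
        self.handlers[event] = handler

    def fire(self, event):
        handler = self.handlers.get(event)
        if handler:
            handler()

sim = Simulation()
sim.on("mouse_click", lambda: print("door opens"))
sim.fire("mouse_click")                       # -> door opens

# The designer tries a different behavior without restarting anything:
sim.on("mouse_click", lambda: print("door creaks, then opens"))
sim.fire("mouse_click")                       # -> door creaks, then opens
```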


Human Factors in Computing Systems | 1994

Toolglass and magic lenses: the see-through interface

Eric A. Bier; Maureen C. Stone; Kenneth A. Pier; Kenneth P. Fishkin; Thomas Baudel; Matthew Conway; William Buxton; Tony DeRose

Toolglass widgets are new user interface tools that can appear, as though on a transparent sheet of glass, between an application and a traditional cursor. They can be positioned with one hand while the other positions the cursor. The widgets provide a rich and concise vocabulary for operating on application objects. These widgets may incorporate visual filters, called Magic Lens filters, that modify the presentation of application objects to reveal hidden information, to enhance data of interest, or to suppress distracting information. Together, these tools form a see-through interface that offers many advantages over traditional controls. They provide a new style of interaction that better exploits the user’s everyday skills. They can reduce steps, cursor motion, and errors. Many widgets can be provided in a user interface, by designers and by users, without requiring dedicated screen space. In addition, lenses provide rich context-dependent feedback and the ability to view details and context simultaneously. Our widgets and lenses can be combined to form operation and viewing macros, and can be used over multiple applications. CR


Presence: Teleoperators & Virtual Environments | 1994

One-dimensional motion tailoring for the disabled: A user study

Matthew Conway; Laura Vogtle; Randy Pausch

The Tailor project allows physically disabled users to provide real-time analog input to computer applications. We use a Polhemus™ tracking device and create a custom-tailored mapping from each user's best range and type of motion into the analog control signal. The application is a simple video game based on Pong, where the analog input controls the position of the player's paddle. A group of able-bodied subjects was able to correctly hit the ball with the paddle 77% of the time, and a comparison group of children with cerebral palsy performed at the 50% level. More than half the disabled users were able to perform at a higher level than the worst able-bodied user.
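
The custom mapping amounts to normalizing whatever motion range a given user can produce into the full range of the paddle. A small sketch of that calibration follows; the axis, units, and calibration numbers are invented for the example and not taken from the paper:

```python
# Sketch of a per-user motion mapping: whatever range of motion the tracker
# reports for a given user (here along one axis, in centimetres) is
# normalised into the full range of the paddle. Calibration numbers are
# invented for the example.

def make_mapping(user_min, user_max, paddle_min=0.0, paddle_max=1.0):
    """Return a function mapping the user's comfortable range onto the paddle."""
    span = user_max - user_min
    def to_paddle(sensor_value):
        t = (sensor_value - user_min) / span
        t = max(0.0, min(1.0, t))              # clamp outside the calibrated range
        return paddle_min + t * (paddle_max - paddle_min)
    return to_paddle

# A user whose comfortable wrist motion spans only 3 cm still gets the
# full paddle range:
to_paddle = make_mapping(user_min=12.0, user_max=15.0)
print(to_paddle(12.0))   # -> 0.0 (paddle at the bottom)
print(to_paddle(13.5))   # -> 0.5 (paddle centred)
print(to_paddle(16.0))   # -> 1.0 (clamped at the top)
```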

Collaboration


Dive into Matthew Conway's collaborations.

Top Co-Authors

Randy Pausch (Carnegie Mellon University)
Dennis Cosgrove (Carnegie Mellon University)
Tommy Burnette (Carnegie Mellon University)
Jeffrey S. Pierce (Georgia Institute of Technology)
Jim Durbin (United States Naval Research Laboratory)