Steffi Beckhaus
University of Hamburg
Publications
Featured research published by Steffi Beckhaus.
virtual reality software and technology | 2006
Ernst Kruijff; Dieter Schmalstieg; Steffi Beckhaus
This paper focuses on the use of neuromuscular electrical stimulation (NMES) for achieving pseudo-haptic feedback. By stimulating the motor nerves, muscular contractions can be triggered that can be matched to a haptic event. Reflecting on an initial user test, we explain how this process can be realized by investigating the physiological processes involved. Relating the triggered feedback to general haptics, we identify its potential in future interfaces and lay it out in a development roadmap.
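As a rough illustration of the idea sketched in this abstract, the following Python snippet maps a virtual haptic event onto bounded stimulation parameters. The StimDevice class, its send_pulse() method, and the scaling constants are hypothetical placeholders, not an API from the paper or from any real stimulator SDK.

```python
# Illustrative sketch only: mapping a virtual haptic event to NMES stimulation
# parameters. StimDevice and send_pulse() are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class HapticEvent:
    intensity: float   # normalized 0..1, e.g. collision strength
    duration_s: float  # how long the virtual contact lasts

class StimDevice:
    """Hypothetical NMES driver stub; replace with real hardware bindings."""
    def send_pulse(self, amplitude_ma: float, duration_s: float) -> None:
        print(f"stimulate {amplitude_ma:.1f} mA for {duration_s:.2f} s")

def trigger_pseudo_haptics(event: HapticEvent, device: StimDevice,
                           max_safe_ma: float = 20.0) -> None:
    # Scale virtual event intensity into a bounded stimulation amplitude.
    amplitude = min(event.intensity, 1.0) * max_safe_ma
    device.send_pulse(amplitude, event.duration_s)

trigger_pseudo_haptics(HapticEvent(intensity=0.6, duration_s=0.15), StimDevice())
```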
Computer Graphics Forum | 2001
Steffi Beckhaus; Felix Ritter; Thomas Strothotte
Exploring unknown models or scenes is a highly interactive and dynamic process. Systems for automatic presentation of models or scenes require cinematographic rules, direct human interaction, framesets, or pre‐calculation of paths to a known goal. In this paper we present a system which can deal with rapidly changing user interest in objects of a scene or model as well as with dynamic models and changes of the camera position introduced interactively by the user or through cuts. We describe CubicalPath, a new potential field‐based camera control system that helps with the exploration of virtual environments.
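The sketch below conveys the general flavor of potential-field-based camera control: objects of interest act as attractive wells, obstacles as repulsive peaks, and the camera descends the field's gradient. This is a simplified illustration under assumed field definitions and step sizes, not CubicalPath's actual algorithm.

```python
# Minimal illustration of potential-field-based camera motion.
# Field shape, gains, and step size are assumptions for the sketch.
import numpy as np

def potential(cam, targets, obstacles, k_att=1.0, k_rep=0.5):
    """Attractive wells at objects of interest, repulsive peaks at obstacles."""
    p = sum(k_att * np.linalg.norm(cam - t) ** 2 for t in targets)
    p += sum(k_rep / (np.linalg.norm(cam - o) + 1e-6) for o in obstacles)
    return p

def gradient_step(cam, targets, obstacles, step=0.05, eps=1e-4):
    """Move the camera a small step along the negative numerical gradient."""
    grad = np.zeros(3)
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        grad[i] = (potential(cam + d, targets, obstacles)
                   - potential(cam - d, targets, obstacles)) / (2 * eps)
    return cam - step * grad / (np.linalg.norm(grad) + 1e-9)

cam = np.array([0.0, 0.0, 5.0])
targets = [np.array([2.0, 0.0, 0.0])]    # objects the user is interested in
obstacles = [np.array([1.0, 0.0, 2.5])]  # geometry the camera should avoid
for _ in range(10):
    cam = gradient_step(cam, targets, obstacles)
```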
international conference on computer graphics and interactive techniques | 2004
Steffi Beckhaus; Ernst Kruijff
This course focuses on how we can use the potential of the human body in experimental or unconventional interface techniques. It explores the biological or physiological characteristics of the separate parts of the body, from head to toe and from skin to heart, showing how their sensor (input) and control (output) capabilities can be applied to human-computer interfaces. We demonstrate a wide variety of applications that make use of proven interfaces as well as extremely experimental systems. Example systems vary from desktop-based to mixed and virtual reality, with applications from areas like art, entertainment, and science. Participants will learn to look beyond the restrictions bound to traditional multimedia systems by discovering and understanding how the human body can deliver great potential for new kinds of applications and systems. Theory on the human body and practical knowledge of (hardware) interfaces is balanced with a good portion of examples to provide a foundation for assessing and using experimental and unconventional interaction.
virtual reality software and technology | 2009
Robert W. Lindeman; Steffi Beckhaus
Much of Virtual Reality (VR) is about creating virtual worlds that are believable. But though the visual and audio experiences we provide today technically approach the limits of human sensory systems, there is still something lacking; something beyond sensory fidelity hinders us from fully buying into the worlds we experience through VR technology. We introduce the notion of Experiential Fidelity, which is an attempt to create a deeper sense of presence by carefully designing the user experience. We suggest guiding the user's frame of mind so that their expectations, attitude, and attention are aligned with the actual VR experience, and so that the user's own imagination is stimulated to complete the experience. We propose to do this by structuring the time prior to exposure to increase anticipation, expectation, and the like.
symposium on 3d user interfaces | 2010
Kristopher J. Blom; Steffi Beckhaus
Virtual collisions are reportedly an important part of creating effective experiences of virtual environments. Although they are considered vital, collision responses for travel in a virtual environment are not well understood. The effectiveness of methods for notifying users of collisions has not been explored in the context of travel, and the methods used are often not even reported. We present novel notification methods, based on haptic feedback via an output device embedded in the floor of our display, and an initial study that compares nine notification methods. In the comparative study, our haptic floor feedback methods were preferred, followed by a thump sound and a wand device rumble. The results indicate that methods that are context appropriate, e.g. haptic responses and audio cues similar to real collisions with the object, are clearly preferable for realistic impressions of a world and of collisions.
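To illustrate the kind of setup the abstract describes, the sketch below dispatches a single collision event during travel to several feedback channels (floor haptics, a thump sound, controller rumble). The channel classes, the penetration-depth mapping, and the 0.3 m constant are invented for the sketch and do not reflect the study's actual devices or parameters.

```python
# Sketch: dispatch a travel collision to several notification channels.
from typing import Protocol

class NotificationMethod(Protocol):
    def notify(self, strength: float) -> None: ...

class FloorHaptics:
    def notify(self, strength: float) -> None:
        print(f"floor actuators pulse at {strength:.2f}")

class ThumpSound:
    def notify(self, strength: float) -> None:
        print(f"play thump sample, gain {strength:.2f}")

class WandRumble:
    def notify(self, strength: float) -> None:
        print(f"wand rumble, amplitude {strength:.2f}")

def on_collision(penetration_depth: float,
                 methods: list[NotificationMethod]) -> None:
    # Map how far the user pushed into the wall onto a 0..1 feedback strength.
    strength = min(penetration_depth / 0.3, 1.0)
    for m in methods:
        m.notify(strength)

on_collision(0.12, [FloorHaptics(), ThumpSound(), WandRumble()])
```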
virtual reality software and technology | 2007
Kristopher J. Blom; Steffi Beckhaus
Virtual Reality's expanding adoption makes the creation of more interesting dynamic, interactive environments necessary in order to meet the expectations of users accustomed to modern computer games. In this paper, we present initial explorations of using the recently developed Functional Reactive Programming paradigm to support the creation of such environments. The Functional Reactive Programming paradigm supports these actions by providing tools that match both the user's perception of the dynamics of the world and the underlying hybrid nature of such environments. Continuous functions with explicit time dependencies describe the dynamic behaviors of the environment, and discrete event mechanisms provide for modifying the active behaviors of the environment. Initial examples show how this paradigm can be used to control dynamic, interactive Virtual Environments.
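The toy Python sketch below is only meant to convey the two ingredients the abstract names: continuous behaviors as functions of time and discrete events that switch the active behavior. The door example, the Switcher class, and the behaviors are assumptions for illustration; the paper itself builds on a proper Functional Reactive Programming system rather than this hand-rolled approximation.

```python
# Toy sketch: continuous behaviors over time plus discrete switching events.
import math
from typing import Callable

Behavior = Callable[[float], float]  # maps time (s) to a value

def swing(t: float) -> float:        # continuous motion of a door angle
    return 45.0 * math.sin(t)

def hold_open(t: float) -> float:    # behavior after an "open door" event
    return 90.0

class Switcher:
    """Runs one behavior until a discrete event replaces it with another."""
    def __init__(self, initial: Behavior) -> None:
        self.current = initial
    def on_event(self, new_behavior: Behavior) -> None:
        self.current = new_behavior
    def sample(self, t: float) -> float:
        return self.current(t)

door = Switcher(swing)
print(door.sample(1.0))   # continuous dynamics sampled at simulation time
door.on_event(hold_open)  # discrete event, e.g. a user interaction
print(door.sample(2.0))
```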
tangible and embedded interaction | 2008
Steffi Beckhaus; Roland Schröder-Kroll; Martin Berghoff
We present a novel, tangible interface demonstrated by means of the artwork GranulatSynthese, an installation for the intuitive, tangible creation of ambient, meditative audio-visuals. The interface uses granules distributed over a tabletop surface and combines them with rear-projected visuals and dynamically selected sound samples. The haptic landscape can be explored with the hands, shaped into both hills and open space, and composed intuitively. Form, position, and size of cleared table areas control parameters of the computer-generated audio-visuals. GranulatSynthese is a meditative application, which invites visitors either to play or to step back and watch the visuals and sounds evolve. The installation has proven very accessible, inviting and absorbing many visitors for long periods of time.
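As a rough illustration of the kind of mapping the abstract describes, the sketch below turns a detected cleared region (position and size on the tabletop) into sound parameters. The ClearedRegion fields and the specific parameter names are assumptions made for this sketch, not the installation's actual mapping.

```python
# Illustrative mapping from a cleared table region to sound parameters.
from dataclasses import dataclass

@dataclass
class ClearedRegion:
    x: float      # normalized 0..1 horizontal position on the tabletop
    y: float      # normalized 0..1 vertical position
    area: float   # normalized 0..1 fraction of the table surface

def region_to_sound(region: ClearedRegion) -> dict:
    return {
        "sample_index": int(region.x * 8),         # which sound sample to play
        "pan": region.x * 2.0 - 1.0,               # stereo position
        "volume": min(region.area * 4.0, 1.0),     # larger clearings are louder
        "filter_cutoff_hz": 200 + region.y * 4000, # vertical position shapes timbre
    }

print(region_to_sound(ClearedRegion(x=0.7, y=0.4, area=0.1)))
```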
Teleoperators and Virtual Environments | 2012
Matthias Haringer; Steffi Beckhaus
In this paper we introduce novel methods of intensifying and varying the user experience in virtual environments (VEs). VEs technically have numerous means for crafting the user experience. Little has yet been done to evaluate those means of expression (MoEs) for their emotional impact on people and to use their capability to create different experiences and subtly guide the user. One of the reasons is that this requires a system which is capable of easily and dynamically providing those MoEs in such a way that they can easily be composed, evaluated, and compared between applications and users. In the following, we first introduce our model of both the informational and emotional impact of VEs on users, present our dynamic, expressive VR system, and describe our novel evaluation and rating method for MoEs. MoEs can be used to guide attention to specific objects or to build up an emotion or mood over time. We then present a study in which users experience 30 selected MoEs and rate their qualitative emotional impact using this rating method. We found that different MoEs can be used to elicit many diverse emotions, which were surprisingly consistent among the test persons. With these results, our work enables new ways to make VEs more interesting and emotionally engaging, especially over a longer period of time, opening new possibilities, for example, to increase the motivation for long, stressful, and tiresome training as in neurorehabilitation.
virtual reality software and technology | 2008
Nicolai Hess; Jan D. S. Wischweh; Kirsten Albrecht; Kristopher J. Blom; Steffi Beckhaus
The design and implementation of interactions in 3D environments remains a challenge. This is especially true for novices. Mechanisms to support the creation of interaction have been developed, but they lack a central metaphor that fits the natural way in which developers conceptualize interaction techniques. In this paper, we introduce a new framework whose design mirrors the essence of interaction throughout the Virtual Reality spectrum, where the user is literally in the center. It also reflects the way in which interactions are actually understood and described, based on the interactor and her actions. Based on the central metaphor of the interactor, an implementation that is composed of three phases is developed. Those phases are: input retrieval and shaping, interpretation of user intentions, and execution of changes to the environment. Through these divisions, software requirements like composition and reusability of components are satisfied. The resultant system ACTIF, an ACTor centric Interaction Framework, structures interaction development in a meaningful and understandable way and at the same time eases the design and creation of new and experimental interactions.
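The sketch below mirrors the three-phase structure the abstract describes: input retrieval and shaping, interpretation of the interactor's intention, and execution of changes to the environment. The concrete classes, the "grab" intention, and the scene dictionary are invented here for illustration; they are not ACTIF's actual API.

```python
# Sketch of a three-phase interaction pipeline: shape input, interpret
# intention, execute changes to the environment.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RawInput:
    hand_position: tuple   # e.g. from a tracking device
    button_pressed: bool

@dataclass
class Intention:
    kind: str              # e.g. "grab" or "none"
    target: Optional[str]

def shape_input(raw: RawInput) -> RawInput:
    # Phase 1: filter/normalize device data (identity here for brevity).
    return raw

def interpret(shaped: RawInput) -> Intention:
    # Phase 2: decide what the interactor is trying to do.
    if shaped.button_pressed:
        return Intention(kind="grab", target="nearest_object")
    return Intention(kind="none", target=None)

def execute(intention: Intention, scene: dict) -> None:
    # Phase 3: apply the resulting change to the environment.
    if intention.kind == "grab":
        scene["held"] = intention.target

scene = {"held": None}
execute(interpret(shape_input(RawInput((0.1, 1.2, 0.4), True))), scene)
print(scene)  # {'held': 'nearest_object'}
```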
EGVE (Short Papers & Posters) | 2007
Kristopher J. Blom; Steffi Beckhaus
In this paper we introduce a VR system extension that focuses on the creation of interactive, dynamic Virtual Environments. The extension uses the recently developed programming concept, Functional Reactive Programming. This paradigm focuses on an explicit and more natural concept of time in the modeling of dynamics, without sacrificing interactivity. We present an implementation that embeds the Functional Reactive Programming concept into a basic Virtual Reality system, VR Juggler.