Wolfgang Büschel
Dresden University of Technology
Publication
Featured research published by Wolfgang Büschel.
Interactive Tabletops and Surfaces | 2012
Martin Spindler; Wolfgang Büschel; Raimund Dachselt
Tangible Windows are a novel concept for interacting with virtual 3D information spaces in a workbench-like multi-display environment. They allow for performing common 3D interaction tasks in a more accessible manner by combining principles of tangible interaction, head-coupled perspective, and multi-touch techniques. Tangible Windows unify the interaction and representation space in a single device. They either act as physical peepholes into a virtual 3D world or as physical containers for parts of that world and are well-suited for the collaborative exploration and manipulation of such information spaces. One important feature of Tangible Windows is that the use of obtrusive hardware, such as HMDs, is strictly avoided. Instead, lightweight paper-based displays are used. We present different techniques for canonical 3D interaction tasks such as viewport control or object selection and manipulation, based on the combination of independent input modalities. We tested these techniques on a self-developed prototype system and received promising early user feedback.
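Head-coupled perspective, as used by Tangible Windows, renders the scene through an asymmetric viewing frustum derived from the tracked eye position relative to the display. A minimal sketch of that frustum computation, assuming a screen centered at the origin in the z=0 plane with the eye at z > 0 (names and conventions are illustrative, not the authors' implementation):

```python
def off_axis_frustum(eye, screen_half_w, screen_half_h, near):
    """Asymmetric frustum bounds at the near plane for an eye tracked
    relative to the screen center. Illustrative sketch: the screen lies
    in the z=0 plane, the eye at (ex, ey, ez) with ez > 0."""
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_half_w - ex) * scale
    right  = ( screen_half_w - ex) * scale
    bottom = (-screen_half_h - ey) * scale
    top    = ( screen_half_h - ey) * scale
    return left, right, bottom, top
```

With the eye centered in front of the screen the frustum is symmetric; as the tracked head moves off-axis, the bounds shear accordingly, which is what creates the "peephole into a 3D world" effect.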
Interactive Tabletops and Surfaces | 2014
Ulrich von Zadow; Wolfgang Büschel; Ricardo Langner; Raimund Dachselt
We present SleeD, a touch-sensitive Sleeve Display that facilitates interaction with multi-touch display walls. Large vertical displays allow multiple users to interact effectively with complex data but are inherently public. Also, they generally cannot present an interface adapted to the individual user. The combination with an arm-mounted, interactive display allows complex personalized interactions. In contrast to hand-held devices, both hands remain free for interacting with the wall. We discuss different levels of coupling between wearable and wall and propose novel user interface techniques that support user-specific interfaces, data transfer, and arbitrary personal views. In an iterative development process, we built a mock-up using a bendable e-Ink display and a fully functional prototype based on an arm-mounted smartphone. In addition, we developed several applications that showcase the techniques presented. An observational study we conducted demonstrates the high potential of our concepts.
Ubiquitous Computing | 2014
Martin Spindler; Wolfgang Büschel; Charlotte Winkler; Raimund Dachselt
Spatially aware handheld displays are a promising approach to interact with complex information spaces in a more natural way by extending the interaction space from the 2D surface to the 3D physical space around them. This is achieved by utilizing their spatial position and orientation for interaction purposes. Technical solutions for spatially tracked displays already exist in research laboratories, e.g., embedded in a tabletop environment. Along with a large stationary screen, such multi-display systems provide a rich design space with a variety of benefits to users, e.g., the explicit support of co-located parallel work and collaboration. As we see a great future in the underlying interaction principles, the question is how the technology can be made accessible to the public. With our work, we want to address this issue. In the long term, we envision a low-cost tangible display ecosystem that is suitable for everyday usage and supports both active displays (e.g., the iPad) and passive projection media (e.g., paper screens and everyday objects such as a mug). The two major contributions of this article are a presentation of an exciting design space and a requirement analysis regarding its technical realization with special focus on a broad adoption by the public. In addition, we present a proof of concept system that addresses one technical aspect of this ecosystem: the spatial tracking of tangible displays with a consumer depth camera (Kinect).
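Spatial tracking with a consumer depth camera such as the Kinect ultimately rests on back-projecting depth pixels into 3D camera space via the pinhole model. A minimal sketch, assuming calibrated intrinsics (parameter names and values are illustrative, not the system's actual calibration):

```python
def depth_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a depth pixel (u, v) with metric depth into camera
    space using the pinhole model. The intrinsics (focal lengths fx, fy
    and principal point cx, cy) would come from the depth camera's
    calibration; the values used below are illustrative."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```

Fitting a plane to the back-projected points of a tracked display surface then yields its position and orientation in space.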
International Symposium on Mixed and Augmented Reality | 2014
Kai Rohmer; Wolfgang Büschel; Raimund Dachselt; Thorsten Grosch
Mobile devices are becoming increasingly important, especially for augmented reality (AR) applications in which the camera of the mobile device acts as a window into the mixed-reality world. To date, photorealistic augmentation has not been possible because the computational power of mobile devices is still too limited. Even a streaming solution from a stationary PC would cause a latency that affects user interactions considerably. Therefore, we introduce a differential illumination method that allows for a consistent illumination of the inserted virtual objects on mobile devices while avoiding such delays. The necessary computation effort is shared between a stationary PC and the mobile devices to make use of the capacities available on both sides. The method is designed such that only a minimum amount of data has to be transferred asynchronously between the stationary PC and one or multiple mobile devices. This allows for an interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination conditions. To describe the complex near-field illumination in an indoor scenario, multiple HDR video cameras are used to capture the illumination from multiple directions. In this way, sources of illumination can be considered that are not directly visible to the mobile device because of occlusions and the limited field of view of built-in cameras.
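Inserting virtual objects consistently into a real camera image is commonly done via differential rendering: the camera image is modified by the difference between renderings of the scene with and without the virtual object, so shadows and bounce light carry over. A hedged per-pixel sketch of that general idea (not the paper's specific formulation; scalar pixel values for brevity):

```python
def differential_composite(camera_px, with_obj_px, without_obj_px, mask):
    """Per-pixel differential composite. Inside the virtual-object mask,
    take the synthetic rendering directly; elsewhere add the difference
    between the scene rendered with and without the object (capturing
    its shadows and indirect light) to the real camera pixel."""
    out = []
    for cam, w, wo, covered in zip(camera_px, with_obj_px, without_obj_px, mask):
        if covered:  # pixel shows the virtual object itself
            out.append(w)
        else:        # real pixel, modulated by the object's light transport
            out.append(max(0.0, cam + (w - wo)))
    return out
```

In the second pixel of the test below, the object darkens the scene (a shadow), so the camera value is reduced by the rendered difference.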
Designing Interactive Systems | 2014
Juan David Hincapié-Ramos; Sophie Roscher; Wolfgang Büschel; Ulrike Kister; Raimund Dachselt; Pourang Irani
As a novel class of mobile devices with rich interaction capabilities we introduce tPads -- transparent display tablets. tPads are the result of a systematic design investigation into the ways and benefits of interacting with transparent mobiles which goes beyond traditional mobile interactions and augmented reality (AR) applications. Through a user-centered design process we explored interaction techniques for transparent-display mobiles and classified them into four categories: overlay, dual display & input, surface capture and model-based interactions. We investigated the technical feasibility of such interactions by designing and building two touch-enabled semi-transparent tablets called tPads and a range of tPad applications. Further, a user study shows that tPad interactions applied to everyday mobile tasks (application switching and image capture) outperform current mobile interactions and were preferred by users. Our hands-on design process and experimental evaluation demonstrate that transparent displays provide valuable interaction opportunities for mobile devices.
Advanced Visual Interfaces | 2014
Wolfgang Büschel; Ulrike Kister; Mathias Frisch; Raimund Dachselt
In many cases, Tangible User Interfaces allow the manipulation of digital content with physical objects recognized by an interactive tabletop. Usually, such tangible objects are made of opaque wood or synthetic materials, thereby occluding the display. In this paper, we systematically investigate the promising potential of tangibles entirely made of transparent or translucent materials. Besides visualizing content directly below a manipulable tangible, transparent objects also facilitate direct touch interaction with the content below, dynamic illumination and glowing effects. We propose a comprehensive design space for transparent tangibles on tabletops based on a thorough review of existing work. By reporting on our own experiments and prototypes, we address several gaps in this design space, regarding aspects of both interaction and visualization. These include the illumination of tangibles as well as the precise input with transparent tangibles for which we also present the promising results of an initial user study. Finally, benefits and shortcomings of transparent tangibles are discussed and resulting design considerations are presented.
Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces | 2017
Wolfgang Büschel; Patrick Reipschläger; Ricardo Langner; Raimund Dachselt
Three-dimensional visualizations employing traditional input and output technologies have well-known limitations. Immersive technologies, natural interaction techniques, and recent developments in data physicalization may help to overcome these issues. In this context, we are specifically interested in the usage of spatial interaction with mobile devices for improved 3D visualizations. To contribute to a better understanding of this interaction style, we implemented example visualizations on a spatially-tracked tablet and investigated their usage and potential. In this paper, we report on a qualitative study comparing spatial interaction with in-place 3D visualizations to classic touch interaction regarding typical visualization tasks: navigation of unknown datasets, comparison of individual data objects, and the understanding and memorization of structures in the data. We identify several distinct usage patterns and derive recommendations for using spatial interaction in 3D data visualization.
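An in-place visualization appears fixed in physical space while the tracked tablet moves around it, which amounts to expressing each data point in the device's coordinate frame every frame. A minimal sketch under the assumption of an orthonormal right/up/forward device frame (names and conventions are illustrative, not the paper's implementation):

```python
def world_to_device(point, origin, right, up, forward):
    """Express a world-space data point in the frame of a spatially
    tracked tablet, given the device origin and its orthonormal axes.
    Rendering points in this frame keeps the visualization anchored in
    physical space as the device moves."""
    d = tuple(p - o for p, o in zip(point, origin))
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    return (dot(d, right), dot(d, up), dot(d, forward))
```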
IEEE Transactions on Visualization and Computer Graphics | 2015
Kai Rohmer; Wolfgang Büschel; Raimund Dachselt; Thorsten Grosch
At present, photorealistic augmentation is not yet possible since the computational power of mobile devices is insufficient. Even streaming solutions from stationary PCs cause a latency that affects user interactions considerably. Therefore, we introduce a differential rendering method that allows for a consistent illumination of the inserted virtual objects on mobile devices, avoiding delays. The computation effort is shared between a stationary PC and the mobile devices to make use of the capacities available on both sides. The method is designed such that only a minimum amount of data has to be transferred asynchronously between the participants. This allows for an interactive illumination of virtual objects with a consistent appearance under both temporally and spatially varying real illumination conditions. To describe the complex near-field illumination in an indoor scenario, HDR video cameras are used to capture the illumination from multiple directions. In this way, sources of illumination can be considered that are not directly visible to the mobile device because of occlusions and the limited field of view. While our method focuses on Lambertian materials, we also provide some initial approaches to approximate non-diffuse virtual objects and thereby allow for a wider field of application at nearly the same cost.
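For Lambertian materials, shading a virtual object under the captured near-field illumination reduces to a cosine-weighted sum over discrete light samples, e.g. directions sampled from the HDR cameras. A minimal illustrative sketch (not the authors' formulation; light directions are assumed normalized):

```python
import math

def lambertian_radiance(normal, lights, albedo):
    """Diffuse outgoing radiance as a sum over discrete light sources.
    Each light is (direction_to_light, intensity); the surface normal
    and all directions are assumed to be unit vectors."""
    r = 0.0
    for (lx, ly, lz), intensity in lights:
        n_dot_l = normal[0]*lx + normal[1]*ly + normal[2]*lz
        r += intensity * max(0.0, n_dot_l)  # back-facing lights contribute 0
    return (albedo / math.pi) * r
```

Because the result depends only on the normal and the aggregated incident light, this term is cheap enough to evaluate on the mobile device once the stationary PC has condensed the captured illumination.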
Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces | 2017
Konstantin Klamka; Wolfgang Büschel; Raimund Dachselt
In this paper, we demonstrate IllumiPaper: a system that provides new forms of paper-integrated visual feedback and enables multiple input channels to enhance digital paper applications. We aim to take advantage of traditional form sheets, including their haptic qualities, simplicity, and archivability, and simultaneously integrate rich digital functionalities such as dynamic status queries, real-time notifications, and visual feedback for widget controls. Our approach builds on emerging, novel paper-based technologies. We describe a fabrication process that allows us to directly integrate segment-based displays, touch and flex sensors, as well as digital pen input on the paper itself. With our fully functional research platform, we demonstrate an interactive prototype for an industrial form-filling maintenance application for servicing computer networks that covers a wide range of typical paper-related tasks.
Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces | 2016
Wolfgang Büschel; Patrick Reipschläger; Raimund Dachselt
To date, mobile 3D interaction is often limited to simple multi-touch input on standard devices, which is less expressive and can be hard to use. We present the concept of mobile dual-display devices that can be folded for the exploration of 3D content. We examine different display modes and introduce new presentation and 3D interaction techniques that make use of the special form factor and the added input modality of folding two displays. In particular, we also consider the advantages of our proposed device for head-coupled perspective rendering -- virtually extending the view and providing independent perspectives for two users.