Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where David Dobbelstein is active.

Publication


Featured research published by David Dobbelstein.


human factors in computing systems | 2014

Pervasive information through constant personal projection: the ambient mobile pervasive display (AMP-D)

Christian Winkler; Julian Seifert; David Dobbelstein; Enrico Rukzio

The vision of pervasive ambient information displays which show relevant information has not yet come true. One of the main reasons is the limited number of available displays in the environment, which is a fundamental requirement of the original vision. We introduce the concept of an Ambient Mobile Pervasive Display (AMP-D), a wearable projector system that constantly projects an ambient information display in front of the user. The floor display provides serendipitous access to public and personal information. The display is combined with a projected display on the user's hand, forming a continuous interaction space that is controlled by hand gestures. The paper introduces this novel device concept, discusses its interaction design, and explores its advantages through various implemented application examples. Furthermore, we present the AMP-D prototype, which illustrates the involved challenges concerning hardware, sensing, and visualization.


human factors in computing systems | 2015

Belt: An Unobtrusive Touch Input Device for Head-worn Displays

David Dobbelstein; Philipp Hock; Enrico Rukzio

Belt is a novel unobtrusive input device for wearable displays that incorporates a touch surface encircling the user's hip. The wide input space is leveraged for a horizontal spatial mapping of quickly accessible information and applications. We discuss social implications and interaction capabilities for unobtrusive touch input and present our hardware implementation and a set of applications that benefit from the quick access time. In a qualitative user study with 14 participants, we found that for short interactions (2-4 seconds), most of the surface area is considered appropriate input space, while for longer interactions (up to 10 seconds), the front areas above the trouser pockets are preferred.


user interface software and technology | 2016

FaceTouch: Enabling Touch Interaction in Display Fixed UIs for Mobile Virtual Reality

Jan Gugenheimer; David Dobbelstein; Christian Winkler; Gabriel Haas; Enrico Rukzio

We present FaceTouch, a novel interaction concept for mobile Virtual Reality (VR) head-mounted displays (HMDs) that leverages the backside as a touch-sensitive surface. With FaceTouch, the user can point at and select virtual content inside their field-of-view by touching the corresponding location at the backside of the HMD, utilizing their sense of proprioception. This allows for rich interaction (e.g. gestures) in mobile and nomadic scenarios without having to carry additional accessories (e.g. a gamepad). We built a prototype of FaceTouch and conducted two user studies. In the first study, we measured the precision of FaceTouch in a display-fixed target selection task using three different selection techniques, showing a low error rate of 2% that indicates its viability for everyday usage. To assess the impact of different mounting positions on user performance, we conducted a second study. We compared three mounting positions of the touchpad (face, hand and side), showing that mounting the touchpad at the back of the HMD resulted in a significantly lower error rate, lower selection time and higher usability. Finally, we present interaction techniques and three example applications that explore the FaceTouch design space.
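One practical detail of back-of-device pointing is that the touch surface faces away from the user, so its horizontal axis is mirrored relative to the displayed content. A minimal sketch of this coordinate flip, assuming normalized touch coordinates (the function name and convention are illustrative, not from the paper):

```python
def backside_to_view(x_norm: float, y_norm: float) -> tuple[float, float]:
    """Map a normalized touch point on the HMD's backside touchpad
    to normalized view coordinates.

    The touchpad faces away from the user, so its horizontal axis is
    mirrored relative to the display; the vertical axis is unchanged.
    """
    return 1.0 - x_norm, y_norm
```

Under this assumption, a touch on the left quarter of the backside pad (x = 0.25) selects content on the right side of the user's view (x = 0.75).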


human factors in computing systems | 2015

Glass Unlock: Enhancing Security of Smartphone Unlocking through Leveraging a Private Near-eye Display

Christian Winkler; Jan Gugenheimer; Alexander De Luca; Gabriel Haas; Philipp Speidel; David Dobbelstein; Enrico Rukzio

This paper presents Glass Unlock, a novel concept using smart glasses for smartphone unlocking, which is theoretically secure against smudge attacks, shoulder-surfing, and camera attacks. Because an additional temporary secret, such as the layout of digits, is shown only on the private near-eye display, attackers cannot make sense of the observed input on the almost empty phone screen. We report a user study with three alternative input methods and compare them to current state-of-the-art systems. Our findings show that Glass Unlock only moderately increases authentication times and that users favor the input method yielding the slowest input times, as it avoids focus switches between displays.
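The core mechanism, a per-attempt random digit layout shown only on the near-eye display while the phone shows blank keys, can be sketched as follows (the function names and verification flow are illustrative assumptions, not the authors' implementation):

```python
import random

def make_layout(rng: random.Random) -> list[str]:
    """Generate a fresh random assignment of digits to key positions.

    The glasses render digits[i] at key position i; the phone screen
    shows only blank keys, so an observer sees positions, not digits.
    """
    digits = list("0123456789")
    rng.shuffle(digits)
    return digits

def entered_pin(tapped_positions: list[int], layout: list[str]) -> str:
    """Translate the observed key positions back into the digits the
    user actually entered, using the secret layout."""
    return "".join(layout[p] for p in tapped_positions)
```

Since a new layout is generated for every attempt, replaying the same observed tap positions later would decode to a different PIN, which is what defeats smudge and camera attacks.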


human factors in computing systems | 2014

SurfacePhone: a mobile projection device for single- and multiuser everywhere tabletop interaction

Christian Winkler; Markus Löchtefeld; David Dobbelstein; Antonio Krüger; Enrico Rukzio

To maintain a mobile form factor, the screen real estate of a mobile device is limited. In this paper we present SurfacePhone, a novel configuration of a projector phone which aligns the projector to project onto a physical surface to allow tabletop-like interaction in a mobile setup. The projection is created behind the upright standing phone and is touch- and gesture-enabled. Multiple projections can be merged to create shared spaces for multi-user collaboration. We investigate this new setup, starting with the concept, which we evaluated with a concept prototype. Furthermore, we present our technical prototype, a mobile phone case with an integrated projector that allows for the aforementioned interaction. We discuss its technical requirements and evaluate the accuracy of interaction in a second user study. We conclude with lessons learned and design guidelines.


ubiquitous computing | 2014

From the private into the public: privacy-respecting mobile interaction techniques for sharing data on surfaces

Julian Seifert; David Dobbelstein; Dominik Schmidt; Paul Holleis; Enrico Rukzio

Interactive horizontal surfaces provide large semi-public or public displays for colocated collaboration. In many cases, users want to show, discuss, and copy personal information or media, which are typically stored on their mobile phones, on such a surface. This paper presents three novel direct interaction techniques (Select&Place2Share, Select&Touch2Share, and Shield&Share) that allow users to select in private which information they want to share on the surface. All techniques are based on physical contact between mobile phone and surface. Users touch the surface with their phone or place it on the surface to determine the location for information or media to be shared. We compared these three techniques with the most frequently reported approach, which immediately shows all media files on the table after placing the phone on a shared surface. The results of our user study show that such privacy-preserving techniques are considered crucial in this context and highlight in particular the advantages of Select&Place2Share and Select&Touch2Share in terms of user preferences, task load, and task completion time.


human factors in computing systems | 2016

Unconstrained Pedestrian Navigation based on Vibro-tactile Feedback around the Wristband of a Smartwatch

David Dobbelstein; Philipp Henzler; Enrico Rukzio

We present a bearing-based pedestrian navigation approach that utilizes vibro-tactile feedback around the user's wrist to convey information about the general direction of a target. Unlike traditional navigation, no route is pre-defined, so users can freely explore their surroundings. Our solution can be worn as a wristband for smartwatches or as a standalone device. We describe a mobile prototype with four tactors and show its feasibility in a preliminary navigation study.
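Conveying a target bearing with four tactors reduces to snapping the bearing to the nearest tactor around the band. A minimal sketch, assuming evenly spaced tactors with tactor 0 straight ahead (the function and spacing convention are illustrative, not the paper's implementation):

```python
def bearing_to_tactor(bearing_deg: float, n_tactors: int = 4) -> int:
    """Return the index of the tactor nearest to a target bearing.

    Tactor 0 is assumed to sit at 0 degrees (straight ahead), with the
    remaining tactors evenly spaced clockwise around the wristband.
    """
    sector = 360.0 / n_tactors
    return int(((bearing_deg % 360.0) + sector / 2.0) // sector) % n_tactors
```

With four tactors, each covers a 90-degree sector, so a target at bearing 350 degrees still triggers the forward tactor, while 100 degrees triggers the right-hand one.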


user interface software and technology | 2014

Loupe: a handheld near-eye display

Kent Lyons; Seung Wook Kim; Shigeyuki Seko; David Nguyen; Audrey Desjardins; Mélodie Vidal; David Dobbelstein; Jeremy Rubin

Loupe is a novel interactive device with a near-eye virtual display, similar to head-up display glasses, that retains a handheld form factor. We present our hardware implementation and discuss our user interface that leverages Loupe's unique combination of properties. In particular, we present our input capabilities, spatial metaphor, opportunities for using the round aspect of Loupe, and our use of focal depth. We demonstrate how those capabilities come together in an example application designed to allow quick access to information feeds.


international symposium on wearable computers | 2017

inScent: a wearable olfactory display as an amplification for mobile notifications

David Dobbelstein; Steffen Herrdum; Enrico Rukzio

We introduce inScent, a wearable olfactory display that can be worn in mobile everyday situations and allows the user to receive personal scented notifications, i.e. scentifications. Olfaction, i.e. the sense of smell, is used by humans as a sensorial information channel for experiencing the environment. Olfactory sensations are closely linked to emotions and memories, but also warn of personal dangers such as fire or foulness. We want to utilize the properties of smell as a notification channel by amplifying received mobile notifications with artificially emitted scents. We built a wearable olfactory display that can be worn as a pendant around the neck and contains up to eight different scent aromas that can be inserted and quickly exchanged via small scent cartridges. Upon emission, scent aroma is vaporized and blown towards the user. A hardware and software framework is presented that allows developers to add scents to their mobile applications. In a qualitative user study, participants wore the inScent wearable in public. We used subsequent semi-structured interviews and grounded theory to build a common understanding of the experience and derived lessons learned for the use of scentifications in mobile situations.


human factors in computing systems | 2016

FaceTouch: Touch Interaction for Mobile Virtual Reality

Jan Gugenheimer; David Dobbelstein; Christian Winkler; Gabriel Haas; Enrico Rukzio

We present FaceTouch, a mobile Virtual Reality (VR) head-mounted display (HMD) that leverages the backside as a touch-sensitive surface. FaceTouch allows the user to point at and select virtual content inside their field-of-view by touching the corresponding location at the backside of the HMD, utilizing their sense of proprioception. This allows for rich interaction (e.g. gestures) in mobile and nomadic scenarios without having to carry additional accessories (e.g. a gamepad). We built a prototype of FaceTouch and present interaction techniques and three example applications that leverage the FaceTouch design space.

Collaboration


Dive into David Dobbelstein's collaboration network.
