
Publications

Featured research published by Sven G. Kratz.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2011

I'm home: Defining and evaluating a gesture set for smart-home control

Christine Kühnel; Tilo Westermann; Fabian Hemmert; Sven G. Kratz; Alexander Müller; Sebastian Möller

Mobile phones seem to present the perfect user interface for interacting with smart environments, e.g. smart-home systems, as they are nowadays ubiquitous and equipped with an increasing number of sensors and interface components, such as multi-touch screens. After giving an overview of related work, this paper presents the adapted design methodology proposed by Wobbrock et al. (2009) for the development of a gesture-based user interface to a smart-home system. The findings for the new domain, device, and gesture space are presented and compared to the findings of Wobbrock et al. (2009). Three additional steps are described: a small pre-test survey, a mapping and memory test, and a performance test of the implemented system. This paper shows the adaptability of the approach described by Wobbrock et al. (2009) to three-dimensional gestures in the smart-home domain. Elicited gestures are described and a first implementation of a user interface based on these gestures is presented.
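A central step in Wobbrock et al.'s elicitation methodology is scoring how strongly participants agree on a gesture for each referent. A minimal sketch of that agreement score (the referent name and elicitation data below are hypothetical, not from the study):

```python
from collections import Counter

def agreement(proposals):
    """Agreement score for one referent, after Wobbrock et al. (2009):
    sum over identical-gesture groups of (group size / total proposals) squared."""
    counts = Counter(proposals)
    total = len(proposals)
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical gestures elicited for the referent "turn light on"
proposals = ["swipe_up", "swipe_up", "tap", "swipe_up", "tap", "shake"]
print(round(agreement(proposals), 3))  # (3/6)^2 + (2/6)^2 + (1/6)^2 = 0.389
```

A score of 1.0 means every participant proposed the same gesture; scores near 1/n mean no consensus.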


human computer interaction with mobile devices and services | 2009

HoverFlow: expanding the design space of around-device interaction

Sven G. Kratz; Michael Rohs

In this paper we explore the design space of around-device interaction (ADI). This approach seeks to expand the interaction possibilities of mobile and wearable devices beyond the confines of the physical device itself to include the space around it. This enables rich 3D input, comprising coarse movement-based gestures as well as static position-based gestures. ADI can help to solve occlusion problems and scales down to very small devices. We present a novel around-device interaction interface that allows mobile devices to track coarse hand gestures performed above the device's screen. Our prototype uses infrared proximity sensors to track hand and finger positions in the device's proximity. We present an algorithm for detecting hand gestures and provide a rough overview of the design space of ADI-based interfaces.
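As a rough illustration of how proximity sensors can yield coarse sweep gestures, here is a minimal sketch that orders sensor activations in time; the two-sensor frame format and threshold are assumptions for illustration, not the HoverFlow implementation:

```python
def classify_sweep(samples, threshold=0.5):
    """Classify a coarse hand sweep from time-ordered IR proximity frames.
    Each frame is a (left, right) pair of normalized proximity readings
    (hypothetical data format). The sensor that peaks first gives direction."""
    left_t = next((i for i, (l, r) in enumerate(samples) if l > threshold), None)
    right_t = next((i for i, (l, r) in enumerate(samples) if r > threshold), None)
    if left_t is None or right_t is None:
        return "none"              # hand never came close enough
    if left_t < right_t:
        return "sweep_right"       # hand crossed the left sensor first
    if right_t < left_t:
        return "sweep_left"
    return "hover"                 # both triggered together: static hover

frames = [(0.1, 0.0), (0.7, 0.1), (0.8, 0.6), (0.2, 0.9)]
print(classify_sweep(frames))  # sweep_right
```

Real ADI prototypes would use more sensors and filtering, but the time-ordering idea scales directly.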


human factors in computing systems | 2010

Characteristics of pressure-based input for mobile devices

Craig D. Stewart; Michael Rohs; Sven G. Kratz; Georg Essl

We conducted a series of user studies to understand and clarify the fundamental characteristics of pressure in user interfaces for mobile devices. We seek to provide insight to clarify a longstanding discussion on mapping functions for pressure input. Previous literature is conflicted about the correct transfer function to optimize user performance. Our study results suggest that the discrepancy can be explained by different signal conditioning circuitry, and that with improved signal conditioning the relationship between pressure and user performance is linear. We also explore the effects of hand pose when applying pressure to a mobile device from the front, the back, or simultaneously from both sides in a pinching movement. Our results indicate that grasping-type input outperforms single-sided input and is competitive with pressure input against solid surfaces. Finally, we provide an initial exploration of non-visual multimodal feedback, motivated by the desire for eyes-free use of mobile devices. The findings suggest that non-visual pressure input can be executed without degradation in selection time but suffers from accuracy problems.
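The linear transfer function that the study points to can be sketched as a direct mapping from a conditioned sensor reading to discrete selection levels; the raw sensor range and level count below are hypothetical, not values from the paper:

```python
def pressure_to_level(raw, raw_min=200, raw_max=3800, levels=10):
    """Map a conditioned pressure reading to one of `levels` discrete targets
    using a plain linear transfer function (sensor range is hypothetical)."""
    clamped = min(max(raw, raw_min), raw_max)
    norm = (clamped - raw_min) / (raw_max - raw_min)   # 0.0 .. 1.0
    return min(int(norm * levels), levels - 1)

print(pressure_to_level(200))   # 0 (lightest press)
print(pressure_to_level(2000))  # 5 (midpoint of the range)
print(pressure_to_level(3800))  # 9 (hardest press)
```

The point of the study's finding is that no logarithmic or power-law shaping is needed once the signal is well conditioned: a straight line like `norm * levels` suffices.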


human factors in computing systems | 2012

ShoeSense: a new perspective on gestural interaction and wearable applications

Gilles Bailly; Jörg Müller; Michael Rohs; Daniel Wigdor; Sven G. Kratz

When the user is engaged with a real-world task it can be inappropriate or difficult to use a smartphone. To address this concern, we developed ShoeSense, a wearable system consisting in part of a shoe-mounted depth sensor pointing upward at the wearer. ShoeSense recognizes relaxed and discreet as well as large and demonstrative hand gestures. In particular, we designed three gesture sets (Triangle, Radial, and Finger-Count) for this setup, which can be performed without visual attention. The advantages of ShoeSense are illustrated in five scenarios: (1) quickly performing frequent operations without reaching for the phone, (2) discreetly performing operations without disturbing others, (3) enhancing operations on mobile devices, (4) supporting accessibility, and (5) artistic performances. We present a proof-of-concept wearable implementation based on a depth camera and report on a lab study comparing social acceptability, physical and mental demand, and user preference. A second study demonstrates a 94-99% recognition rate for our recognizers.


human computer interaction with mobile devices and services | 2009

Hoverflow: exploring around-device interaction with IR distance sensors

Sven G. Kratz; Michael Rohs

By equipping a mobile device with distance sensing capabilities, we aim to expand the interaction possibilities of mobile and wearable devices beyond the confines of the physical device itself to include the space immediately around it. Our prototype, an Apple iPhone equipped with six IR distance sensors, allows for rich 3D input, comprising coarse movement-based hand gestures, as well as static position-based gestures. A demonstration application, HoverFlow, illustrates the use of coarse hand gestures for interaction with mobile applications. This type of interaction, which we call Around-Device Interaction (ADI) has the potential to help to solve occlusion problems on small-screen mobile devices and scales well to small device sizes.


intelligent user interfaces | 2010

A $3 gesture recognizer: simple gesture recognition for devices equipped with 3D acceleration sensors

Sven G. Kratz; Michael Rohs

We present the $3 Gesture Recognizer, a simple but robust gesture recognition system for input devices featuring 3D acceleration sensors. The algorithm is designed to be implemented quickly in prototyping environments, is intended to be device-independent and does not require any special toolkits or frameworks. It relies solely on simple trigonometric and geometric calculations. A user evaluation of our system resulted in a correct gesture recognition rate of 80% when using a set of 10 unique gestures for classification. Our method requires significantly less training data than other gesture recognizers and is thus suited to be deployed and to deliver results rapidly.
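A minimal, illustrative trace matcher in the $-recognizer spirit: resample a 3D trace to a fixed number of equidistant points and classify by nearest template. The resampling count and the plain point-distance score are assumptions for illustration, not the paper's exact scoring:

```python
import math

def resample(trace, n=32):
    """Linearly resample a 3D trace (list of (x, y, z) tuples) to n points
    spaced equally along the path, as in the $-family recognizers."""
    dists = [math.dist(a, b) for a, b in zip(trace, trace[1:])]
    total = sum(dists) or 1.0
    step, out, acc = total / (n - 1), [trace[0]], 0.0
    pts, i = list(trace), 0
    while len(out) < n and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if acc + d >= step and d > 0:
            t = (step - acc) / d
            q = tuple(p + t * (r - p) for p, r in zip(pts[i], pts[i + 1]))
            out.append(q)
            pts[i] = q          # continue walking from the interpolated point
            acc = 0.0
        else:
            acc += d
            i += 1
    while len(out) < n:         # pad against floating-point shortfall
        out.append(pts[-1])
    return out

def score(a, b):
    """Mean point-to-point distance between two equally resampled traces."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(trace, templates):
    """Nearest-neighbor classification against a dict of named templates."""
    cand = resample(trace)
    return min(templates, key=lambda name: score(cand, resample(templates[name])))
```

With only one template per class this already classifies clean traces, which is the sense in which such recognizers need very little training data.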


intelligent user interfaces | 2011

Protractor3D: a closed-form solution to rotation-invariant 3D gestures

Sven G. Kratz; Michael Rohs

Protractor3D is a gesture recognizer that extends the 2D touch-screen gesture recognizer Protractor to 3D gestures. It inherits many of Protractor's desirable properties, such as a high recognition rate, low computational and memory requirements, ease of implementation, ease of customization, and a low number of required training samples. Protractor3D is based on a closed-form solution, involving quaternions, to finding the optimal rotation angle between two gesture traces. It uses a nearest-neighbor approach to classify input gestures. It is thus well suited for application in resource-constrained mobile devices. We present the design of the algorithm and a study that evaluated its performance.
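The closed-form core, finding the optimal rotation between two point traces via quaternions, is classically solved with Horn's absolute-orientation method. A minimal NumPy sketch of that style of solution (an illustration of the technique, not Protractor3D's actual code):

```python
import numpy as np

def optimal_rotation(a, b):
    """Closed-form optimal rotation aligning point set `a` to `b`
    (Horn's quaternion method). a, b: (n, 3) arrays of corresponding
    resampled gesture points."""
    M = a.T @ b                      # 3x3 cross-covariance matrix
    tr = np.trace(M)
    delta = np.array([M[1, 2] - M[2, 1], M[2, 0] - M[0, 2], M[0, 1] - M[1, 0]])
    # 4x4 symmetric matrix whose top eigenvector is the optimal unit quaternion
    N = np.empty((4, 4))
    N[0, 0] = tr
    N[0, 1:] = N[1:, 0] = delta
    N[1:, 1:] = M + M.T - tr * np.eye(3)
    w, v = np.linalg.eigh(N)         # eigenvalues in ascending order
    return v[:, -1]                  # quaternion (w, x, y, z)

def quat_to_matrix(q):
    """Convert a unit quaternion to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
```

A nearest-neighbor classifier would apply this rotation to the candidate trace before scoring its distance to each template, which is what makes the recognition rotation-invariant.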


human factors in computing systems | 2011

CapWidgets: tangible widgets versus multi-touch controls on mobile devices

Sven G. Kratz; Tilo Westermann; Michael Rohs; Georg Essl

We present CapWidgets, passive tangible controls for capacitive touch screens. CapWidgets bring physical controls back to off-the-shelf multi-touch surfaces as found in mobile phones and tablet computers. While the user touches the widget, the surface detects the capacitive marker on the widget's underside. We study the relative performance of this tangible interaction against direct multi-touch interaction; our experimental results show that user performance and preferences are not automatically in favor of tangible widgets, and careful design is necessary to validate their properties.


advanced visual interfaces | 2012

PalmSpace: continuous around-device gestures vs. multitouch for 3D rotation tasks on mobile devices

Sven G. Kratz; Michael Rohs; Dennis Guse; Jörg Müller; Gilles Bailly; Michael Nischt

Rotating 3D objects is a difficult task on mobile devices, because the task requires 3 degrees of freedom and (multi-)touch input only allows for an indirect mapping. We propose a novel style of mobile interaction based on mid-air gestures in proximity of the device to increase the number of DOFs and alleviate the limitations of touch interaction with mobile devices. While one hand holds the device, the other hand performs mid-air gestures in proximity of the device to control 3D objects on the mobile device's screen. A flat hand pose defines a virtual surface, which we refer to as the PalmSpace, for precise and intuitive 3D rotations. We constructed several hardware prototypes to test our interface and to simulate possible future mobile devices equipped with depth cameras. We conducted a user study to compare 3D rotation tasks using the two most promising designs for the hand location during interaction, behind and beside the device, with the virtual trackball, which is the current state-of-the-art technique for orientation manipulation on touch screens. Our results show that both variants of PalmSpace have significantly lower task completion times in comparison to the virtual trackball.
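The core mapping, turning a tracked flat-hand pose into an object rotation, can be sketched by aligning an initial palm-plane normal with the current one. The normals below are hypothetical inputs (in practice they would come from a plane fit to depth-camera data), and this is an illustration of the idea, not the paper's implementation:

```python
import numpy as np

def palm_rotation(n0, n1):
    """Axis-angle rotation taking the initial palm normal n0 to the
    current normal n1. Returns (unit axis, angle in radians)."""
    n0 = n0 / np.linalg.norm(n0)
    n1 = n1 / np.linalg.norm(n1)
    axis = np.cross(n0, n1)
    s = np.linalg.norm(axis)
    if s < 1e-9:                     # parallel normals: treat as no rotation
        return np.zeros(3), 0.0
    angle = np.arctan2(s, np.dot(n0, n1))
    return axis / s, angle

# Palm tilts from facing the camera (z) to facing sideways (x):
axis, angle = palm_rotation(np.array([0., 0., 1.]), np.array([1., 0., 0.]))
print(axis, np.degrees(angle))  # 90 degrees about the y-axis
```

Applying this rotation continuously to the on-screen object gives the direct, absolute mapping that distinguishes PalmSpace-style input from the incremental virtual trackball.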

Collaboration

Dive into Sven G. Kratz's collaboration.

Top Co-Authors

Don Kimber (FX Palo Alto Laboratory)
Georg Essl (University of Michigan)
Jim Vaughan (FX Palo Alto Laboratory)
Don Severns (University of California)
Gerry Filby (FX Palo Alto Laboratory)
Patrick Chiu (FX Palo Alto Laboratory)