Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Katrin Wolf is active.

Publication


Featured research published by Katrin Wolf.


international conference on human computer interaction | 2011

Taxonomy of microinteractions: defining microgestures based on ergonomic and scenario-dependent requirements

Katrin Wolf; Anja Naumann; Michael Rohs; Jörg Müller

This paper explores how microgestures can allow us to execute a secondary task, for example controlling mobile applications, without interrupting a manual primary task, such as driving a car. To design microgestures iteratively, we interviewed sports therapists and physiotherapists while asking them to use task-related props, such as a steering wheel, a cash card, and a pen, to simulate driving a car, using an ATM, and drawing. The primary objective is to define microgestures that are easily performable without interrupting or interfering with the primary task. From the expert interviews, we developed a taxonomy that classifies these gestures according to their task context. We also assessed the ergonomic and attentional attributes that influence the feasibility and task suitability of microinteractions, and evaluated the level of resources they require. On this basis, we defined 21 microgestures that allow microinteractions to be performed within a manual, dual-task context. Our taxonomy provides a basis for designing microinteraction techniques.
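
To illustrate the kind of classification the taxonomy describes, here is a minimal sketch of how microgestures might be encoded by grip context and ergonomic/attentional cost. The dimensions, ratings, and names below are illustrative assumptions, not the paper's actual coding scheme.

```python
from dataclasses import dataclass
from enum import Enum

class GripContext(Enum):
    """Primary-task grips taken from the paper's three scenarios."""
    STEERING_WHEEL = "steering wheel"   # driving
    CASH_CARD = "cash card"             # ATM use
    PEN = "pen"                         # drawing

@dataclass
class Microgesture:
    name: str               # e.g. "index-finger tap"
    grip: GripContext       # primary task the gesture must not interrupt
    free_fingers: set[str]  # fingers not needed to maintain the grip
    attention_cost: int     # hypothetical rating, 1 (low) to 5 (high)
    ergonomic_strain: int   # hypothetical rating, 1 (low) to 5 (high)

def feasible(g: Microgesture, max_attention: int = 2, max_strain: int = 2) -> bool:
    """A gesture suits a dual-task context if it uses only free fingers
    and stays within the attention and strain budgets."""
    return bool(g.free_fingers) and g.attention_cost <= max_attention \
        and g.ergonomic_strain <= max_strain

# Example: tapping the index finger against the steering-wheel rim.
tap = Microgesture("index tap", GripContext.STEERING_WHEEL,
                   {"index"}, attention_cost=1, ergonomic_strain=1)
print(feasible(tap))  # True
```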


tangible and embedded interaction | 2012

PinchPad: performance of touch-based gestures while grasping devices

Katrin Wolf; Christian Müller-Tomfelde; Kelvin Cheng; Ina Wechsung

This paper focuses on combining front and back device interaction on grasped devices, using touch-based gestures. We designed generic interactions for discrete, continuous, and combined gesture commands that are executed without hand-eye control because the performing fingers are hidden behind the grasped device. We designed the interactions so that the thumb can always be used as a proprioceptive reference for guiding finger movements, applying embodied knowledge about body structure. In a user study, we tested these touch-based interactions for their performance and the users' perceived task load. We combined two iPads back-to-back to form a double-sided touch-screen device: the PinchPad. We discuss the main errors that led to a decrease in accuracy, identify stable features that reduce the error rate, and discuss the role of the body schema in designing gesture-based interactions where the user cannot see their hands properly.
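
A minimal sketch of the thumb-as-reference idea, assuming both touch points have already been mapped into one shared device coordinate space; the threshold and function names are illustrative, not the PinchPad implementation.

```python
def thumb_relative(front_thumb, rear_finger):
    """Offset of the hidden rear finger from the visible front thumb,
    so gestures are defined in a thumb-centred frame rather than in
    absolute screen coordinates."""
    return (rear_finger[0] - front_thumb[0], rear_finger[1] - front_thumb[1])

def is_pinch(offset, threshold=40.0):
    """Discrete command: the rear finger converges on the thumb."""
    dx, dy = offset
    return (dx * dx + dy * dy) ** 0.5 < threshold

def stretch_factor(prev_offset, offset):
    """Continuous command: relative change of the finger-thumb distance,
    usable e.g. for zooming."""
    d0 = (prev_offset[0] ** 2 + prev_offset[1] ** 2) ** 0.5
    d1 = (offset[0] ** 2 + offset[1] ** 2) ** 0.5
    return d1 / d0 if d0 else 1.0
```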


human factors in computing systems | 2012

Does proprioception guide back-of-device pointing as well as vision?

Katrin Wolf; Christian Müller-Tomfelde; Kelvin Cheng; Ina Wechsung

We present research that investigates the amount of guidance users require for precise back-of-device interaction. We explore how pointing effectiveness is influenced by the presence or absence of visual guidance feedback. Participants were asked to select targets displayed on an iPad by touching and releasing them from underneath the device. A second iPad was used to detect finger positions from the rear. Results showed that participants were able to select targets as accurately without visual feedback of finger position as they were with it. Additionally, no significant increase in workload was identified when visual feedback was removed. Our results show that users do not require complex techniques for visualizing finger position on the rear of the device: visual feedback did not affect any performance parameter, such as effectiveness, perceived performance, or the number of trials needed to select a target. We also outline the implications of our findings and future work to fully investigate the effect of visual guidance feedback.
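
A sketch of the select-on-release scheme described above; the screen width, target radius, and the horizontal mirroring of the rear sensor are assumptions for illustration.

```python
SCREEN_W = 1024.0  # hypothetical front-screen width in pixels

def mirror_rear(x, y):
    """A rear-facing sensor sees the hand from behind, so x is flipped
    to land in front-screen coordinates."""
    return SCREEN_W - x, y

def select_on_release(release_xy, targets, radius=30.0):
    """Commit the selection when the finger lifts off the back.
    targets: list of (x, y) centres; returns the selected one, if any."""
    rx, ry = mirror_rear(*release_xy)
    for tx, ty in targets:
        if ((tx - rx) ** 2 + (ty - ry) ** 2) ** 0.5 <= radius:
            return (tx, ty)
    return None
```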


augmented human international conference | 2013

Whole hand modeling using 8 wearable sensors: biomechanics for hand pose prediction

Christopher-Eyk Hrabia; Katrin Wolf; Mathias Wilhelm

Although data gloves allow for modeling of the human hand, they can reduce usability as they cover the entire hand, limiting the sense of touch and reducing hand flexibility. As modeling the whole hand has many advantages (e.g. for complex gesture detection), we aim to model the whole hand while keeping the hand's natural degrees of freedom (DOF) and tactile sensibility as high as possible, still allowing manual tasks like grasping tools and devices. Therefore, we attach motion sensor boards (accelerometer, magnetometer, and gyroscope) to the human hand. We conducted a user study and found a biomechanical dependence between the joint angles of the fingertip-close joint (DIP) and the palm-close joint (PIP), following the relation DIP = 0.88 PIP for all four fingers (SD = 0.10, R² = 0.77). This allows the data glove to be reduced to 8 sensor boards for modeling the whole hand: one per finger, three for the thumb, and one on the back of the hand as an orientation baseline. Even though we found a joint-flexing relationship for the thumb as well, we decided to retain 3 sensor units there, as the relationship varied more (R² = 0.59). Our hand model could potentially serve for rich hand-model-based gestural interaction as it covers all 26 DOF of the human hand.
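
The reported coupling DIP = 0.88 PIP is what lets one sensor per finger suffice: the distal joint angle is predicted rather than measured. A minimal sketch follows; the bone segment lengths and the 2-D flexion-plane simplification are assumptions, not values from the paper.

```python
import math

DIP_PIP_RATIO = 0.88  # from the study: DIP = 0.88 * PIP (SD = 0.10, R² = 0.77)

def fingertip_position(mcp_deg, pip_deg, lengths=(45.0, 25.0, 20.0)):
    """2-D fingertip position in the flexion plane from measured MCP and
    PIP angles; the DIP angle is estimated via the coupling ratio
    instead of an extra sensor on the distal phalanx."""
    dip_deg = DIP_PIP_RATIO * pip_deg
    x = y = angle = 0.0
    for seg_len, joint_deg in zip(lengths, (mcp_deg, pip_deg, dip_deg)):
        angle += math.radians(joint_deg)  # flexion angles accumulate
        x += seg_len * math.cos(angle)
        y += seg_len * math.sin(angle)
    return x, y

print(fingertip_position(mcp_deg=30, pip_deg=45))
```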


tangible and embedded interaction | 2013

Tickle: a surface-independent interaction technique for grasp interfaces

Katrin Wolf; Robert Schleicher; Sven G. Kratz; Michael Rohs

We present a wearable interface that consists of motion sensors. As the interface can be worn on the user's finger (as a ring) or fixed to it (with nail polish), the device controlled by the finger gestures can be any generic object, provided it has an interface for receiving the sensors' signals. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger at the back of a handheld tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). Gesture performance was equally good for all three shapes, although pitch performed better than swipe on all surfaces. The proposed interface is a step towards the idea of ubiquitous computing and the vision of seamless interaction with grasped objects. As an initial application scenario, we implemented a camera control that allows brightness to be configured with the tested gestures on a common SLR device.
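
A minimal sketch of how the four gestures might be told apart from a ring-worn accelerometer stream; the axes, thresholds, and window handling are assumptions, not the Tickle implementation.

```python
import math

TAP_SPIKE = 2.0      # g: short, sharp magnitude spike
SWIPE_LATERAL = 0.8  # g: sustained acceleration along the finger's long axis
PITCH_TILT = 25.0    # degrees of orientation change against gravity

def classify(samples):
    """samples: a short gesture window of (ax, ay, az) tuples in g."""
    mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in samples]
    if max(mags) > TAP_SPIKE and len(samples) < 10:
        return "tap"       # brief impact against the surface
    mean_ax = sum(ax for ax, _, _ in samples) / len(samples)
    if abs(mean_ax) > SWIPE_LATERAL:
        return "swipe"     # sustained lateral movement
    tilt0 = math.degrees(math.atan2(samples[0][1], samples[0][2]))
    tilt1 = math.degrees(math.atan2(samples[-1][1], samples[-1][2]))
    if abs(tilt1 - tilt0) > PITCH_TILT:
        return "pitch"     # finger rotated relative to gravity
    return "release"       # contact ended without further motion
```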


human computer interaction with mobile devices and services | 2010

Foogue: eyes-free interaction for smartphones

Christina Dicke; Katrin Wolf; Yaroslav Tal

Graphical user interfaces for mobile devices have several drawbacks in mobile situations. In this paper, we present Foogue, an eyes-free interface that utilizes spatial audio and gesture input. Foogue does not require visual attention and hence does not divert it from the task at hand. Foogue has two modes, which are designed to fit the usage patterns of mobile users. For user input, we designed a gesture language built from a limited number of simple yet easy-to-differentiate gesture elements.
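
A minimal sketch of the spatial-audio side of such an interface: menu items are spread over distinct azimuths so they can be told apart by ear alone. The frontal-arc layout and the simple stereo panning (in place of full HRTF rendering) are assumptions, not Foogue's actual design.

```python
import math

def place_items(items, arc_deg=180.0):
    """Spread items evenly on a frontal arc; returns (item, azimuth) pairs
    with 0 degrees straight ahead, negative left, positive right."""
    n = len(items)
    if n == 1:
        return [(items[0], 0.0)]
    step = arc_deg / (n - 1)
    return [(item, -arc_deg / 2 + i * step) for i, item in enumerate(items)]

def stereo_gains(azimuth_deg):
    """Constant-power pan: maps [-90, +90] degrees to (left, right) gains."""
    pan = math.radians((azimuth_deg + 90.0) / 2.0)
    return math.cos(pan), math.sin(pan)

for item, az in place_items(["Calls", "Music", "Messages"]):
    print(item, az, stereo_gains(az))
```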


tangible and embedded interaction | 2011

Touching the void: gestures for auditory interfaces

Katrin Wolf; Christina Dicke; Raphael Grasset

Mobile devices provide new possibilities for gesture interaction due to their large range of embedded sensors and their physical form factor. In addition, auditory interfaces can now be supported more easily through advanced mobile computing capabilities. Although different types of gesture techniques have been proposed for handheld devices, there is still little knowledge about the acceptability and use of some of these techniques, especially in the context of an auditory interface. In this paper, we propose a novel approach to the problem by studying the design space of gestures proposed by end-users for a mobile auditory interface. We discuss the results of this exploratory study in terms of the scope of the proposed gestures, their tangible aspects, and the users' preferences. The study delivers initial gesture recommendations for eyes-free auditory interfaces.


human factors in computing systems | 2015

Illusion of Surface Changes Induced by Tactile and Visual Touch Feedback

Katrin Wolf; Timm Bäder

The work presented here aims to enrich material perception when touching interactive surfaces. This is realized by simulating changes in the perception of various material properties, such as softness and bendability. The resulting perceptual illusions of surface changes are induced using electrotactile stimuli and texture projection as touch/pressure feedback. A metal plate with an embedded electrode provided the user with electrotactile stimuli when touching the surface with a finger that was also equipped with an electrode. Distortion of the material textures projected onto the touched surface was used to visually simulate surface deformations. We show through an experiment that both electrotactile and visual feedback can induce the illusion of surface deformation when provided separately. When tactile and visual touch feedback are presented at the same time, the perception of surface changes does not increase compared to using one feedback modality alone.
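
A minimal sketch of the visual component: the projected texture is warped toward the touch point, as if the surface dented under the finger. The Gaussian falloff and its parameters are illustrative assumptions, not the paper's rendering.

```python
import numpy as np

def dent_warp(u, v, touch, pressure, sigma=0.08):
    """u, v: arrays of texture coordinates in [0, 1]; touch: (tu, tv);
    pressure in [0, 1] scales how strongly samples are pulled inward."""
    du, dv = u - touch[0], v - touch[1]
    dist2 = du * du + dv * dv
    pull = pressure * 0.05 * np.exp(-dist2 / (2 * sigma * sigma))
    return u - du * pull, v - dv * pull  # shift samples toward the finger

# A renderer would resample the projected texture at the warped positions
# each frame as the measured touch pressure changes.
u, v = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
warped_u, warped_v = dent_warp(u, v, touch=(0.5, 0.5), pressure=0.8)
```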


human computer interaction with mobile devices and services | 2012

A study of on-device gestures

Katrin Wolf; Marilyn Rose McGee-Lennon; Stephen A. Brewster

Although gestural phone interaction (like pinching on a touch screen to zoom content) is implemented in almost every mobile device, there are still no design guidelines for gestural control. Such guidelines should respect ergonomics and hand anatomy, as there are many human-side aspects to consider when designing gestures. We evaluate gestures with regard to ergonomic aspects while interacting with mobile devices and present ergonomic requirements for finger gestures, such as dragging and lifting fingers from the surface, on the back and side of a phone held both vertically and horizontally. The results suggest that drag and lift gestures have the potential to be executed one-handed while using the phone and that certain device configurations may be accessed seamlessly with this type of gesture control.
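
A sketch of how the two studied gesture types might be separated in a stream of rear/side touch events; the event model and thresholds are assumptions for illustration.

```python
DRAG_MIN_TRAVEL = 20.0  # px travelled while staying on the surface
LIFT_MAX_TRAVEL = 5.0   # px: a lift is a near-stationary contact that ends

def classify_contact(events):
    """events: one finger's contact as (timestamp_s, x, y) samples from
    touch-down to touch-up; returns 'drag', 'lift', or None."""
    xs = [e[1] for e in events]
    ys = [e[2] for e in events]
    travel = ((xs[-1] - xs[0]) ** 2 + (ys[-1] - ys[0]) ** 2) ** 0.5
    if travel >= DRAG_MIN_TRAVEL:
        return "drag"  # finger slid along the device surface
    if travel <= LIFT_MAX_TRAVEL:
        return "lift"  # finger lifted off without sliding
    return None        # ambiguous; ignore
```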


tangible and embedded interaction | 2011

Microinteractions beside ongoing manual tasks

Katrin Wolf

This paper explores how microinteractions in the form of finger gestures allow a secondary task to be executed without interrupting manual primary tasks such as driving a car or using a smart stylus. An analysis of Bock's grip taxonomy helps to identify manual primary tasks that benefit greatly from not being interrupted by secondary tasks that control mobile applications and devices. This vision could make it possible to use a mobile phone safely while holding the steering wheel of a car, or to augment the functionality of a smart stylus, for example changing the stroke width without stopping to write or draw. After discussing tracking technologies used in this research field, such as EMG or depth cameras, we explore the benefits and hardware of our prototype, which uses accelerometers to track finger gestures without disabling any hand skills, such as flexibility or the tactile sense.
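
A minimal sketch of the dual-task idea: microgesture commands are dispatched only while the primary grip is intact, so the secondary task never forces the user to let go of the wheel or stylus. The command table and recognizer labels are hypothetical.

```python
# Hypothetical mapping from (grip, gesture) to secondary-task commands.
COMMANDS = {
    ("steering_wheel", "index_tap"): "answer_call",
    ("stylus", "thumb_swipe"): "increase_stroke_width",
}

def dispatch(grip, gesture, grip_intact):
    """grip/gesture: labels from grip and gesture recognizers;
    grip_intact: True while all grip-critical fingers keep contact."""
    if not grip_intact:
        return None  # never trade the primary task for the secondary one
    return COMMANDS.get((grip, gesture))

print(dispatch("stylus", "thumb_swipe", grip_intact=True))  # increase_stroke_width
```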

Collaboration


Dive into Katrin Wolf's collaboration.

Top Co-Authors

Niels Henze

University of Stuttgart

Markus Funk

Technische Universität Darmstadt

Mathias Wilhelm

Technical University of Berlin

Sven Mayer

University of Stuttgart
