Daniel Spelmezan
RWTH Aachen University
Publications
Featured research published by Daniel Spelmezan.
Human Factors in Computing Systems | 2009
Daniel Spelmezan; Mareike Jacobs; Anke Hilgers; Jan O. Borchers
While learning new motor skills, we often rely on feedback from a trainer. Auditory feedback and demonstrations are used most frequently, but in many domains they are inappropriate or impractical. We introduce tactile instructions as an alternative to assist in correcting wrong posture during physical activities, and present a set of full-body vibrotactile patterns. An initial study informed the design of our tactile patterns and determined appropriate locations for feedback on the body. A second experiment showed that users perceived and correctly classified our tactile instruction patterns in a relaxed setting and during a cognitively and physically demanding task. In a final experiment, snowboarders on the slope compared their perception of tactile instructions with audio instructions under real-world conditions. Tactile instructions achieved overall high recognition accuracy similar to audio instructions. Moreover, participants responded more quickly to instructions delivered over the tactile channel than to instructions presented over the audio channel. Our findings suggest that these full-body tactile feedback patterns can replace audio instructions during physical activities.
Human Factors in Computing Systems | 2008
Daniel Spelmezan; Jan O. Borchers
We present a wireless prototype system for real-time snowboard training. This system can be used to detect common mistakes during snowboarding and to give students immediate feedback on how to correct their mistakes. The project illustrates new ways to assist students during sports training and to enhance their learning experience on the slope.
Human Factors in Computing Systems | 2009
Alexander Hoffmann; Daniel Spelmezan; Jan O. Borchers
TypeRight is a new tactile input device for text entry. It combines the advantages of tactile feedback with error prevention methods of word processors. TypeRight extends the standard keyboard so that the resistance to press each key becomes dynamically adjustable through software. Before each keystroke, the resistance of keys that would lead to a typing error according to dictionary and grammar rules is increased momentarily to make them harder to press, thus avoiding typing errors rather than indicating them after the fact. Two user studies showed that TypeRight decreases error correction rates by an average of 46%.
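The core idea behind TypeRight can be sketched in a few lines: before each keystroke, look up which next letters can still extend the current word into a dictionary entry, and stiffen every other key. The snippet below is an illustrative sketch of that prefix-filtering step, not the authors' implementation; the toy word list and function names are assumptions.

```python
# Illustrative sketch of TypeRight's error-prevention idea (not the
# authors' implementation): given the word typed so far, find which
# next letters can still lead to a dictionary word. All other keys
# would have their press resistance increased before the keystroke.

DICTIONARY = {"type", "typo", "tactile", "touch", "train"}  # toy word list

def valid_next_keys(prefix: str) -> set:
    """Letters that extend `prefix` toward at least one dictionary word."""
    n = len(prefix)
    return {word[n] for word in DICTIONARY
            if word.startswith(prefix) and len(word) > n}

def stiffened_keys(prefix: str, keyboard: str = "abcdefghijklmnopqrstuvwxyz") -> set:
    """Keys whose resistance would be raised before the next keystroke."""
    return set(keyboard) - valid_next_keys(prefix)

# After typing "ty", only 'p' continues a known word ("type", "typo"),
# so every other key would be stiffened.
print(sorted(valid_next_keys("ty")))  # ['p']
```

A real system would use a trie for efficiency and incorporate the grammar rules the paper mentions, but the filtering principle is the same.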
International Conference on Body Area Networks | 2009
Daniel Spelmezan; Adalbert Schanowski; Jan O. Borchers
With this work we want to illustrate new ways to assist students during sports training and to enhance their learning experience. As an example, we present the design of a wearable system intended for snowboard training on the slope. The hardware platform consists of a custom-built sensor/actuator box and a mobile phone acting as host device. These devices run algorithms for activity, context, and mistake recognition, and trigger feedback in response to classification results. Instructors can use such a system to automatically supervise posture and motion of students, i.e., to detect common mistakes that are difficult to recognize when observing students from far away, and to provide immediate audible or tactile feedback for corrections during courses. The presented approach can further be applied to supervise posture and to alert users to potentially harmful body movements performed during daily physical activities.
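The sense-classify-feedback loop described above can be sketched minimally as follows. The sensor names, threshold, mistake label, and feedback cue are invented for illustration; the actual system runs trained activity and mistake classifiers on the phone and sensor box.

```python
# Minimal sketch of the sense -> classify -> feedback loop of a wearable
# sports-training system (the sensor value, threshold, mistake label,
# and cue below are hypothetical, not the paper's).

def classify_mistake(sample):
    """Return a mistake label for one sensor sample, or None if posture is OK."""
    # Hypothetical rule: too much weight over the rear foot.
    if sample["weight_rear"] > 0.7:
        return "weight_too_far_back"
    return None

def feedback_for(mistake):
    """Map a detected mistake to a corrective feedback cue."""
    cues = {"weight_too_far_back": "vibrate_front_thigh"}
    return cues[mistake]

# One pass of the loop over a short stream of samples:
stream = [{"weight_rear": 0.4}, {"weight_rear": 0.8}]
events = [feedback_for(m) for s in stream if (m := classify_mistake(s))]
print(events)  # ['vibrate_front_thigh']
```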
Human-Computer Interaction with Mobile Devices and Services | 2009
Daniel Spelmezan; Anke Hilgers; Jan O. Borchers
Tactile motion instructions are vibrotactile feedback patterns delivered across the entire body that indicate how to move during physical activities. This work investigates the perception and identification of such patterns, based on two different metaphors, under stationary and active situations. We further combine and sequentially trigger different patterns to explore whether tactile motion instructions are understandable as a simple language. A tactile language could represent motion sequences to guide students during demanding exercises. Finally, the presented studies provide insights into perception and interpretation of tactile feedback and help to inform a design space for full-body vibrotactile cues.
Human Factors in Computing Systems | 2017
Daniel Spelmezan; Deepak Ranjan Sahoo; Sriram Subramanian
Many finger sensing input devices now support proximity input, enabling users to perform in-air gestures. While near-surface interactions increase the input vocabulary, they lack tactile feedback, making it hard for users to perform gestures or to know when the interaction takes place. Sparkle stimulates the fingertip with touchable electric arcs above a hover sensing device to give users in-air tactile or thermal feedback that is sharper and more perceptible than the feedback of acoustic mid-air haptic devices. We present the design of a high voltage resonant transformer with a low-loss soft ferrite core and self-tuning driver circuit, with which we create electric arcs 6 mm in length, and combine this technology with infrared proximity sensing in two proof-of-concept devices with form factor and functionality similar to a button and a touchpad. We provide design guidelines for Sparkle devices and examples of stimuli in application scenarios, and report the results of a user study on the perceived sensations. Sparkle is a first step towards providing a new type of hover feedback, and it does not require users to wear tactile stimulators.
User Interface Software and Technology | 2016
Daniel Spelmezan; Deepak Ranjan Sahoo; Sriram Subramanian
We demonstrate a method for stimulating the fingertip with touchable electric arcs above a hover sensing input device. We built a hardware platform using a high-voltage resonant transformer for which we control the electric discharge to create in-air haptic feedback up to 4 mm in height, and combined this technology with infrared proximity sensing. Our method is a first step towards supporting novel in-air haptic experiences for hover input that does not require the user to wear haptic feedback stimulators.
International Symposium on Wearable Computers | 2008
Daniel Spelmezan; Adalbert Schanowski; Jan O. Borchers
We present tools for prototyping and for testing wearable computing applications. The hardware platform consists of a mobile phone and a custom-built box, which can be equipped at runtime with different sensors and actuators. Software libraries for signal processing and classification complement the toolkit. Users without expertise in electronics or in signal processing can quickly create fully functional wearable prototypes that sense human motion and trigger tactile feedback in response to specific postures in real time.
IEEE Haptics Symposium | 2016
Daniel Spelmezan; Rafael Morales Gonzalez; Sriram Subramanian
Recent developments in on-body interfaces have extended the interaction space of physical devices to the skin of our hands. While these interfaces can easily project graphical elements on the bare hand, they cannot give tactile feedback. Here we present a technology that could help to expand the output capability of on-body interfaces to provide tactile feedback without restricting the skin as an interaction surface. SkinHaptics works by focusing ultrasound in the hand using a phased array of ultrasound transmitters and the acoustic time-reversal signal processing technique. We present experimental results that show that this device can steer and focus ultrasound on the skin through the hand, which provides the basis for the envisioned technology. We then present results of a study that show that the focused energy can create sensations that are perceived under the skin and in the hand. We demonstrate the potential of SkinHaptics and discuss how our proof-of-concept device can be scaled beyond the prototype.
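The general principle of focusing a phased array, which underlies SkinHaptics, is to offset each transmitter's phase so that all waves arrive in phase at the focal point. The sketch below shows only the simplified free-field delay-and-sum case; SkinHaptics itself uses acoustic time reversal precisely because tissue is heterogeneous and free-field delays are not accurate through the hand. The array geometry, frequency, and medium here are assumptions.

```python
# Simplified free-field sketch of phased-array focusing (delay-and-sum).
# SkinHaptics uses time-reversal to focus *through* the hand, which
# accounts for heterogeneous tissue; this free-field case only
# illustrates the underlying idea of aligning arrival phases at a focus.
import math

SPEED_OF_SOUND = 343.0   # m/s in air (assumed medium; tissue differs)
FREQ = 40_000.0          # Hz, a common ultrasound transducer frequency

def phase_delays(transmitters, focus):
    """Per-transmitter phase offsets (radians) so all waves arrive in
    phase at `focus`. Positions are (x, y, z) tuples in metres."""
    dists = [math.dist(t, focus) for t in transmitters]
    far = max(dists)
    wavelength = SPEED_OF_SOUND / FREQ
    # Delay closer transmitters so their waves arrive with the farthest one.
    return [(2 * math.pi * (far - d) / wavelength) % (2 * math.pi)
            for d in dists]

# 4-element linear array (1 cm pitch) focusing 10 cm above its centre:
array = [(x * 0.01, 0.0, 0.0) for x in (-1.5, -0.5, 0.5, 1.5)]
delays = phase_delays(array, (0.0, 0.0, 0.10))
```

For a symmetric array and an on-axis focus, the delays come out mirror-symmetric, with the farthest (outer) elements needing no delay at all.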
Human Factors in Computing Systems | 2016
Philipp Wacker; Chatchavan Wacharamanotham; Daniel Spelmezan; Jan Thar; David A Sánchez; René Bohne; Jan O. Borchers
Today, persons with a visual impairment use a cane to explore their surroundings and sense objects in their vicinity. While electronic travel aids have been proposed, they communicate limited information or require a fixed position. We propose VibroVision, a vest that projects information about the area in front of the wearer onto her abdomen in the form of a two-dimensional tactile image rendered by an array of vibration motors. This vest enables the user to sense features such as shape, position, and distance of objects in front of her.
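The depth-to-vibration mapping such a vest needs can be sketched simply: downsample a depth image to the motor grid and drive each motor more strongly the nearer the closest object in its cell. The grid size, sensing range, and nearest-point policy below are assumptions for illustration, not VibroVision's actual parameters.

```python
# Hypothetical sketch of a depth-to-vibration mapping for a vest-mounted
# motor array (grid size, range, and mapping are assumed, not the
# paper's values): each motor takes the nearest depth in its image cell,
# and nearer objects produce stronger vibration.

GRID_ROWS, GRID_COLS = 4, 6        # assumed vibration-motor layout
MAX_RANGE_M = 3.0                  # assumed sensing range in metres

def depth_to_intensities(depth, rows=GRID_ROWS, cols=GRID_COLS):
    """Map a 2D depth image (metres, row-major nested lists) to motor
    intensities in [0, 1]: 1.0 = object at the wearer, 0.0 = nothing
    within range."""
    h, w = len(depth), len(depth[0])
    out = []
    for r in range(rows):
        row = []
        for c in range(cols):
            cell = [depth[y][x]
                    for y in range(r * h // rows, (r + 1) * h // rows)
                    for x in range(c * w // cols, (c + 1) * w // cols)]
            nearest = min(cell)
            row.append(max(0.0, 1.0 - nearest / MAX_RANGE_M))
        out.append(row)
    return out

# A 4x6 depth image with one close object (0.5 m) in the top-left corner:
depth = [[3.0] * 6 for _ in range(4)]
depth[0][0] = 0.5
intensities = depth_to_intensities(depth)
# Only the top-left motor vibrates; the rest stay off.
```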