Roman Vilimek
Siemens
Publications
Featured research published by Roman Vilimek.
International Journal of Human-Computer Interaction | 2010
Thorsten O. Zander; Matti Gaertner; Christian Kothe; Roman Vilimek
A Brain-Computer Interface (BCI) provides a new communication channel for severely disabled people who have completely or partially lost control over muscular activity. It is questionable whether a BCI is the best choice for controlling a device if partial muscular activity is still available. For example, gaze-based interfaces can be utilized by people who are still able to control their eye movements. Such interfaces lack a natural degree of freedom for the selection command (e.g., a mouse click). One workaround for this problem is based on so-called dwell times, which easily lead to errors if the users do not pay close attention to where they are looking. We developed a multimodal interface combining eye movements and a BCI into a hybrid BCI, resulting in a robust and intuitive device for touchless interaction. The system is especially capable of dealing with different stimulus complexities.
International Conference on Universal Access in Human-Computer Interaction | 2009
Roman Vilimek; Thorsten O. Zander
Gaze-based interfaces have gained increasing importance in multimodal human-computer interaction research as tracking technologies have improved over the last few years. In most eye-controlled applications, the activation of selected objects is based on dwell times. This interaction technique can easily lead to errors if the users do not pay very close attention to where they are looking. We developed a multimodal interface that uses eye movements to determine the object of interest and a Brain-Computer Interface to simulate the mouse click. Experimental results show that although a combined BCI/eye-gaze interface is somewhat slower, it reliably leads to fewer errors than standard dwell-time eye-gaze interfaces.
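The dwell-time technique criticized in the two abstracts above can be sketched in a few lines: an object is selected once the gaze rests on it continuously for longer than a threshold. This is a minimal illustrative sketch, not the authors' implementation; the sample format, function name, and one-second threshold are assumptions.

```python
# Minimal sketch of dwell-time selection from a stream of gaze samples.
# Each sample is (timestamp_seconds, object_id); object_id is None when
# the gaze hits no interactive object. All names here are illustrative.
DWELL_THRESHOLD = 1.0  # seconds of continuous fixation needed to select

def dwell_select(gaze_samples, threshold=DWELL_THRESHOLD):
    """Return the ids of objects selected by dwelling on them."""
    selections = []
    current, since = None, None
    for t, obj in gaze_samples:
        if obj != current:
            current, since = obj, t          # gaze moved to a new object
        elif obj is not None and t - since >= threshold:
            selections.append(obj)           # dwelled long enough: select
            current, since = None, None      # reset to avoid re-triggering

    return selections

samples = [(0.0, "A"), (0.4, "A"), (1.1, "A"), (1.2, "B"), (1.5, "B")]
print(dwell_select(samples))  # ['A'] -- "B" was fixated too briefly
```

The sketch also makes the failure mode plain: any fixation longer than the threshold triggers a selection, whether intended or not (the "Midas touch" problem), which is why the hybrid approach above replaces the timeout with an explicit BCI-derived click.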
International Conference on Human-Computer Interaction | 2007
Roman Vilimek; Thomas Hempel; Birgit Otto
This paper identifies several factors that proved crucial to the usability of multimodal in-vehicle applications: a multimodal system is not valuable in itself. Focusing in particular on the typical combination of manual and voice control, this article describes important boundary conditions and discusses the concept of natural interaction.
International Conference on Engineering Psychology and Cognitive Ergonomics | 2007
Roman Vilimek; Alf Zimmer
Multimodal interaction can substantially improve human-computer interaction by employing multiple perceptual channels. We report on the development and evaluation of a touchpad with auditory, tactile and visual feedback for in-vehicle applications. In a simulator study, we assessed its suitability for interacting with a menu-based on-board system and investigated the effects of uni-, bi- and trimodal feedback on task and driving performance, workload and visual distraction, in comparison to a conventional rotary push-button. In summary, our results show that users clearly benefit from additional nonvisual feedback while driving. When using the touchpad with multimodal feedback, our subjects also reached a higher level of performance than with the rotary push-button.
Archive | 2008
Roman Vilimek
The majority of multimodal systems, at the research level as well as at the product level, involve speech input. With speech as an eyes-free, hands-free input modality, these systems enable the user to interact more effectively across a large range of tasks and environments. But the advantages of multimodal user interfaces only emerge if they are designed to support the abilities and characteristics of their human users. It is therefore necessary to integrate research results from the cognitive sciences into the development process. This paper discusses several experimental findings that demonstrate this necessity. User-centered design methods and user testing will further improve the usability of multimodal systems. However, compared to voice-only interfaces, the design, development and usability testing of multimodal systems are far more complicated. A process model shows how the interplay between the development of system components, user-centered evaluation and the integration of knowledge from the cognitive sciences can be organized.
Archive | 2007
Roman Vilimek; Thomas Hempel; Holger Oortmann; Birgit Otto
Archive | 2007
Thomas Hempel; Holger Oortmann; Birgit Otto; Roman Vilimek
Archive | 2007
Thomas Hempel; Roman Vilimek
Archive | 2007
Roman Vilimek; Thomas Hempel; Holger Oortmann; Birgit Otto
Archive | 2006
Thomas Hempel; Holger Oortmann; Birgit Otto; Roman Vilimek