Roman Lissermann
Technische Universität Darmstadt
Publications
Featured research published by Roman Lissermann.
Tangible and Embedded Interaction | 2012
Mohammadreza Khalilbeigi; Roman Lissermann; Wolfgang Kleine; Jürgen Steimle
In this paper, we present a novel device concept that features double-sided displays which can be folded using predefined hinges. The device concept enables users to dynamically alter both size and shape of the display and also to access the backside using fold gestures. We explore the design of such devices by investigating different types and forms of folding. Furthermore, we propose a set of interaction principles and techniques. Following a user-centered design process, we evaluate our device concept in two sessions with low-fidelity and high-fidelity prototypes.
User Interface Software and Technology | 2015
Martin Schmitz; Mohammadreza Khalilbeigi; Matthias Balwierz; Roman Lissermann; Max Mühlhäuser; Jürgen Steimle
3D printing is widely used to physically prototype the look and feel of 3D objects. Interaction possibilities of these prototypes, however, are often limited to mechanical parts or post-assembled electronics. In this paper, we present Capricate, a fabrication pipeline that enables users to easily design and 3D print highly customized objects that feature embedded capacitive multi-touch sensing. The object is printed in a single pass using a commodity multi-material 3D printer. To enable touch input on a wide variety of 3D printable surfaces, we contribute two techniques for designing and printing embedded sensors of custom shape. The fabrication pipeline is technically validated by a series of experiments and practically validated by a set of example applications. They demonstrate the wide applicability of Capricate for interactive objects.
Human Factors in Computing Systems | 2011
Mohammadreza Khalilbeigi; Roman Lissermann; Max Mühlhäuser; Jürgen Steimle
We present a device concept and a prototype of a future mobile device. Featuring a rollable display, its display size and form factor can be changed dynamically. Moreover, we investigate how physical resizing of the display can be used as an input technique for interacting with digital content and present a set of novel interaction techniques. Evaluation results show that physical resizing of the display can improve the way we interact with digital content on mobile devices.
Human Factors in Computing Systems | 2014
Roman Lissermann; Jochen Huber; Martin Schmitz; Jürgen Steimle; Max Mühlhäuser
We contribute Permulin, an integrated set of interaction and visualization techniques for multi-view tabletops to support co-located collaboration across a wide variety of collaborative coupling styles. These techniques (1) provide support both for group work and for individual work, as well as for the transitions in-between, (2) contribute sharing and peeking techniques to support mutual awareness and group coordination during phases of individual work, (3) reduce interference during group work on a group view, and (4) directly integrate with conventional multi-touch input. We illustrate our techniques in a proof-of-concept implementation with the two example applications of map navigation and photo collages. Results from two user studies demonstrate that Permulin supports fluent transitions between individual and group work and exhibits unique awareness properties that allow participants to be highly aware of each other during tightly coupled collaboration, while being able to unobtrusively perform individual work during loosely coupled collaboration.
Human Factors in Computing Systems | 2013
Roman Lissermann; Jochen Huber; Aristotelis Hadjakos; Max Mühlhäuser
In this work-in-progress paper, we make a case for leveraging the unique affordances of the human ear for eyes-free, mobile interaction. We present EarPut, a novel interface concept, which instruments the ear as an interactive surface for touch-based interactions and its prototypical hardware implementation. The central idea behind EarPut is to go beyond prior work by unobtrusively augmenting a variety of accessories that are worn behind the ear, such as headsets or glasses. Results from a controlled experiment with 27 participants provide empirical evidence that people are able to target salient regions on their ear effectively and precisely. Moreover, we contribute a first, systematically derived interaction design space for ear-based interaction and a set of exemplary applications.
Australasian Computer-Human Interaction Conference | 2014
Roman Lissermann; Jochen Huber; Aristotelis Hadjakos; Suranga Nanayakkara; Max Mühlhäuser
One of the pervasive challenges in mobile interaction is decreasing the visual demand of interfaces towards eyes-free interaction. In this paper, we focus on the unique affordances of the human ear to support one-handed and eyes-free mobile interaction. We present EarPut, a novel interface concept and hardware prototype, which unobtrusively augments a variety of accessories that are worn behind the ear (e.g. headsets or glasses) to instrument the human ear as an interactive surface. The contribution of this paper is three-fold. We contribute (i) results from a controlled experiment with 27 participants, providing empirical evidence that people are able to target salient regions on their ear effectively and precisely, (ii) a first, systematically derived design space for ear-based interaction, and (iii) a set of proof-of-concept EarPut applications that leverage the design space and embrace mobile media navigation, mobile gaming, and smart home interaction.
Human Factors in Computing Systems | 2014
Karolina Buchner; Roman Lissermann; Lars Erik Holmquist
We propose a number of interaction techniques allowing TV viewers to use their mobile phones to view and share content with others in the room, thus supporting local social interaction. Based on a preliminary evaluation, we provide guidelines for designing interactions to support co-located collaborative TV viewing.
International Conference on Advanced Learning Technologies | 2010
Jochen Huber; Jürgen Steimle; Simon Olberding; Roman Lissermann; Max Mühlhäuser
Increasingly powerful mobile devices like the Apple iPhone empower learners to watch e-lectures not only at home but also in mobile learning scenarios, virtually anywhere and anytime. However, state-of-the-art mobile video browsers do not support learners in getting an overview of, and navigating between, the large number of semantically related e-lectures available in various digital libraries. We contribute a novel user interface for the mobile use of e-lectures. Leveraging a spatial navigation metaphor, it supports both linear and nonlinear interaction within a single lecture, as well as efficient navigation within large e-lecture libraries. Evaluation results show that our e-lecture browser significantly improves the learning process and leads to significantly higher efficiency and user satisfaction.
Human Factors in Computing Systems | 2012
Roman Lissermann; Simon Olberding; Max Mühlhäuser; Jürgen Steimle
Analog paper is still often preferred over electronic documents due to its specific affordances and rich spatial interaction, in particular when multiple pages are laid out and handled simultaneously. We investigated how interaction with video can benefit from paper-like displays that support motion and sound. We present a system that includes novel interaction concepts for both video and audio: spatial techniques for temporal navigation, arranging and grouping videos, virtualizing and materializing content, and focusing on multiple parallel audio sources.
Informatik Spektrum | 2014
Max Mühlhäuser; Mohammadreza Khalilbeigi; Jan Riemann; Sebastian Döweling; Roman Lissermann
Abstract: Tabletops have been around for a long time and have already been the subject of extensive research. Nevertheless, they are still not part of everyday life because, in our view, they do not integrate well into the physical world. We present four steps towards better integration, illustrated by corresponding projects.