Clemens Holzmann
Johannes Kepler University of Linz
Publications
Featured research published by Clemens Holzmann.
International Conference on Networked Sensing Systems | 2010
Daniel Roggen; Alberto Calatroni; Mirco Rossi; Thomas Holleczek; Kilian Förster; Gerhard Tröster; Paul Lukowicz; David Bannach; Gerald Pirkl; Alois Ferscha; Jakob Doppler; Clemens Holzmann; Marc Kurz; Gerald Holl; Ricardo Chavarriaga; Hesam Sagha; Hamidreza Bayati; Marco Creatura; José del R. Millán
We deployed 72 sensors of 10 modalities in 15 wireless and wired networked sensor systems in the environment, in objects, and on the body to create a sensor-rich environment for the machine recognition of human activities. We acquired data from 12 subjects performing morning activities, yielding over 25 hours of sensor data. We report the number of activity occurrences observed during post-processing, and estimate that over 13000 object interactions and 14000 environment interactions occurred. We describe the networked sensor setup and the methodology for data acquisition, synchronization and curation. We report on the challenges and outline lessons learned and best practice for similar large-scale deployments of heterogeneous networked sensor systems. We evaluate data acquisition quality for on-body and object-integrated wireless sensors; there is less than 2.5% packet loss after tuning. We outline our use of the dataset to develop new sensor network self-organization principles and machine learning techniques for activity recognition in opportunistic sensor configurations. The dataset will eventually be made public.
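As an aside on the reported acquisition quality, a figure such as the 2.5% packet loss can be estimated from per-node packet sequence numbers. The sketch below assumes a wrapping 16-bit counter per sensor node; this is an illustrative choice, not necessarily the deployment's actual scheme.

```java
import java.util.List;

// Estimates packet loss for one sensor node from received sequence numbers.
// Assumes a monotonically increasing 16-bit counter per node (illustrative).
public final class PacketLossEstimator {

    public static double lossRatio(List<Integer> receivedSeqNums) {
        if (receivedSeqNums.size() < 2) {
            return 0.0;
        }
        long received = receivedSeqNums.size();
        long expected = 1; // count the first packet itself
        for (int i = 1; i < receivedSeqNums.size(); i++) {
            int gap = receivedSeqNums.get(i) - receivedSeqNums.get(i - 1);
            if (gap < 0) {
                gap += 1 << 16; // counter wrapped around
            }
            expected += gap;
        }
        return 1.0 - (double) received / (double) expected;
    }
}
```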
World of Wireless, Mobile and Multimedia Networks | 2009
Daniel Roggen; Kilian Förster; Alberto Calatroni; Thomas Holleczek; Yu Fang; Gerhard Tröster; Alois Ferscha; Clemens Holzmann; Andreas Riener; Paul Lukowicz; Gerald Pirkl; David Bannach; Kai S. Kunze; Ricardo Chavarriaga; José del R. Millán
Opportunistic sensing allows information about the physical world and the people acting in it to be collected efficiently. This may mainstream human context and activity recognition in wearable and pervasive computing by removing the requirement for a specific deployed infrastructure. In this paper, we introduce the newly started European research project OPPORTUNITY, within which we develop mobile opportunistic activity and context recognition systems. We outline the project's objectives, the approach we follow along the lines of opportunistic sensing, data processing and interpretation, and autonomous adaptation and evolution to environmental and user changes, and we report preliminary results.
Mobility Management and Wireless Access | 2004
Alois Ferscha; Clemens Holzmann; Stefan Oppl
In this paper, we present an implemented system for supporting group interaction in mobile distributed computing environments. First, an introduction to context computing and a motivation for using contextual information to facilitate group interaction are given. We then present the architecture of our system, which consists of two parts: a subsystem for location sensing that acquires information about the location of users as well as spatial proximities between them, and a context-aware application that provides services for group interaction.
Advances in Mobile Multimedia | 2012
Florian Lettner; Clemens Holzmann
The evaluation of mobile user interfaces can be a tedious task, especially if usability tests have to be performed under real-world conditions. In particular, the evaluation of high-fidelity prototypes provides valuable measures of the quality of mobile applications, which helps designers identify potential for improvement in the next revision. Due to their cost or a lack of expert knowledge, evaluation techniques such as cognitive walkthroughs or heuristic evaluation are often not used. Additionally, commercial frameworks provide insufficient detail on usability, as they only address commercial statistics regarding user loyalty, in-app purchases or demographics. In this paper, we present a novel approach and toolkit for the automated and unsupervised evaluation of mobile applications that, in contrast to existing frameworks, is able to trace any user interaction during the entire lifecycle of an application. As a major novelty, our toolkit can be added to mobile applications without changing the application source code, which makes it flexible and scalable for all types of applications. Our toolkit is also able to identify and visualize design flaws, such as navigational errors or inefficiencies, in mobile applications.
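One known Android technique for observing interactions without changing an application's source code is to wrap the Activity's Window.Callback in a dynamic proxy. The sketch below illustrates that general idea only; it is an assumption about how such tracing could be done, not the toolkit's actual implementation, and the Log.d call merely stands in for a real recording back end.

```java
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;
import android.view.Window;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

// Minimal sketch of instrumentation-free interaction tracing on Android:
// the Activity's Window.Callback is wrapped so touch events can be observed
// without modifying the application's own classes.
public final class TouchTracer {

    public static void attach(final Activity activity) {
        final Window window = activity.getWindow();
        final Window.Callback original = window.getCallback();

        Window.Callback wrapper = (Window.Callback) Proxy.newProxyInstance(
                Window.Callback.class.getClassLoader(),
                new Class<?>[] {Window.Callback.class},
                new InvocationHandler() {
                    @Override
                    public Object invoke(Object proxy, Method method, Object[] args)
                            throws Throwable {
                        if ("dispatchTouchEvent".equals(method.getName())) {
                            MotionEvent event = (MotionEvent) args[0];
                            // Stand-in for the toolkit's real recording back end.
                            Log.d("TouchTracer", activity.getLocalClassName() + " "
                                    + event.getActionMasked() + " "
                                    + event.getX() + "," + event.getY());
                        }
                        return method.invoke(original, args); // forward unchanged
                    }
                });

        window.setCallback(wrapper);
    }
}
```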
Ambient Intelligence | 2010
Andreas Sippl; Clemens Holzmann; Doris Zachhuber; Alois Ferscha
In this paper, we explore the real-time tracking of human gazes in front of large public displays. The aim of our work is to estimate the area of a display at which one or more people are looking at a given time, independently of their distance and angle to the display as well as the height of the tracked people. Gaze tracking is relevant for a variety of purposes, including the automatic recognition of the user's focus of attention, or the control of interactive applications with gaze gestures. The scope of the present paper is on the former, and we show how gaze tracking can be used for implicit interaction in the pervasive advertising domain. We have developed a prototype for this purpose, which (i) uses an overhead-mounted camera to distinguish four gaze areas on a large display, (ii) works for a wide range of positions in front of the display, and (iii) provides an estimate of the currently gazed-at quarter in real time. A detailed description of the prototype as well as the results of a user study with 12 participants, which show the recognition accuracy for different positions in front of the display, are presented.
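For illustration, deciding which of four display quarters is being gazed at can be reduced to intersecting an estimated gaze ray with the display plane. The geometry below, including its coordinate conventions and parameters, is an assumption for the sake of the example and not the prototype's image-processing pipeline.

```java
// Illustrative geometry only: classify which quarter of a wall display a person
// is looking at, given a head position in front of the display and estimated
// gaze angles.
public final class GazeQuarter {

    public enum Quarter { UPPER_LEFT, UPPER_RIGHT, LOWER_LEFT, LOWER_RIGHT, OFF_SCREEN }

    /**
     * @param headX        horizontal head position in metres, 0 = left display edge
     * @param distance     distance from the display plane in metres (> 0)
     * @param yawRad       horizontal gaze angle, 0 = facing the display, positive = to the right
     * @param pitchRad     vertical gaze angle, positive = looking upwards
     * @param displayWidth display width in metres
     * @param eyeHeight    eye height above the floor in metres
     * @param displayMidY  height of the display's horizontal centre line in metres
     */
    public static Quarter classify(double headX, double distance, double yawRad,
                                   double pitchRad, double displayWidth,
                                   double eyeHeight, double displayMidY) {
        double hitX = headX + distance * Math.tan(yawRad);        // horizontal intersection
        double hitY = eyeHeight + distance * Math.tan(pitchRad);  // vertical intersection
        if (hitX < 0 || hitX > displayWidth) {
            return Quarter.OFF_SCREEN;
        }
        boolean left = hitX < displayWidth / 2.0;
        boolean upper = hitY >= displayMidY;
        if (upper) {
            return left ? Quarter.UPPER_LEFT : Quarter.UPPER_RIGHT;
        }
        return left ? Quarter.LOWER_LEFT : Quarter.LOWER_RIGHT;
    }
}
```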
International Conference on Distributed Computing Systems Workshops | 2012
Clemens Holzmann; Matthias Hochgatterer
Computer stereo vision is an important technique for robotic navigation and other mobile scenarios where depth perception is needed, but it usually requires two cameras with a known horizontal displacement. In this paper, we present a solution for mobile devices with just one camera, which is a first step towards making computer stereo vision available to a wide range of devices that are not equipped with stereo cameras. We have built a prototype using a state-of-the-art mobile phone, which has to be manually displaced in order to record images from different lines of sight. Since the displacement between the two images is not known in advance, it is measured using the phone's inertial sensors. We evaluated the accuracy of our single-camera approach by performing distance calculations to everyday objects in different indoor and outdoor scenarios, and compared the results with those of a stereo camera phone. Since a main advantage of a single moving camera is the possibility of varying its relative position between the two pictures, we also investigated the effect of different camera displacements on the accuracy of distance measurements.
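The underlying depth calculation follows the standard pinhole stereo relation Z = f * B / d, with the baseline B taken from the measured phone displacement rather than a fixed rig. The sketch below is a simplified illustration of that relation, not the prototype's full processing chain, and the example numbers are hypothetical.

```java
// Back-of-the-envelope stereo depth for a single displaced camera: the baseline B
// comes from the measured phone displacement instead of a fixed stereo rig.
// Z = f * B / d, with f in pixels, B in metres and the disparity d in pixels.
public final class SingleCameraStereo {

    public static double depthMetres(double focalLengthPx, double baselineMetres,
                                     double disparityPx) {
        if (disparityPx <= 0) {
            throw new IllegalArgumentException("disparity must be positive");
        }
        return focalLengthPx * baselineMetres / disparityPx;
    }

    public static void main(String[] args) {
        // Hypothetical example: f = 2500 px, a 10 cm manual displacement and a
        // 50 px disparity give 2500 * 0.10 / 50 = 5.0 m to the object.
        System.out.println(depthMetres(2500, 0.10, 50));
    }
}
```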
Mobile and Ubiquitous Multimedia | 2012
Florian Lettner; Clemens Holzmann
Heat maps are an important usability tool for visualising eye gaze data, mouse movement or click interaction on web pages. For mobile applications, however, use cases for heat maps are still limited. For example, eye tracking is difficult to realise due to limitations imposed by mobility or hardware requirements. Moreover, heat maps for understanding multi-touch interaction have not yet been used as a usability tool for mobile applications. In this paper, we present a concept for generating heat maps for finger-based multi-touch interaction. Our goal is to provide a usability tool for mobile applications that can be used in studies with a large number of test participants in real-world environments. As a benefit, developers and designers will be able to comprehend user interaction within mobile applications, which aids in identifying commonly used touch gestures, focus areas and points of interest without the need for additional and bulky hardware.
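A touch heat map of this kind can be built by binning touch coordinates into a coarse grid per screen and rendering the normalised counts as colour intensities. The grid resolution and normalisation in the sketch below are illustrative choices, not the paper's parameters.

```java
// Minimal sketch of a finger-touch heat map: touch coordinates are binned into a
// coarse grid, and the counts can later be rendered as colour intensities.
public final class TouchHeatMap {

    private final int[][] bins;
    private final int screenWidth;
    private final int screenHeight;

    public TouchHeatMap(int screenWidth, int screenHeight, int cols, int rows) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
        this.bins = new int[rows][cols];
    }

    public void addTouch(float x, float y) {
        int col = Math.min((int) (x / screenWidth * bins[0].length), bins[0].length - 1);
        int row = Math.min((int) (y / screenHeight * bins.length), bins.length - 1);
        if (col >= 0 && row >= 0) {
            bins[row][col]++;
        }
    }

    // Normalised intensity in [0, 1] for rendering a colour ramp.
    public double intensity(int row, int col) {
        int max = 0;
        for (int[] r : bins) {
            for (int v : r) {
                max = Math.max(max, v);
            }
        }
        return max == 0 ? 0.0 : (double) bins[row][col] / max;
    }
}
```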
Computer Aided Systems Theory | 2011
Rene Mayrhofer; Clemens Holzmann; Romana Koprivec
In this paper, we propose a new approach to live location sharing among a group of friends that does not rely on a central server as a hub for exchanging location messages, but works in a peer-to-peer manner by pushing selective location updates via XMPP messages. That is, only those contacts specifically authorized by the user are notified of location changes via push messages, and no third-party service is involved in processing these locations. We describe an initial implementation called Friends Radar, and discuss the associated trade-offs between privacy and resource consumption.
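The push-based idea can be sketched as serialising a location update into a small payload and sending it once per authorised contact. The XmppSender interface, element names and namespace below are placeholders for illustration, not Friends Radar's actual protocol or library calls.

```java
import java.util.Locale;
import java.util.Set;

// Sketch of selective, server-less location sharing: a location update is
// serialised into a small XML payload and pushed only to authorised contacts.
public final class LocationPusher {

    public interface XmppSender {
        void sendMessage(String toJid, String payloadXml); // placeholder for an XMPP library
    }

    private final XmppSender sender;
    private final Set<String> authorisedJids;

    public LocationPusher(XmppSender sender, Set<String> authorisedJids) {
        this.sender = sender;
        this.authorisedJids = authorisedJids;
    }

    public void pushLocation(double latitude, double longitude) {
        String payload = String.format(Locale.US,
                "<location xmlns='urn:example:friendsradar'>"
                        + "<lat>%.6f</lat><lon>%.6f</lon></location>",
                latitude, longitude);
        for (String jid : authorisedJids) { // no central hub: one push per contact
            sender.sendMessage(jid, payload);
        }
    }
}
```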
Pervasive and Mobile Computing | 2010
Clemens Holzmann; Alois Ferscha
With more and more everyday artifacts being equipped with networked embedded systems technology, their spatial situation or context, such as where they are located or whether two of them are near to or far from each other, is becoming increasingly relevant. The emerging availability of sensor technologies for measuring properties of the physical space enables such artifacts to become aware of their spatial context and adapt to changes accordingly, which in turn contributes to the implementation of systems that operate autonomously in the background and interact with humans in a more unobtrusive way. In this article, we specifically address the use of spatial relations between technology-rich artifacts as well as their changes over time. A key aspect is the abstraction of spatial contexts in order to leave out details which are not relevant for a certain application, and thereby save computational resources or provide spatial information in a way that is closer to human concepts of space. In this regard, our focus is on qualitatively represented spatial relations, which are used as the basic building blocks for the development of spatially aware applications. A novel software framework is presented for this purpose and evaluated with respect to its performance as well as its adequacy for building real-world applications.
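As a minimal illustration of such qualitative abstraction, metric positions can be mapped to coarse relations such as near/far and a compass-style direction. The thresholds and relation names below are assumptions for the example, not the framework's API.

```java
// Illustrative abstraction of a metric position pair into qualitative relations
// ("near"/"far" plus a coarse direction), as building blocks for spatially
// aware applications.
public final class QualitativeRelations {

    public enum Distance { NEAR, FAR }
    public enum Direction { NORTH, EAST, SOUTH, WEST }

    public static Distance distanceRelation(double x1, double y1, double x2, double y2,
                                            double nearThresholdMetres) {
        double d = Math.hypot(x2 - x1, y2 - y1);
        return d <= nearThresholdMetres ? Distance.NEAR : Distance.FAR;
    }

    public static Direction directionRelation(double x1, double y1, double x2, double y2) {
        double angle = Math.toDegrees(Math.atan2(y2 - y1, x2 - x1)); // -180..180, 0 = east
        if (angle >= -45 && angle < 45) return Direction.EAST;
        if (angle >= 45 && angle < 135) return Direction.NORTH;
        if (angle >= -135 && angle < -45) return Direction.SOUTH;
        return Direction.WEST;
    }
}
```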
Pervasive Computing and Communications | 2012
Florian Lettner; Clemens Holzmann
Sensing how people interact with their mobile phones under real-world conditions can be an expensive and time-consuming task, especially if data from a large number of users over a long period of time is needed. However, in order to improve the utility and usability of mobile applications, field studies are often considered more suitable than laboratory evaluations. In this paper, we present an innovative approach towards the automatic and transparent sensing of mobile phone usage, which is based on the background observation and analysis of user interactions with a mobile application under real-world conditions. We have implemented a software framework for this purpose, which can easily be added to an Android application in order to record how users interact with it and transmit the acquired data to a server. We evaluated our framework with regard to its performance and memory consumption on the one hand, and by adding it to an application in the marketplace on the other. The results from about 300 users over one week showed that the framework runs stably and with very low resource demands.
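A framework of this kind typically buffers interaction events and uploads them to the server in batches, so that observation stays cheap for the host application. The Uploader interface, event format and batch size below are illustrative assumptions rather than the framework's actual design.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the logging side of such a framework: interaction events are
// buffered in memory and handed to an uploader in batches.
public final class InteractionRecorder {

    public interface Uploader {
        void upload(List<String> events); // e.g. an HTTP POST of serialised events
    }

    private final Uploader uploader;
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();

    public InteractionRecorder(Uploader uploader, int batchSize) {
        this.uploader = uploader;
        this.batchSize = batchSize;
    }

    public synchronized void record(String screen, String action, long timestampMillis) {
        buffer.add(screen + ";" + action + ";" + timestampMillis);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    public synchronized void flush() {
        if (!buffer.isEmpty()) {
            uploader.upload(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```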