Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Thomas Kosch is active.

Publication


Featured research published by Thomas Kosch.


Ubiquitous Computing | 2016

Interactive worker assistance: comparing the effects of in-situ projection, head-mounted displays, tablet, and paper instructions

Markus Funk; Thomas Kosch; Albrecht Schmidt

With the increasing complexity of assembly tasks and a growing number of product variants, instruction systems providing cognitive support at the workplace are becoming more important. Different instruction systems for the workplace provide instructions on phones, tablets, and head-mounted displays (HMDs). Recently, many systems using in-situ projection for providing assembly instructions at the workplace have been proposed and have become commercially available. Although comprehensive studies comparing HMD- and tablet-based systems have been presented, in-situ projection has not yet been scientifically compared against state-of-the-art approaches. In this paper, we aim to close this gap by comparing HMD instructions, tablet instructions, and baseline paper instructions with in-situ projected instructions using an abstract Lego Duplo assembly task. Our results show that assembling parts is significantly faster using in-situ projection and that locating positions is significantly slower using HMDs. Further, participants make fewer errors and report lower perceived cognitive load using in-situ instructions compared to HMD instructions.


Pervasive Technologies Related to Assistive Environments | 2017

The Design Space of Augmented and Virtual Reality Applications for Assistive Environments in Manufacturing: A Visual Approach

Sebastian Büttner; Henrik Mucha; Markus Funk; Thomas Kosch; Mario Aehnelt; Sebastian Robert; Carsten Röcker

Research on how to take advantage of Augmented Reality and Virtual Reality applications and technologies in the domain of manufacturing has brought forward a great number of concepts, prototypes, and working systems. Although comprehensive surveys have taken account of the state of the art, the design space of industrial augmented and virtual reality keeps diversifying. We propose a visual approach towards assessing this space and present an interactive, community-driven tool that supports interested researchers and practitioners in gaining an overview of this design space. Using this framework, we collected and classified relevant publications in terms of application areas and technology platforms. The tool shall facilitate initial research activities as well as the identification of research opportunities. Thus, we lay the groundwork; forthcoming workshops and discussions shall address its refinement.


Mobile and Ubiquitous Multimedia | 2015

A benchmark for interactive augmented reality instructions for assembly tasks

Markus Funk; Thomas Kosch; Scott W. Greenwald; Albrecht Schmidt

With the opportunity to customize ordered products, assembly tasks are becoming more and more complex. To meet these increased demands, a variety of interactive instruction systems have been introduced. Although these systems may have a big impact on overall efficiency and cost of the manufacturing process, it has been difficult to optimize them in a scientific way. The challenge is to introduce performance metrics that apply across different tasks and find a uniform experiment design. In this paper, we address this challenge by proposing a standardized experiment design for evaluating interactive instructions and making them comparable with each other. Further, we introduce a General Assembly Task Model, which differentiates between task-dependent and task-independent measures. Through a user study with 12 participants, we evaluate the experiment design and the proposed task model using an abstract pick-and-place task and an artificial industrial task. Finally, we provide paper-based instructions for the proposed task as a baseline for evaluating Augmented Reality instructions.


Nordic Conference on Human-Computer Interaction | 2016

Investigating Screen Shifting Techniques to Improve One-Handed Smartphone Usage

Huy Viet Le; Patrick Bader; Thomas Kosch; Niels Henze

With increasingly large smartphones, it becomes more difficult to use these devices one-handed. Due to the large touchscreen, users cannot reach across the whole screen with their thumb. In this paper, we investigate approaches that move the screen content in order to increase reachability during one-handed use of large smartphones. In a first study, we compare three approaches based on back-of-device (BoD) interaction to move the screen content. We then compare the most preferred BoD approach with direct touch on the front and Apple's Reachability feature. We show that direct touch enables faster target selection than the other approaches but does not allow interaction with large parts of the screen. While Reachability is faster than a BoD screen shift, only the BoD approach makes the whole front screen accessible.
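
The following is a minimal illustrative sketch, not code from the paper: it shows the basic idea behind a screen shift, namely translating the rendered content towards the bottom edge so that targets near the top fall within the thumb's reach. The screen height, shift distance, and the shifted_position helper are assumptions made for this example.

```python
# Hypothetical sketch of a screen shift for one-handed reachability.
# Values and helper names are illustrative, not taken from the paper.
from typing import Tuple

SCREEN_HEIGHT_PX = 2560   # assumed display height
SHIFT_PX = 900            # assumed shift distance towards the bottom edge

def shifted_position(target: Tuple[int, int], shift_active: bool) -> Tuple[int, int]:
    """Return where a UI target ends up once the screen shift is toggled."""
    x, y = target
    if shift_active:
        # Content moves down by SHIFT_PX, clipped at the bottom edge.
        y = min(y + SHIFT_PX, SCREEN_HEIGHT_PX - 1)
    return x, y

# A target near the top edge becomes reachable after the shift.
print(shifted_position((540, 120), shift_active=False))  # (540, 120)
print(shifted_position((540, 120), shift_active=True))   # (540, 1020)
```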


Human Factors in Computing Systems | 2017

Tactile Drones - Providing Immersive Tactile Feedback in Virtual Reality through Quadcopters

Pascal Knierim; Thomas Kosch; Valentin Schwind; Markus Funk; Francisco Kiss; Stefan Schneegass; Niels Henze

Head-mounted displays for virtual reality (VR) provide high-fidelity visual and auditory experiences, while other modalities are currently less well supported. Current commercial devices typically deliver tactile feedback through controllers the user holds in the hands. Since both hands are occupied and tactile feedback can only be provided at a single position, research and industry have proposed a range of approaches to provide richer tactile feedback. Approaches such as tactile vests or electrical muscle stimulation require additional body-worn devices, which limits comfort and restricts the provided feedback to specific body parts. With this Interactivity installation, we propose using quadcopters to provide tactile stimulation in VR. While the user is visually and acoustically immersed in VR, small quadcopters simulate bumblebees, arrows, and other objects hitting the user. The user wears a VR headset while mini-quadcopters, controlled by an optical marker tracking system, provide tactile feedback.
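
As a rough illustration of the control problem involved, the sketch below steers a quadcopter towards the point where a virtual object should touch the user, using a simple proportional controller. The functions get_marker_position, get_target_point, and send_velocity_setpoint are hypothetical placeholders for an optical tracking system and a quadcopter control link; none of this is the installation's actual code.

```python
# Conceptual sketch: fly a quadcopter to the intended contact point.
# All interfaces passed in below are hypothetical placeholders.
import time

def fly_to_contact_point(get_marker_position, get_target_point,
                         send_velocity_setpoint, gain=1.5, tolerance=0.02):
    """Proportional controller moving the quadcopter towards the target point."""
    while True:
        drone = get_marker_position()    # (x, y, z) of the quadcopter in metres
        target = get_target_point()      # (x, y, z) where tactile contact should occur
        error = [t - d for t, d in zip(target, drone)]
        if sum(e * e for e in error) ** 0.5 < tolerance:
            break                        # close enough for the tactile impulse
        send_velocity_setpoint([gain * e for e in error])
        time.sleep(0.02)                 # ~50 Hz control loop
```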


Pervasive Technologies Related to Assistive Environments | 2017

Working with Augmented Reality?: A Long-Term Analysis of In-Situ Instructions at the Assembly Workplace

Markus Funk; Andreas Bächler; Liane Bächler; Thomas Kosch; Thomas Heidenreich; Albrecht Schmidt

Due to the increasing complexity of products and the demographic change at manual assembly workplaces, interactive and context-aware instructions for assembling products are becoming more and more important. Over the last years, many systems using head-mounted displays (HMDs) and in-situ projection have been proposed, and we are observing a trend towards assistive systems that use in-situ projection to support workers during work tasks. Recent advances in technology enable robust detection of almost every work step performed at the workplace. With this improvement in robustness, continuous usage of assistive systems at the workplace becomes possible. In this work, we provide the results of a long-term study at an industrial workplace with an overall runtime of 11 full workdays. In our study, each participant assembled for at least three full workdays using in-situ projected instructions. We separately considered two user groups comprising expert and untrained workers. Our results show a decrease in performance for expert workers and a learning success for untrained workers.


Human Factors in Computing Systems | 2018

Your Eyes Tell: Leveraging Smooth Pursuit for Assessing Cognitive Workload

Thomas Kosch; Mariam Hassib; Paweł W. Woźniak; Daniel Buschek; Florian Alt

A common objective for context-aware computing systems is to predict how user interfaces impact user performance with respect to their cognitive capabilities. Existing approaches such as questionnaires or pupil dilation measurements either allow only for subjective assessments or are susceptible to environmental influences and user physiology. We address these challenges by exploiting the fact that cognitive workload influences smooth pursuit eye movements. We compared three trajectories and two speeds under different levels of cognitive workload within a user study (N=20). We found higher deviations of gaze points during smooth pursuit eye movements for specific trajectory types at higher cognitive workload levels. Using an SVM classifier, we predict cognitive workload through smooth pursuit with an accuracy of 99.5% for distinguishing between low and high workload as well as an accuracy of 88.1% for estimating workload between three levels of difficulty. We discuss implications and present use cases of how cognition-aware systems benefit from inferring cognitive workload in real time from smooth pursuit eye movements.
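
As a hedged illustration of the classification step described above, the sketch below trains an SVM on per-trial gaze features. The feature set and the placeholder data are assumptions made for the example and do not reproduce the paper's pipeline or results.

```python
# Illustrative sketch: SVM-based workload classification from smooth-pursuit
# features. The features and data below are placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# 200 trials x 4 hypothetical pursuit features (e.g. mean and standard
# deviation of the gaze-to-target distance, velocity error, onset lag).
X = rng.normal(size=(200, 4))
y = rng.integers(0, 2, size=200)  # 0 = low workload, 1 = high workload

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)  # person-independent splits would be preferable
print(f"Mean cross-validation accuracy: {scores.mean():.3f}")
```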


Human Factors in Computing Systems | 2018

PalmTouch: Using the Palm as an Additional Input Modality on Commodity Smartphones

Huy Viet Le; Thomas Kosch; Patrick Bader; Sven Mayer; Niels Henze

Touchscreens are the most successful input method for smartphones. Despite their flexibility, touch input is limited to the location of taps and gestures. We present PalmTouch, an additional input modality that differentiates between touches of fingers and the palm. Touching the display with the palm can be a natural gesture, since moving the thumb towards the device's top edge implicitly places the palm on the touchscreen. We present different use cases for PalmTouch, including its use as a shortcut and for improving reachability. To evaluate these use cases, we developed a model that differentiates between finger and palm touch with an accuracy of 99.53% in realistic scenarios. Results of the evaluation show that participants perceive the input modality as intuitive and natural to perform. Moreover, they appreciate PalmTouch as an easy and fast solution to the reachability issue during one-handed smartphone interaction compared to thumb stretching or grip changes.
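
The sketch below is only a toy heuristic illustrating how palm and finger contacts differ in blob geometry; the thresholds and the TouchContact type are invented for the example, and the published accuracy comes from a trained model on real touch data, not from a rule like this.

```python
# Toy heuristic (not the PalmTouch model): label a contact as palm or finger
# from the size and elongation of its touch blob.
import math
from dataclasses import dataclass

@dataclass
class TouchContact:
    major_axis_mm: float  # length of the contact ellipse's major axis
    minor_axis_mm: float  # length of the contact ellipse's minor axis

def classify_contact(contact: TouchContact,
                     area_threshold_mm2: float = 150.0,
                     elongation_threshold: float = 2.5) -> str:
    """Return 'palm' or 'finger' based on blob geometry (illustrative thresholds)."""
    area = math.pi * (contact.major_axis_mm / 2) * (contact.minor_axis_mm / 2)
    elongation = contact.major_axis_mm / max(contact.minor_axis_mm, 1e-6)
    if area > area_threshold_mm2 or elongation > elongation_threshold:
        return "palm"
    return "finger"

print(classify_contact(TouchContact(major_axis_mm=30.0, minor_axis_mm=18.0)))  # palm
print(classify_contact(TouchContact(major_axis_mm=9.0, minor_axis_mm=7.0)))    # finger
```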


Tangible and Embedded Interaction | 2018

Flyables: Exploring 3D Interaction Spaces for Levitating Tangibles

Pascal Knierim; Thomas Kosch; Alexander Achberger; Markus Funk

Recent advances in technology and miniaturization allow the building of self-levitating tangible interfaces. This includes flying tangibles, which extend the mid-air interaction space from 2D to 3D. While a number of theoretical concepts about interaction with levitating tangibles have previously been investigated, a user-centered evaluation of the presented interaction modalities has received only minor attention in prior research. We present Flyables, a system that positions flying tangibles in 3D space to enable interaction between users and levitating tangibles. The interaction concepts were evaluated in a user study (N=17) using quadcopters as operable levitating tangibles. Three different interaction modalities were evaluated to collect quantitative data and qualitative feedback. Our findings show which interaction modalities users prefer when using Flyables. We conclude our work with a discussion and an outlook on future research in the domain of human-drone interaction.


International Symposium on Wearable Computers | 2017

One size does not fit all: challenges of providing interactive worker assistance in industrial settings

Thomas Kosch; Yomna Abdelrahman; Markus Funk; Albrecht Schmidt

Teaching new assembly instructions at manual assembly workplaces has evolved from human supervision to digitized automatic assistance. Assistive systems provide dynamic support, adapt to user needs, and take perceived workload off expert workers who would otherwise support novice workers. New assembly instructions can be implemented at a fast pace. These assistive systems decrease the cognitive workload of workers, who otherwise need to memorize new assembly instructions with each change of the product line. However, designing assistive systems for industry is a challenging task: once deployed, people have to work with such systems for full workdays. Based on experience gained during our past project motionEAP, we report on design challenges for interactive worker assistance at manual assembly workplaces as well as challenges encountered when deploying interactive assistive systems for diverse user populations.

Collaboration


Dive into Thomas Kosch's collaborations.

Top Co-Authors

Markus Funk (University of Stuttgart)

Niels Henze (University of Regensburg)

Huy Viet Le (University of Stuttgart)

Sven Mayer (University of Stuttgart)

Marc Weise (University of Stuttgart)

Robin Boldt (University of Stuttgart)