Publication


Featured research published by Pascal Knierim.


Human Factors in Computing Systems | 2017

Tactile Drones - Providing Immersive Tactile Feedback in Virtual Reality through Quadcopters

Pascal Knierim; Thomas Kosch; Valentin Schwind; Markus Funk; Francisco Kiss; Stefan Schneegass; Niels Henze

Head-mounted displays for virtual reality (VR) provide high-fidelity visual and auditory experiences, while other modalities are currently less supported. Current commercial devices typically deliver tactile feedback through controllers the user holds in the hands. Since both hands are occupied and tactile feedback can only be provided at a single position, research and industry have proposed a range of approaches for richer tactile feedback, such as tactile vests or electrical muscle stimulation. These approaches, however, require additional body-worn devices, which limits comfort and restricts the feedback to specific body parts. With this Interactivity installation, we propose quadcopters to provide tactile stimulation in VR. While the user is visually and acoustically immersed in VR, small quadcopters simulate bumblebees, arrows, and other objects hitting the user. The user wears a VR headset, and mini-quadcopters, controlled by an optical marker tracking system, provide the tactile feedback.


Interactive Tabletops and Surfaces | 2014

UbiBeam: An Interactive Projector-Camera System for Domestic Deployment

Jan Gugenheimer; Pascal Knierim; Julian Seifert; Enrico Rukzio

Previous research on projector-camera systems has long focused on interaction inside lab environments. Currently, there is little insight into how people would interact with and use such a device in their everyday lives. We conducted an in-situ user study, visiting 22 households and exploring specific use cases and ideas for portable projector-camera systems in a domestic environment. Using a grounded theory approach, we identified several categories, such as interaction techniques, presentation space, placement, and use cases. Based on our observations, we designed and implemented UbiBeam, a domestically deployable projector-camera system. The system comprises a projector, a depth camera, and two servomotors to transform any ordinary surface into a touch-sensitive information display.


Mobile and Ubiquitous Multimedia | 2017

Investigating drone motion as pedestrian guidance

Ashley Colley; Lasse Virtanen; Pascal Knierim; Jonna Häkkilä

Flying drones have the potential to act as navigation guides for pedestrians, providing more direct guidance than handheld devices. Rather than equipping a drone with a display or indicators, we explore the potential for the drone's movements to communicate the route to the walker. For example, should the drone maintain a constant distance a few meters in front of the pedestrian, or should it position itself further along the navigation route, acting as a beacon to walk towards? We created a set of flying drone gestures and evaluated them in an online survey (n = 100) and an in-the-wild user test (n = 10) in which participants were guided along a walking route by a flying drone. As a result, we propose an initial set of drone gestures for pedestrian navigation and provide further design recommendations.


International Symposium on Wearable Computers | 2017

Snake view: exploring thermal imaging as a vision extender in mountains

Yomna Abdelrahman; Albrecht Schmidt; Pascal Knierim

Human vision can only operate in the limited visible band of the electromagnetic spectrum. Commercially available imaging sensors can be used to extend human visual perception in different environments. Typically, these environments include challenging conditions, for instance smoky views during a fire or occluded, foggy, cloudy, and windy views in the mountains. Recently, thermal imaging has become more commercially available, which makes utilizing it to extend human visual perception affordable and deployable. In this paper, we propose the use of thermal imaging as a vision extension tool. Two initial prototypes are presented, depicting different form factors for attaching thermal cameras to head-mounted displays. Finally, we discuss potential use cases of extending human vision to cover the thermal spectrum during mountain activities.


International Conference on Human-Computer Interaction | 2015

UbiBeam: Exploring the Interaction Space for Home Deployed Projector-Camera Systems

Jan Gugenheimer; Pascal Knierim; Christian Winkler; Julian Seifert; Enrico Rukzio

Until now, research on projector-camera systems has concentrated mainly on user interaction within a lab environment. As a result, there are very limited insights into how such systems could be used in everyday life. Our aim was therefore to investigate requirements and use cases of home-deployed projector-camera systems. To this end, we conducted an in-situ user study involving 22 diverse households. Several categories were identified using a grounded theory approach: placement, projection surface, interaction modality, and content/use cases. Based on the analysis of our results, we created UbiBeam, a projector-camera system designed for domestic use. The system has several features, including automatic focus adjustment and depth sensing, which enable ordinary surfaces to be transformed into touch-sensitive information displays. We developed UbiBeam as an open source platform and provide construction plans, 3D models, and source code to the community. We encourage researchers to use it as a research platform and to conduct more field studies on projector-camera systems.


International Symposium on Wearable Computers | 2017

See through the fire: evaluating the augmentation of visual perception of firefighters using depth and thermal cameras

Yomna Abdelrahman; Pascal Knierim; Pawel W. Wozniak; Niels Henze; Albrecht Schmidt

Our visual perception is limited to the abilities of our eyes: we only perceive visible light. This limitation influences how we perceive and react to our surroundings, and it might even endanger us in certain scenarios, e.g. firefighting. In this paper, we explore the potential of augmenting the visual sensing of firefighters using depth and thermal imaging to increase their awareness of the environment. Additionally, we built and evaluated two form factors: a hand-held and a head-mounted display. To evaluate our prototypes, we conducted two user studies in a simulated fire environment with real firefighters. In this workshop paper, we present our findings from the evaluation of the concept and prototypes.


The Physics Teacher | 2017

Augmenting the thermal flux experiment: A mixed reality approach with the HoloLens

M. P. Strzys; S. Kapp; Michael Thees; Jochen Kuhn; Paul Lukowicz; Pascal Knierim; Albrecht Schmidt

In the field of Virtual Reality (VR) and Augmented Reality (AR), technologies have made huge progress in recent years and have also reached the field of education. The virtuality continuum, ranging from pure virtuality on one side to the real world on the other, has been successfully covered by immersive technologies like head-mounted displays, which allow one to embed virtual objects into the real surroundings, leading to a Mixed Reality (MR) experience. In such an environment, digital and real objects do not only coexist but are also able to interact with each other in real time. These concepts can be used to merge human perception of reality with digitally visualized sensor data, thereby making the invisible visible. As a first example, we introduce alongside the basic idea of this column an MR experiment in thermodynamics for a laboratory course for freshman students in physics and other science and engineering subjects, which uses physical data from mobile devices to analyze and display physical phenomena for students.


Nordic Conference on Human-Computer Interaction | 2016

UbiBeam++: Augmenting Interactive Projection with Head-Mounted Displays

Pascal Knierim; Markus Funk; Thomas Kosch; Anton Fedosov; Tamara Müller; Benjamin Schopf; Marc Weise; Albrecht Schmidt

Interactive tabletops and projections have become widely used in schools, museum exhibitions, and conference rooms to teach, illustrate dynamic artifacts, and support talks. In such scenarios, all observers, such as pupils and teachers, perceive the same information even though they hold different positions and could benefit from an adapted, personalized view. We developed the UbiBeam++ mixed reality software toolkit to enable augmentation of an interactive projection surface using optical see-through glasses. Our toolkit supports the simultaneous presentation of private, shared, and public content. Private and shared content is registered in space and presented through a head-mounted display, while public content is presented by a projector. Our toolkit simplifies the development of interactive projections with different visualization levels. In a preliminary study, participants understood the concept of a personalized information space and appreciated the presentation of additional information. Looking forward, our toolkit supports the development and exploration of various scenarios, not limited to teaching, presentations, or games.


International Journal of Mobile Human Computer Interaction | 2016

Survey of Interactive Displays through Mobile Projections

Katrin Wolf; Markus Funk; Pascal Knierim; Markus Löchtefeld

Projectors are shrinking in size and are embedded in some mobile devices; with the miniaturization of projection technology, truly mobile projected displays have become possible. In this paper, the authors present a survey of the current state of the art of such displays. They give a holistic overview of the current literature and categorize mobile projected displays based on mobility and the different possible interaction techniques. This paper aims to help fellow researchers identify areas for future work.


Mobile and Ubiquitous Multimedia | 2012

Find my stuff: a search engine for everyday objects

Pascal Knierim; Jens Nickels; Steffen Musiol; Bastian Könings; Florian Schaub; Björn Wiedersheim; Michael Weber

Searching for lost keys, wallets, or mobile phones is a common nuisance. Compared to digital information, search support for physical objects is very limited. We propose Find My Stuff (FiMS) as a search engine for physical objects and built a fully functional Arduino-based prototype. FiMS offers users a simple search interface to locate tagged physical items in different indoor environments. A hierarchical search process ensures energy-efficient and effective searches. Instead of a fixed search infrastructure, the localization system is based on SmartFurniture equipped with RFID readers and ZigBee modules. Search results provide intuitive cues based on relative positioning to support users in physically retrieving their lost objects. The system requires no manual calibration and is robust against rearrangement of the SmartFurniture. Safety mechanisms prevent abuse of the system and protect user privacy.

Collaboration


Pascal Knierim's top co-authors.

Top Co-Authors

Markus Funk, University of Stuttgart

Niels Henze, University of Stuttgart

Thomas Kosch, University of Stuttgart

Katrin Wolf, Hamburg University of Applied Sciences

Sven Mayer, University of Stuttgart