Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Uwe Gruenefeld is active.

Publication


Featured research published by Uwe Gruenefeld.


symposium on spatial user interaction | 2017

EyeSee360: designing a visualization technique for out-of-view objects in head-mounted augmented reality

Uwe Gruenefeld; Dag Ennenga; Abdallah El Ali; Wilko Heuten; Susanne Boll

Head-mounted displays allow users to augment reality or dive into a virtual one. However, these 3D spaces often come with problems due to objects that may be out of view. Visualizing these out-of-view objects is useful in certain scenarios, such as situation monitoring during ship docking. To address this, we designed a lo-fi prototype of our EyeSee360 system and, based on user feedback, subsequently implemented EyeSee360. We evaluated our technique against well-known 2D off-screen object visualization techniques (Arrow, Halo, Wedge) adapted for head-mounted Augmented Reality, and found that EyeSee360 results in the lowest error for direction estimation of out-of-view objects. Based on our findings, we outline the limitations of our approach and discuss the usefulness of our developed lo-fi prototyping tool.
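
EyeSee360's core idea is to compress the full sphere of possible object directions into the small visible FOV. A minimal sketch of such a mapping, assuming a simple linear compression and illustrative FOV values (the published technique uses a distorted grid; the function and parameter names here are hypothetical, not the paper's implementation):

```python
import math

def direction_to_overlay(obj_pos, fov_w_deg=40.0, fov_h_deg=20.0):
    """Map a 3D object position (head-relative; x right, y up, -z forward)
    to a 2D point on a compressed full-sphere overlay inside the FOV.
    Returns (u, v) in degrees of visual angle within the overlay."""
    x, y, z = obj_pos
    yaw = math.degrees(math.atan2(x, -z))                  # -180..180
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))  # -90..90
    # Linearly compress the full sphere into the (assumed) device FOV.
    u = yaw / 180.0 * (fov_w_deg / 2.0)
    v = pitch / 90.0 * (fov_h_deg / 2.0)
    return u, v

# An object behind and above the user maps near the overlay's horizontal edge.
print(direction_to_overlay((0.5, 1.0, 2.0)))  # ~(18.4, 2.9)
```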


human computer interaction with mobile devices and services | 2017

Visualizing out-of-view objects in head-mounted augmented reality

Uwe Gruenefeld; Abdallah El Ali; Wilko Heuten; Susanne Boll

Various visualization techniques that point to off-screen objects have been developed for small-screen devices. A similar problem arises with head-mounted Augmented Reality (AR) with respect to the human field-of-view, where objects may be out of view. Being able to detect so-called out-of-view objects is useful in certain scenarios (e.g., situation monitoring during ship docking). To augment existing AR with this capability, we adapted and tested well-known 2D off-screen object visualization techniques (Arrow, Halo, Wedge) for head-mounted AR. We found that Halo resulted in the lowest error for direction estimation while Wedge was subjectively perceived as best. We discuss future directions of how to best visualize out-of-view objects in head-mounted AR.
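
The adapted 2D techniques share one building block: clamping the ray from the view center towards the target to the screen border, which gives the anchor where an Arrow, Halo arc, or Wedge is drawn. A minimal sketch of that step, with illustrative screen dimensions and a hypothetical function name:

```python
import math

def edge_anchor(target, w=1920, h=1080):
    """Clamp the center-to-target ray to the screen border: the point where
    an Arrow (or the visible arc of a Halo/Wedge) would be drawn."""
    cx, cy = w / 2.0, h / 2.0
    dx, dy = target[0] - cx, target[1] - cy
    if dx == 0 and dy == 0:
        return cx, cy, 0.0
    # Scale the direction vector so it just touches the border rectangle.
    sx = (w / 2.0) / abs(dx) if dx else float("inf")
    sy = (h / 2.0) / abs(dy) if dy else float("inf")
    s = min(sx, sy)
    angle = math.degrees(math.atan2(dy, dx))  # direction the cue points
    return cx + dx * s, cy + dy * s, angle

# A target far to the upper right anchors on the right screen edge.
print(edge_anchor((4000, -200)))  # ~(1920.0, 306.3, -13.7)
```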


acm symposium on applied perception | 2017

Effects of location and fade-in time of (audio-)visual cues on response times and success-rates in a dual-task experiment

Andreas Löcken; Sarah Blum; Tim Claudius Stratmann; Uwe Gruenefeld; Wilko Heuten; Susanne Boll; Steven van de Par

When users perform multiple competing tasks at the same time, e.g., when driving, assistant systems can be used to create cues that direct attention towards required information. However, poorly designed cues will interrupt or annoy users and affect their performance. Therefore, we aim to identify cues that are not missed and trigger a quick reaction without changing the primary-task performance. We conducted a dual-task experiment in an anechoic chamber with LED-based stimuli that faded in or turned on abruptly and were placed in the periphery or in front of a subject. Additionally, a white-noise sound was triggered in a third of the trials. The primary task was to react to visual stimuli placed on a screen in front. We observed significant effects on the response times in the screen task when adding sound. Further, participants responded faster to LED stimuli when they faded in.


international symposium on pervasive displays | 2018

Exploring Vibrotactile and Peripheral Cues for Spatial Attention Guidance

Tim Claudius Stratmann; Andreas Löcken; Uwe Gruenefeld; Wilko Heuten; Susanne Boll

For decision making in monitoring and control rooms, situation awareness is key. Given the often spacious and complex environments, simple alarms are not sufficient for attention guidance (e.g., on ship bridges). In our work, we explore shifting attention towards the location of relevant entities in large cyber-physical systems. To this end, we used pervasive displays: tactile displays on both upper arms and a peripheral display. With these displays, we investigated shifting attention in a seated and a standing scenario. In a first user study, we evaluated four distinct cue patterns for each on-body display. We tested seated monitoring limited to 90° in front of the user. In a second study, we continued with the two patterns from the first study that were perceived as least and most urgent. Here, we investigated standing monitoring in a 360° environment. We found that tactile cues led to faster arousal times than visual cues, whereas attention shifts were faster for visual cues than for tactile ones.


Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications | 2018

EyeMR: low-cost eye-tracking for rapid-prototyping in head-mounted mixed reality

Tim Claudius Stratmann; Uwe Gruenefeld; Susanne Boll

Mixed Reality devices can either augment reality (AR) or create completely virtual realities (VR). Combined with head-mounted devices and eye-tracking, they enable users to interact with these systems in novel ways. However, current eye-tracking systems are expensive and limited in the interaction with virtual content. In this paper, we present EyeMR, a low-cost system (below 100$) that enables researchers to rapidly prototype new techniques for eye and gaze interactions. Our system supports mono- and binocular tracking (using Pupil Capture) and includes a Unity framework to support the fast development of new interaction techniques. We argue for the usefulness of EyeMR based on results of a user evaluation with HCI experts.
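
A typical step when prototyping gaze interaction on top of such a system is turning the tracker's normalized 2D gaze point into a 3D ray that can be cast against virtual content. A minimal sketch under assumed FOV and aspect values (names and defaults are illustrative, not EyeMR's actual API):

```python
import math

def gaze_ray(norm_x, norm_y, fov_v_deg=60.0, aspect=16 / 9):
    """Turn a normalized gaze point (0..1 across the HMD viewport, origin
    bottom-left) into a unit direction vector in camera space (-z forward).
    fov_v_deg and aspect are assumptions; use the headset's actual values."""
    half_h = math.tan(math.radians(fov_v_deg / 2.0))
    half_w = half_h * aspect
    x = (norm_x * 2.0 - 1.0) * half_w
    y = (norm_y * 2.0 - 1.0) * half_h
    z = -1.0
    n = math.sqrt(x * x + y * y + z * z)
    return x / n, y / n, z / n

# Gaze at the viewport center looks straight ahead.
print(gaze_ray(0.5, 0.5))  # -> (0.0, 0.0, -1.0)
```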


symposium on spatial user interaction | 2017

EyeSee: beyond reality with Microsoft HoloLens

Uwe Gruenefeld; Dana Hsiao; Wilko Heuten; Susanne Boll

Head-mounted Augmented Reality (AR) devices allow overlaying digital information on the real world, where objects may be out of view. Visualizing these out-of-view objects is useful in certain scenarios. To address this, we developed EyeSee360 [1] in our previous work. However, our implementation of EyeSee360 was limited to video-see-through devices. These devices suffer from a delayed, looped camera image and reduce the human field-of-view. In this demo, we present EyeSee360 transferred to optical-see-through Augmented Reality to overcome these limitations.


human computer interaction with mobile devices and services | 2017

PeriMR: a prototyping tool for head-mounted peripheral light displays in mixed reality

Uwe Gruenefeld; Tim Claudius Stratmann; Wilko Heuten; Susanne Boll

Nowadays, Mixed and Virtual Reality devices suffer from a field of view that is too small compared to human visual perception. Although a larger field of view is useful (e.g., for conveying peripheral information or improving situation awareness), technical limitations prevent extending the field-of-view. A way to overcome these limitations is to extend the field-of-view with peripheral light displays. However, there are no tools to support the design of peripheral light displays for Mixed or Virtual Reality devices. Therefore, we present our prototyping tool PeriMR, which allows researchers to develop new peripheral head-mounted light displays for Mixed and Virtual Reality.
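
A peripheral light display prototype ultimately has to decide which LED to drive for a given object direction. A minimal sketch of one plausible mapping, assuming an evenly spaced circular strip around the HMD rim (layout, names, and defaults are illustrative, not PeriMR's actual API):

```python
def led_for_direction(obj_yaw_deg, n_leds=16):
    """Pick which LED on a circular peripheral strip (assumed layout:
    n_leds spaced evenly around the HMD rim, LED 0 at the top, clockwise)
    should light up to hint at an object's horizontal direction."""
    a = obj_yaw_deg % 360.0
    return round(a / (360.0 / n_leds)) % n_leds

print(led_for_direction(95.0))  # object to the right -> LED 4 of 16
```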


Proceedings of the 2014 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT) | 2014

Swarming in the Urban Web Space to Discover the Optimal Region

Chandan Kumar; Uwe Gruenefeld; Wilko Heuten; Susanne Boll

People moving to a new place usually look for a suitable region with respect to their multiple criteria of interest. In this work, we map this problem to the migration behavior of other species, such as swarming: a collective behavior in which animals of similar size aggregate and mill about the same region. Taking the swarm-intelligence perspective, we present a novel method to find relevant geographic regions for citizens based on the Particle Swarm Optimization (PSO) framework. Particles represent geographic regions that move in the map space to find the region most relevant to a user's query. The characterization of geographic regions is based on the multi-criteria distribution of geo-located facilities or landscape structure from the OpenStreetMap data source. We enable end users to visualize and evaluate the regional search process of PSO via a Web interface. The proposed framework demonstrates high precision and computationally efficient performance for regional search over a large city-based dataset.
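
The underlying optimizer is standard PSO with particles moving through map coordinates. A minimal sketch with a toy fitness function standing in for the paper's multi-criteria scoring of geo-located facilities (all names, bounds, and the fitness are illustrative):

```python
import random

def pso_region_search(fitness, bounds, n_particles=30, iters=100,
                      w=0.7, c1=1.5, c2=1.5):
    """Minimal Particle Swarm Optimization over 2D map coordinates.
    fitness(lat, lon) -> score to maximize; bounds = (lat0, lat1, lon0, lon1)."""
    lat0, lat1, lon0, lon1 = bounds
    pos = [[random.uniform(lat0, lat1), random.uniform(lon0, lon1)]
           for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # personal bests
    gbest = max(pbest, key=lambda p: fitness(*p))[:]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d, (lo, hi) in enumerate(((lat0, lat1), (lon0, lon1))):
                r1, r2 = random.random(), random.random()
                # Inertia plus pulls towards personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            if fitness(*pos[i]) > fitness(*pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(*pos[i]) > fitness(*gbest):
                    gbest = pos[i][:]
    return tuple(gbest)

# Toy fitness: the best "region" is the one closest to a hypothetical
# amenity cluster at (53.14, 8.21).
best = pso_region_search(lambda la, lo: -((la - 53.14) ** 2 + (lo - 8.21) ** 2),
                         bounds=(53.0, 53.3, 8.0, 8.4))
print(best)  # converges near (53.14, 8.21)
```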


symposium on spatial user interaction | 2018

Identification of Out-of-View Objects in Virtual Reality

Uwe Gruenefeld; Rieke von Bargen; Wilko Heuten

Current Virtual Reality (VR) devices have limited fields-of-view (FOV). A limited FOV amplifies the problem of objects receding from view. In previous work, different techniques have been proposed to visualize the positions of out-of-view objects. However, these techniques do not allow users to identify these objects. In this work, we compare three different ways of identifying out-of-view objects. Our user study shows that participants prefer to have the identification always visible.


international symposium on pervasive displays | 2018

EyeSeeX: Visualization of Out-of-View Objects on Small Field-of-View Augmented and Virtual Reality Devices

Uwe Gruenefeld; Dana Hsiao; Wilko Heuten


Collaboration


Dive into Uwe Gruenefeld's collaboration.

Top Co-Authors

Susanne Boll

University of Oldenburg

Dag Ennenga

University of Oldenburg


Daniel Lange

University of Oldenburg
