Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Steve Grogorick is active.

Publication


Featured research published by Steve Grogorick.


ACM Multimedia | 2015

An Affordable Solution for Binocular Eye Tracking and Calibration in Head-mounted Displays

Michael Stengel; Steve Grogorick; Martin Eisemann; Elmar Eisemann; Marcus A. Magnor

Immersion is the ultimate goal of head-mounted displays (HMDs) for Virtual Reality (VR) in order to produce a convincing user experience. Two important aspects in this context are motion sickness, often due to imprecise calibration, and the integration of reliable eye tracking. We propose an affordable hardware and software solution for drift-free eye tracking and user-friendly lens calibration within an HMD. The use of dichroic mirrors leads to a lean design that provides the full field-of-view (FOV) while using commodity cameras for eye tracking. Our prototype supports personalizable lens positioning to accommodate different interocular distances. On the software side, a model-based calibration procedure adjusts the eye tracking system and gaze estimation to varying lens positions. Challenges such as partial occlusions due to the lens holders and eyelids are handled by a novel, robust monocular pupil-tracking approach. We present four applications of our work: gaze map estimation, foveated rendering for depth of field, gaze-contingent level-of-detail, and gaze control of virtual avatars.
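The calibration described above maps tracked pupil positions to gaze points for the current lens position. A generic regression-based variant of such a mapping can be sketched as follows; the second-order polynomial basis and the function names are illustrative assumptions, not the paper's actual model-based procedure.

```python
import numpy as np

def poly_basis(pupil_xy):
    """Second-order 2D polynomial basis evaluated at pupil centers."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.stack([np.ones_like(x), x, y, x * y, x * x, y * y], axis=1)

def fit_gaze_map(pupil_xy, target_xy):
    """Least-squares fit from pupil-center coordinates to known
    calibration-target coordinates; returns a (6, 2) coefficient matrix."""
    coeffs, *_ = np.linalg.lstsq(poly_basis(pupil_xy), target_xy, rcond=None)
    return coeffs

def apply_gaze_map(coeffs, pupil_xy):
    """Estimate gaze points for new pupil positions."""
    return poly_basis(pupil_xy) @ coeffs
```

Refitting the coefficients after the user moves a lens is what makes such a mapping adapt to varying lens positions.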


Eurographics | 2016

Adaptive image-space sampling for gaze-contingent real-time rendering

Michael Stengel; Steve Grogorick; Martin Eisemann; Marcus A. Magnor

With ever-increasing display resolution for wide field-of-view displays—such as head-mounted displays or 8k projectors—shading has become the major computational cost in rasterization. To reduce computational effort, we propose an algorithm that only shades visible features of the image while cost-effectively interpolating the remaining features without affecting perceived quality. In contrast to previous approaches, we do not only simulate acuity falloff but also introduce a sampling scheme that incorporates multiple aspects of the human visual system: acuity, eye motion, contrast (stemming from geometry, material, or lighting properties), and brightness adaptation. Our sampling scheme is incorporated into a deferred shading pipeline to shade the image's perceptually relevant fragments, while a pull-push algorithm interpolates the radiance for the rest of the image. Our approach does not impose any restrictions on the performed shading. We conduct a number of psycho-visual experiments to validate scene- and task-independence of our approach. The number of fragments that need to be shaded is reduced by 50% to 80%. Our algorithm scales favorably with increasing resolution and field-of-view, rendering it well-suited for head-mounted displays and wide field-of-view projection.
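The pull-push interpolation mentioned above fills unshaded pixels from progressively coarser averages of the shaded samples. A minimal NumPy sketch, assuming a square power-of-two image and simple box filtering (not the paper's GPU implementation):

```python
import numpy as np

def pull_push(color, valid):
    """Fill holes in a sparsely shaded image.

    color: (H, W, 3) float array of shaded samples (holes may hold garbage);
    valid: (H, W) bool mask of shaded pixels. H == W, power of two.
    """
    if valid.all() or color.shape[0] == 1:
        return color
    H, W, _ = color.shape
    # Pull: average only the valid samples into a half-resolution level.
    w = valid.reshape(H // 2, 2, W // 2, 2).sum(axis=(1, 3)).astype(float)
    c = (color * valid[..., None]).reshape(H // 2, 2, W // 2, 2, 3).sum(axis=(1, 3))
    coarse = c / np.maximum(w, 1.0)[..., None]
    coarse = pull_push(coarse, w > 0)  # recurse until every hole is covered
    # Push: copy coarse radiance into the holes of this level.
    up = coarse.repeat(2, axis=0).repeat(2, axis=1)
    out = color.copy()
    out[~valid] = up[~valid]
    return out
```

Shaded fragments are left untouched; only the holes receive interpolated radiance, which is what keeps the scheme compatible with arbitrary shading.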


Computer Graphics Forum | 2017

Perception-driven Accelerated Rendering

Martin Weier; Michael Stengel; Thorsten Roth; Piotr Didyk; Elmar Eisemann; Martin Eisemann; Steve Grogorick; André Hinkenjann; Ernst Kruijff; Marcus A. Magnor; Karol Myszkowski; Philipp Slusallek

Advances in computer graphics enable us to create digital images of astonishing complexity and realism. However, processing resources are still a limiting factor. Hence, many costly but desirable aspects of realism are often not accounted for, including global illumination, accurate depth of field and motion blur, spectral effects, etc., especially in real-time rendering. At the same time, there is a strong trend towards more pixels per display due to larger displays, higher pixel densities, or larger fields of view. Further observable trends in current display technology include more bits per pixel (high dynamic range, wider color gamut/fidelity), increasing refresh rates (better motion depiction), and an increasing number of displayed views per pixel (stereo, multi-view, all the way to holographic or lightfield displays). These developments cause significant unsolved technical challenges due to aspects such as limited compute power and bandwidth. Fortunately, the human visual system has certain limitations, which mean that providing the highest possible visual quality is not always necessary. In this report, we present the key research and models that exploit the limitations of perception to tackle visual quality and workload alike. Moreover, we present the open problems and promising future research targeting the question of how we can minimize the effort to compute and display only the necessary pixels while still offering the user a full visual experience.


ACM Symposium on Applied Perception | 2017

Subtle gaze guidance for immersive environments

Steve Grogorick; Michael Stengel; Elmar Eisemann; Marcus A. Magnor

Immersive displays allow presentation of rich video content over a wide field of view. We present a method to boost visual importance for a selected—possibly invisible—scene part in a cluttered virtual environment. This desirable feature makes it possible to unobtrusively guide the gaze direction of a user to any location within the immersive 360° surrounding. Our method is based on subtle gaze direction, which did not account for head rotations in previous work. For covering the full 360° environment and wide field of view, we contribute an approach for dynamic stimulus positioning and shape variation based on eccentricity to compensate for visibility differences across the visual field. Our approach is calibrated in a perceptual study for a head-mounted display with binocular eye tracking. An additional study validates the method within an immersive visual search task.
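Scaling the stimulus with eccentricity compensates for acuity falloff across the visual field. A minimal sketch under standard cortical-magnification assumptions; the `e2` constant is a textbook-style value and the scaling form is an assumption, not the calibration from the paper's perceptual study.

```python
import math

def stimulus_scale(eccentricity_deg, e2=2.3):
    """Grow a guiding stimulus with retinal eccentricity so it stays
    roughly equally detectable across the visual field. Inverse of the
    common cortical-magnification form M(e) = M0 / (1 + e / e2)."""
    return 1.0 + eccentricity_deg / e2

def gaze_eccentricity_deg(gaze_dir, stim_dir):
    """Angle in degrees between normalized gaze and stimulus directions."""
    dot = sum(g * s for g, s in zip(gaze_dir, stim_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
```

Each frame, the stimulus size would be updated from the current eye-tracked gaze direction, so the modulation shrinks again as the eye approaches the target.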


Eurographics | 2014

A Nonobscuring Eye Tracking Solution for Wide Field-of-View Head-mounted Displays

Michael Stengel; Steve Grogorick; Lorenz Rogge; Marcus A. Magnor

We present a solution for integrating a binocular eye tracker into current state-of-the-art lens-based head-mounted displays (HMDs) without affecting the available field-of-view on the display. Estimating the relative eye gaze of the user opens the door for HMDs to a much wider spectrum of virtual reality applications and games. Further, we present a concept of a low-cost head-mounted display with eye tracking and discuss applications which strongly depend on or benefit from gaze estimation.


ACM Transactions on Applied Perception | 2016

Simulating Visual Contrast Reduction during Nighttime Glare Situations on Conventional Displays

Benjamin Meyer; Steve Grogorick; Mark Vollrath; Marcus A. Magnor

Bright glare in nighttime situations strongly decreases human contrast perception. Nighttime simulations therefore require a way to realistically depict the contrast perception of the user. Due to the limited luminance of popular as well as specialized high-dynamic-range displays, physical adaptation of the human eye cannot yet be replicated in a physically correct manner in a simulation environment. To overcome this limitation, we propose a method to emulate the adaptation in nighttime glare situations using a perception-based model. We implemented a postprocessing tone mapping algorithm that simulates the corresponding contrast reduction effect for a night-driving simulation with glare from oncoming vehicles' headlights. During glare, tone mapping reduces image contrast in accordance with the incident veiling luminance. As the glare expires, the contrast starts to normalize smoothly over time. The conversion of glare parameters and elapsed time into image contrast during the readaptation phase is based on extensive user studies carried out first in a controlled laboratory setup. Additional user studies have then been conducted in field tests to ensure validity of the derived time-dependent tone-mapping function and to verify transferability onto real-world traffic scenarios.
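The described effect can be illustrated with a simplified model: an equivalent veiling luminance (Stiles-Holladay-style disability-glare approximation) drives a contrast factor that recovers exponentially once the glare expires. The time constant, saturation, and floor values below are illustrative assumptions, not the time-dependent function calibrated in the paper's user studies.

```python
import math

def veiling_luminance(glare_illuminance_lux, angle_deg, k=10.0):
    """Stiles-Holladay approximation: equivalent veiling luminance
    (cd/m^2) from a glare source at angle_deg off the line of sight."""
    return k * glare_illuminance_lux / (angle_deg ** 2)

def contrast_factor(l_veil, t_since_glare_end, tau=2.0, c_min=0.2):
    """Illustrative time course: contrast drops with veiling luminance
    and recovers exponentially with time constant tau (seconds)."""
    depth = l_veil / (l_veil + 1.0)  # saturating reduction in [0, 1)
    recovery = math.exp(-t_since_glare_end / tau) if t_since_glare_end > 0 else 1.0
    return max(c_min, 1.0 - depth * recovery)

def apply_contrast(pixel, adaptation_level, c):
    """Tone-map a value by scaling its deviation from the adaptation level."""
    return adaptation_level + c * (pixel - adaptation_level)
```

Applied per pixel as a postprocess, this reduces contrast around the adaptation level during glare and smoothly restores it during readaptation.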


ACM Symposium on Applied Perception | 2018

Comparison of unobtrusive visual guidance methods in an immersive dome environment

Steve Grogorick; Georgia Albuquerque; Jan-Philipp Tauscher; Marcus A. Magnor

In this paper, we evaluate various image-space modulation techniques that aim to unobtrusively guide viewers' attention. While previous evaluations mainly target desktop settings, we examine their applicability to ultra-wide field of view immersive environments, featuring technical characteristics expected for future-generation head-mounted displays. A custom-built, high-resolution immersive dome environment with high-precision eye tracking is used in our experiments. We investigate gaze guidance success rate and unobtrusiveness of five different techniques. Our results show promising guiding performance for four of the tested methods. With regard to unobtrusiveness we find that—while no method remains completely unnoticed—many participants do not report any distractions. The evaluated methods show promise to guide users' attention also in wide field-of-view virtual environment applications, e.g., virtually guided tours or field operation training.


ACM Symposium on Applied Perception | 2018

Analysis of neural correlates of saccadic eye movements

Jan-Philipp Tauscher; Fabian Wolf Schottky; Steve Grogorick; Marcus A. Magnor; Maryam Mustafa

In a concurrent electroencephalography (EEG) and eye-tracking study, we explore the specific neural responses associated with saccadic eye movements. We hypothesise that there is a distinct saccade-related neural response that occurs well before a physical saccade and that this response is different for free, natural saccades versus forced saccades. Our results show a distinct and measurable brain response approximately 200 ms before a physical saccade actually occurs. This response is distinctly different for free saccades versus forced saccades. Our results open up possibilities of predicting saccades based on neural data. This is of particular relevance for creating effective gaze guidance mechanisms within a virtual reality (VR) environment and for creating faster brain computer interfaces (BCI).
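The pre-saccadic response reported here is the kind of signal exposed by averaging EEG epochs time-locked to saccade onset. A minimal sketch of such epoch averaging; the sampling rate and analysis window are illustrative, not the study's parameters.

```python
import numpy as np

def epoch_average(eeg, saccade_onsets, pre=0.2, post=0.1, fs=250):
    """Average EEG epochs time-locked to saccade onsets.

    eeg: (channels, samples) array; saccade_onsets: sample indices of
    physical saccade onset. Returns the mean epoch spanning `pre` seconds
    before to `post` seconds after onset, where a pre-saccadic potential
    (here ~200 ms before the saccade) would become visible.
    """
    pre_s, post_s = int(pre * fs), int(post * fs)
    epochs = [eeg[:, s - pre_s:s + post_s] for s in saccade_onsets
              if s - pre_s >= 0 and s + post_s <= eeg.shape[1]]
    return np.mean(epochs, axis=0)  # (channels, pre_s + post_s)
```

Computing this average separately for free and forced saccades is how one would compare the two conditions' neural responses.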


ACM Transactions on Applied Perception | 2018

Comparison of Unobtrusive Visual Guidance Methods in an Immersive Dome Environment

Steve Grogorick; Georgia Albuquerque; Jan-Philipp Tauscher; Marcus A. Magnor

In this article, we evaluate various image-space modulation techniques that aim to unobtrusively guide viewers' attention. While previous evaluations mainly target desktop settings, we examine their applicability to ultra-wide field of view immersive environments, featuring technical characteristics expected for future-generation head-mounted displays. A custom-built, high-resolution immersive dome environment with high-precision eye tracking is used in our experiments. We investigate gaze guidance success rates and unobtrusiveness of five different techniques. Our results show promising guiding performance for four of the tested methods. With regard to unobtrusiveness we find that—while no method remains completely unnoticed—many participants do not report any distractions. The evaluated methods show promise to guide users' attention also in wide field-of-view virtual environment applications, e.g., virtually guided tours or field operation training.


IEEE Virtual Reality Conference | 2015

Non-obscuring binocular eye tracking for wide field-of-view head-mounted-displays

Michael Stengel; Steve Grogorick; Martin Eisemann; Elmar Eisemann; Marcus A. Magnor

We present a complete hardware and software solution for integrating binocular eye tracking into current state-of-the-art lens-based head-mounted displays (HMDs) without affecting the user's wide field-of-view of the display. The system uses robust and efficient new algorithms for calibration and pupil tracking and allows real-time eye tracking and gaze estimation. Estimating the relative gaze direction of the user opens the door to a much wider spectrum of virtual reality applications and games when using HMDs. We show a 3D-printed prototype of a low-cost HMD with eye tracking that is simple to fabricate, and discuss a variety of VR applications utilizing gaze estimation.

Collaboration


Dive into Steve Grogorick's collaboration network.

Top Co-Authors

Marcus A. Magnor, Braunschweig University of Technology
Michael Stengel, Braunschweig University of Technology
Georgia Albuquerque, Braunschweig University of Technology
Jan-Philipp Tauscher, Braunschweig University of Technology
Martin Eisemann, Braunschweig University of Technology
Elmar Eisemann, Delft University of Technology
André Hinkenjann, Bonn-Rhein-Sieg University of Applied Sciences
Benjamin Meyer, Braunschweig University of Technology
Emmy-Charlotte Förster, Braunschweig University of Technology
Ernst Kruijff, Bonn-Rhein-Sieg University of Applied Sciences