Publication


Featured research published by Eric Enderton.


Eurographics Symposium on Rendering Techniques | 2007

Efficient rendering of human skin

Eugene d'Eon; David Luebke; Eric Enderton

Existing offline techniques for modeling subsurface scattering effects in multi-layered translucent materials such as human skin achieve remarkable realism, but require seconds or minutes to generate an image. We demonstrate rendering of multi-layer skin that achieves similar visual quality but runs orders of magnitude faster. We show that sums of Gaussians provide an accurate approximation of translucent layer diffusion profiles, and use this observation to build a novel skin rendering algorithm based on texture space diffusion and translucent shadow maps. Our technique requires a parameterized model but does not otherwise rely on any precomputed information, and thus extends trivially to animated or deforming models. We achieve about 30 frames per second for realistic real-time rendering of deformable human skin under dynamic lighting.
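The key observation above, that a translucent layer's radial diffusion profile is well approximated by a sum of Gaussians, can be sketched in a few lines. This is a minimal Python illustration, not the authors' renderer; the weights and variances below are illustrative placeholders rather than the paper's fitted skin parameters. Each 2D Gaussian term is separable, which is what makes the texture-space convolutions fast on a GPU.

```python
import math

# Illustrative (not fitted) weights and variances, in mm^2, for a
# skin-like profile; the paper fits such parameters to measured
# multi-layer diffusion profiles.
WEIGHTS   = [0.233, 0.100, 0.118, 0.113, 0.358, 0.078]
VARIANCES = [0.0064, 0.0484, 0.187, 0.567, 1.99, 7.41]

def gaussian_2d(v, r):
    """Unit-integral 2D Gaussian of variance v, evaluated at radius r."""
    return math.exp(-r * r / (2.0 * v)) / (2.0 * math.pi * v)

def diffusion_profile(r):
    """Sum-of-Gaussians approximation of the radial diffusion profile R(r)."""
    return sum(w * gaussian_2d(v, r) for w, v in zip(WEIGHTS, VARIANCES))
```

Because each term is a separable Gaussian, the expensive 2D convolution of irradiance with R(r) becomes a handful of cheap separable blurs whose results are summed with the weights above.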


Interactive 3D Graphics and Games | 2010

Stochastic transparency

Eric Enderton; Erik Sintorn; Peter Shirley; David Luebke

Stochastic transparency provides a unified approach to order-independent transparency, antialiasing, and deep shadow maps. It augments screen-door transparency using a random sub-pixel stipple pattern, where each fragment of transparent geometry covers a random subset of pixel samples of size proportional to alpha. This results in correct alpha-blended colors on average, in a single render pass with fixed memory size and no sorting, but introduces noise. We reduce this noise by an alpha correction pass, and by an accumulation pass that uses a stochastic shadow map from the camera. At the pixel level, the algorithm does not branch and contains no read-modify-write loops, other than traditional z-buffer blend operations. This makes it an excellent match for modern massively parallel GPU hardware. Stochastic transparency is very simple to implement and supports all types of transparent geometry; hair, smoke, foliage, windows, and transparent cloth can be mixed in a single scene with no special-case code.
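The stipple idea in the abstract can be simulated for a single pixel in plain Python. This is a hedged sketch, not the GPU implementation: fragments are hypothetical (depth, alpha, scalar color) tuples, and the per-sample z-test stands in for the hardware z-buffer. Each fragment covers a random subset of samples of size proportional to its alpha, and averaging the samples yields the alpha-blended color in expectation.

```python
import random

def stochastic_transparency_pixel(fragments, num_samples=64, bg=0.0, rng=random):
    """Resolve one pixel. Each fragment (depth, alpha, color) covers a random
    subset of samples of size ~ alpha * num_samples; ordinary per-sample
    z-buffering followed by averaging gives the correct alpha-blended color
    on average, with noise, regardless of fragment submission order."""
    depth = [float('inf')] * num_samples
    color = [bg] * num_samples
    for z, alpha, c in fragments:          # any order: no sorting required
        k = round(alpha * num_samples)     # coverage proportional to alpha
        for s in rng.sample(range(num_samples), k):
            if z < depth[s]:               # ordinary z-test per sample
                depth[s] = z
                color[s] = c
    return sum(color) / num_samples
```

A single fragment with alpha 0.5 covers exactly half the samples, so the resolved pixel is exactly half its color; noise appears only when multiple transparent fragments overlap.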


SIGGRAPH/Eurographics Conference on Graphics Hardware | 2005

GPU-accelerated high-quality hidden surface removal

Daniel Elliott Wexler; Larry I. Gritz; Eric Enderton; Jonathan Rice

High-quality off-line rendering requires many features not natively supported by current commodity graphics hardware: wide smooth filters, high sampling rates, order-independent transparency, spectral opacity, motion blur, depth of field. We present a GPU-based hidden-surface algorithm that implements all these features. The algorithm is Reyes-like but uses regular sampling and multiple passes. Transparency is implemented by depth peeling, made more efficient by opacity thresholding and a new method called z batches. We discuss performance and some design trade-offs. At high spatial sampling rates, our implementation is substantially faster than a CPU-only renderer for typical scenes.
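The depth-peeling mechanism mentioned above can be illustrated per pixel. This is a minimal Python sketch under simplifying assumptions (one pixel, scalar colors, fragments as hypothetical (depth, alpha, color) tuples): each loop iteration mimics one peeling pass that extracts the nearest fragment strictly beyond the previously peeled depth, and the early exit plays the role of the paper's opacity thresholding.

```python
def depth_peel_pixel(fragments, opacity_threshold=0.99):
    """Composite transparent fragments front to back by repeated 'peeling'.
    Each pass finds the nearest fragment beyond the last peeled depth,
    mimicking one GPU depth-peeling render pass; peeling stops early once
    the pixel is nearly opaque (opacity thresholding)."""
    color, opacity = 0.0, 0.0
    last_z = -float('inf')
    while opacity < opacity_threshold:
        layer = min((f for f in fragments if f[0] > last_z),
                    default=None, key=lambda f: f[0])
        if layer is None:                      # nothing left to peel
            break
        z, a, c = layer
        color += (1.0 - opacity) * a * c       # front-to-back "over" compositing
        opacity += (1.0 - opacity) * a
        last_z = z
    return color, opacity
```

Thresholding matters because each peeled layer costs a full render pass; once accumulated opacity is near 1, deeper layers cannot visibly contribute and the remaining passes can be skipped.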


Interactive 3D Graphics and Games | 2011

A local image reconstruction algorithm for stochastic rendering

Peter Shirley; Timo Aila; Jonathan Cohen; Eric Enderton; Samuli Laine; David Luebke; Morgan McGuire

Stochastic renderers produce unbiased but noisy images of scenes that include the advanced camera effects of motion and defocus blur and possibly other effects such as transparency. We present a simple algorithm that selectively adds bias in the form of image space blur to pixels that are unlikely to have high frequency content in the final image. For each pixel, we sweep once through a fixed neighborhood of samples in front to back order, using a simple accumulation scheme. We achieve good quality images with only 16 samples per pixel, making the algorithm potentially practical for interactive stochastic rendering in the near future.
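The single front-to-back sweep with a simple accumulation scheme can be shown in a much-simplified toy form. This sketch omits most of the actual algorithm (there is no screen-space distance weighting and no heuristic for detecting high-frequency pixels); it only illustrates the saturating front-to-back accumulation over a fixed sample neighborhood, which lets nearer surfaces dominate while farther noisy content is averaged away.

```python
def reconstruct_pixel(neighborhood_samples, spp=16):
    """Toy version of a front-to-back reconstruction sweep: visit the
    pixel's fixed sample neighborhood once in depth order, accumulating
    equal-weight contributions until coverage saturates."""
    color, coverage = 0.0, 0.0
    w = 1.0 / spp                            # per-sample coverage weight
    for z, c in sorted(neighborhood_samples):  # front-to-back by depth
        take = min(w, 1.0 - coverage)
        color += take * c
        coverage += take
        if coverage >= 1.0:                  # pixel fully covered; stop
            break
    return color
```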


Interactive 3D Graphics and Games | 2011

Colored stochastic shadow maps

Morgan McGuire; Eric Enderton

This paper extends the stochastic transparency algorithm that models partial coverage to also model wavelength-varying transmission. It then applies this to the problem of casting shadows between any combination of opaque, colored transmissive, and partially covered (i.e., α-matted) surfaces in a manner compatible with existing hardware shadow mapping techniques. Colored Stochastic Shadow Maps have a similar resolution and performance profile to traditional shadow maps; however, they require a wider filter in colored areas to reduce hue variation.


IEEE Transactions on Visualization and Computer Graphics | 2011

Stochastic Transparency

Eric Enderton; Erik Sintorn; Peter Shirley; David Luebke

Stochastic transparency provides a unified approach to order-independent transparency, antialiasing, and deep shadow maps. It augments screen-door transparency using a random sub-pixel stipple pattern, where each fragment of transparent geometry covers a random subset of pixel samples of size proportional to alpha. This results in correct alpha-blended colors on average, in a single render pass with fixed memory size and no sorting, but introduces noise. We reduce this noise by an alpha correction pass, and by an accumulation pass that uses a stochastic shadow map from the camera. At the pixel level, the algorithm does not branch and contains no read-modify-write loops, other than traditional z-buffer blend operations. This makes it an excellent match for modern massively parallel GPU hardware. Stochastic transparency is very simple to implement and supports all types of transparent geometry; hair, smoke, foliage, windows, and transparent cloth can be mixed in a single scene with no special-case code.


High Performance Graphics | 2010

Real-time stochastic rasterization on conventional GPU architectures

Morgan McGuire; Eric Enderton; Peter Shirley; David Luebke

This paper presents a hybrid algorithm for rendering approximate motion and defocus blur with precise stochastic visibility evaluation. It demonstrates---for the first time, with a full stochastic technique---real-time performance on conventional GPU architectures for complex scenes at 1920×1080 HD resolution. The algorithm operates on dynamic triangle meshes for which per-vertex velocity or corresponding vertices from the previous frame are available. It leverages multisample antialiasing (MSAA) and a tight space-time-aperture convex hull to efficiently evaluate visibility independently of shading. For triangles whose motion crosses the camera plane, we present a novel 2D bounding box algorithm that we conjecture is conservative. The sampling algorithm further reduces sample variance within primitives by integrating textures according to ray differentials in time and aperture.


International Conference on Computer Graphics and Interactive Techniques | 2007

A system for efficient rendering of human skin

Eugene d'Eon; David Luebke; Eric Enderton

Figure 1: We project multi-layer diffusion profiles onto a sum-of-Gaussians basis to enable realistic rendering of human skin at 30 frames per second on a modern GPU. From left to right: Albedo (1st) and irradiance (2nd) combine to give subsurface irradiance, which is then convolved with each Gaussian basis profile (3rd through 7th) and combined in a final render pass with specular (8th) to produce the final image (9th). Convolutions are performed in off-screen 2D textures but shown here mapped onto the face.


International Conference on Computer Graphics and Interactive Techniques | 2015

Accumulative anti-aliasing

Eric Enderton; Eric B. Lum; Christian Rouet; Oleg Kuznetsov

Accumulative anti-aliasing (ACAA) is a simple modification of forward-rendered multi-sample anti-aliasing (MSAA). It produces the same image quality but consumes half as much multi-sample framebuffer memory, and reduces both render time and off-chip bandwidth by 20% to 30%. ACAA stores multiple depth samples, computed by a depth-only pre-pass, but stores only one color sample per pixel, which is used to accumulate final color as the sum of shaded fragment colors weighted by visibility. ACAA makes higher sample rates practical, improving image quality.
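The resolve step described above (per-sample depths from the pre-pass, but one accumulated color per pixel) can be sketched for a single pixel. This is an illustrative Python model, not the GPU implementation; the data layout is hypothetical: `depth_samples` holds the multisample depths from the depth-only pre-pass, and each fragment carries its depth, shaded color, and the sample indices it covers.

```python
def acaa_resolve_pixel(depth_samples, fragments, eps=1e-6):
    """ACAA sketch: the depth-only pre-pass has filled per-sample depths;
    the shading pass then accumulates ONE color per pixel, weighting each
    fragment's shaded color by its visibility, i.e. the fraction of its
    covered samples that pass the depth test."""
    n = len(depth_samples)
    color = 0.0
    for z, shaded, covered in fragments:   # (depth, shaded color, sample ids)
        vis = sum(1 for s in covered if z <= depth_samples[s] + eps) / n
        color += vis * shaded              # single accumulated color sample
    return color
```

Because only one color is stored per pixel, the multisample color planes of ordinary MSAA disappear, which is where the memory and bandwidth savings come from.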


IEEE International Conference on High Performance Computing, Data, and Analytics | 2010

Real-Time Stochastic Rasterization on Conventional GPU Architectures

Morgan McGuire; Eric Enderton; Peter Shirley; David Patrick Luebke
