Publication


Featured research published by David K. McAllister.


International Conference on Computer Graphics and Interactive Techniques | 2000

Relief texture mapping

Manuel M. Oliveira; Gary Bishop; David K. McAllister

We present an extension to texture mapping that supports the representation of 3-D surface details and view motion parallax. The results are correct for viewpoints that are static or moving, far away or nearby. Our approach is very simple: a relief texture (texture extended with an orthogonal displacement per texel) is mapped onto a polygon using a two-step process: First, it is converted into an ordinary texture using a surprisingly simple 1-D forward transform. The resulting texture is then mapped onto the polygon using standard texture mapping. The 1-D warping functions work in texture coordinates to handle the parallax and visibility changes that result from the 3-D shape of the displacement surface. The subsequent texture-mapping operation handles the transformation from texture to screen coordinates.
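The 1-D forward transform can be illustrated with a small sketch. This is not the paper's actual pre-warp (which derives the warp from the full viewing configuration); it is a simplified stand-in, assuming each texel shifts along its row by an amount proportional to its displacement, with later writes overwriting earlier ones to resolve visibility:

```python
import numpy as np

def prewarp_row(colors, displ, k):
    """Simplified 1-D forward warp over one texture row.

    Each texel at column u moves to u + k * displ[u], where displ holds the
    per-texel orthogonal displacement and k stands in for the viewing-
    dependent warp coefficient. Traversal order is chosen so later (nearer)
    writes overwrite earlier ones, echoing the occlusion-compatible ordering
    the paper uses instead of a z-buffer.
    """
    n = len(colors)
    out = np.zeros(n, dtype=colors.dtype)
    hit = np.zeros(n, dtype=bool)          # which output columns received a sample
    cols = range(n) if k >= 0 else range(n - 1, -1, -1)
    for u in cols:
        u_new = int(round(u + k * displ[u]))
        if 0 <= u_new < n:
            out[u_new] = colors[u]
            hit[u_new] = True
    return out, hit
```

A production pre-warp would also interpolate to fill the holes where no source texel lands, and the resulting texture would then go through ordinary texture mapping, as the abstract describes.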


Eurographics | 1999

Real-Time Rendering of Real World Environments

David K. McAllister; Lars S. Nyland; Voicu Popescu; Anselmo Lastra; Chris McCue

One of the most important goals of interactive computer graphics is to allow a user to freely walk around a virtual recreation of a real environment that looks as real as the world around us. But hand-modeling such a virtual environment is inherently limited and acquiring the scene model using devices also presents challenges. Interactively rendering such a detailed model is beyond the limits of current graphics hardware, but image-based approaches can significantly improve the status quo. We present an end-to-end system for acquiring highly detailed scans of large real world spaces, consisting of forty to eighty million range and color samples, using a digital camera and laser rangefinder. We explain successful techniques to represent these large data sets as image-based models and present contributions to image-based rendering that allow these models to be rendered in real time on existing graphics hardware without sacrificing the high resolution at which the data sets were acquired.


Proceedings of the IEEE Workshop on Multi-View Modeling and Analysis of Visual Scenes (MVIEW'99) | 1999

The impact of dense range data on computer graphics

Lars S. Nyland; David K. McAllister; Voicu Popescu; Chris McCue; Anselmo Lastra; Paul Rademacher; Manuel M. Oliveira; Gary Bishop; Gopi Meenakshisundaram; Matt Cutts; Henry Fuchs

The ability to quickly acquire dense range data of familiar environments has been met with enthusiastic response in our graphics laboratory. We describe our prototype range collection system, based on a scanning laser rangefinder and a high-resolution digital color camera, that allows us to create panoramic color photographs where every pixel has an accurate range value. To accommodate occlusions, the data acquisition process is repeated from multiple locations and the results are registered with software. We discuss the acquisition system and review how the data from this prototype system have been used in existing graphics projects. We speculate about future improvements to range acquisition hardware and how those improvements impact graphics applications. The data acquired from our system is just a hint of what will be available in the future and we conclude with speculation about the impact of such data on graphics hardware and algorithms, since prevailing graphics hardware and software do not support this data well.
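A panorama in which every pixel has an accurate range value can be turned into 3-D points by unprojecting each pixel along its spherical viewing ray. The angle conventions below are illustrative assumptions; a real system would substitute the scanner's calibration:

```python
import numpy as np

def panorama_to_points(range_img, h_fov=2 * np.pi, v_fov=np.pi):
    """Unproject a panoramic range image (one range per pixel) to 3-D points.

    Columns are assumed to sweep azimuth across h_fov and rows to sweep
    elevation across v_fov, centered on the horizon; each pixel's point is
    its range times the unit direction of its viewing ray.
    """
    rows, cols = range_img.shape
    az = (np.arange(cols) + 0.5) / cols * h_fov
    el = (np.arange(rows) + 0.5) / rows * v_fov - v_fov / 2
    az, el = np.meshgrid(az, el)
    r = range_img
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=-1)   # shape (rows, cols, 3)
```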


Proceedings of SPIE | 2000

Capturing, processing, and rendering real-world scenes

Lars S. Nyland; Anselmo Lastra; David K. McAllister; Voicu Popescu; Chris McCue; Henry Fuchs

While photographs vividly capture a scene from a single viewpoint, it is our goal to capture a scene in such a way that a viewer can freely move to any viewpoint, just as he or she would in an actual scene. We have built a prototype system to quickly digitize a scene using a laser rangefinder and a high-resolution digital camera that accurately captures a panorama of high-resolution range and color information. With real-world scenes, we have provided data to fuel research in many areas, including representation, registration, data fusion, polygonization, rendering, simplification, and reillumination. The real-world scene data can be used for many purposes, including immersive environments, immersive training, re-engineering and engineering verification, renovation, crime-scene and accident capture and reconstruction, archaeology and historic preservation, sports and entertainment, surveillance, remote tourism, and remote sales. We describe our acquisition system and the processing necessary to merge data from the multiple input devices and positions. We also describe high-quality rendering using the data we have collected, and present issues concerning specific rendering accelerators and algorithms. We conclude by describing future uses and methods of collection for real-world scene data.
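Merging scans taken from multiple positions reduces, at its core, to rigid registration. As an illustrative sketch (not the authors' software), the classic Kabsch algorithm recovers the least-squares rotation and translation that align matched points from two scans:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform aligning matched points P onto Q.

    P and Q are N x 3 arrays of corresponding points from two scans.
    Returns (R, t) such that Q is approximately P @ R.T + t.
    """
    Pc = P - P.mean(axis=0)                 # center both point sets
    Qc = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)     # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t
```

In practice the correspondences themselves must first be established (by matched landmarks or iterative closest-point search); this sketch assumes they are given.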


Applied Imagery Pattern Recognition Workshop | 2000

Interactive exploration of acquired 3D data

Lars S. Nyland; David K. McAllister; Voicu Popescu; Chris McCue; Anselmo Lastra

The goal of our image-based rendering group is to accurately render scenes acquired from the real world. To achieve this goal, we capture scene data by taking 3D panoramic photographs from multiple locations and merge the acquired data into a single model from which real-time 3D rendering can be performed. In this paper, we describe our acquisition hardware and rendering system that seeks to achieve this goal, with particular emphasis on the techniques used to support interactive exploration.


International Conference on Computer Graphics and Interactive Techniques | 2002

The spatial bi-directional reflectance distribution function

David K. McAllister; Anselmo Lastra; Benjamin P. Cloward; Wolfgang Heidrich

Combining texture mapping with bi-directional reflectance distribution functions (BRDFs) yields a representation of surface appearance with both spatial and angular detail. We call a texture map with a unique BRDF at each pixel a spatial bi-directional reflectance distribution function, or SBRDF. The SBRDF is a six-dimensional function representing the reflectance from each incident direction to each exitant direction at each surface point. Because of the high dimensionality of the SBRDF, previous appearance capture and representation work has focused on either spatial or angular detail, has relied on a small set of basis BRDFs, or has only treated spatial detail statistically [Dana 1999; Lensch 2001].
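As a sketch of the idea, an SBRDF can be stored as a texture whose texels each hold the parameters of a compact BRDF model, so that evaluating the surface at a point means looking up that texel's parameters and evaluating its BRDF for the incident and exitant directions. The snippet uses an illustrative Blinn-Phong parameterization, not the representation the paper fits:

```python
import numpy as np

def eval_sbrdf(params, u, v, wi, wo):
    """Evaluate an SBRDF texture at texel (u, v) for directions wi, wo.

    params is an (H, W, 5) array: per-texel diffuse rgb, specular scale,
    and specular exponent (an illustrative Blinn-Phong parameterization).
    wi and wo are unit vectors in the local frame with normal n = (0, 0, 1).
    """
    kd = params[v, u, 0:3]            # per-texel diffuse albedo
    ks = params[v, u, 3]              # per-texel specular scale
    n_exp = params[v, u, 4]           # per-texel specular exponent
    h = wi + wo
    h = h / np.linalg.norm(h)         # half vector
    spec = ks * h[2] ** n_exp         # (n . h)^e with n = (0, 0, 1)
    return kd / np.pi + spec
```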


Archive | 2002

A generalized surface appearance representation for computer graphics

David K. McAllister; Anselmo Lastra


Archive | 2000

The Design of an API for Particle Systems

David K. McAllister
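The report's title points at an API design question: how a library can expose particle effects through a small set of composable per-frame operations. Purely as a hypothetical illustration (the names and structure below are invented, not the actual API's):

```python
import random

class ParticleGroup:
    """Hypothetical minimal particle-system API sketch.

    The application composes small per-frame actions (emit, apply gravity,
    advance, cull) on a group of particles; each particle is stored as
    [position, velocity, age].
    """

    def __init__(self):
        self.particles = []

    def source(self, rate, pos_jitter=0.1):
        """Emit `rate` new particles near the origin with upward velocity."""
        for _ in range(rate):
            p = [random.uniform(-pos_jitter, pos_jitter) for _ in range(3)]
            self.particles.append([p, [0.0, 0.0, 1.0], 0.0])

    def gravity(self, g=(0.0, 0.0, -9.8), dt=0.02):
        """Accelerate every particle by g over one time step."""
        for _, v, _ in self.particles:
            for i in range(3):
                v[i] += g[i] * dt

    def move(self, dt=0.02):
        """Advance positions by velocity and age each particle."""
        for entry in self.particles:
            p, v, _ = entry
            for i in range(3):
                p[i] += v[i] * dt
            entry[2] += dt

    def kill_old(self, max_age):
        """Cull particles older than max_age."""
        self.particles = [e for e in self.particles if e[2] <= max_age]
```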


HWWS | 2002

Efficient rendering of spatial bi-directional reflectance distribution functions

David K. McAllister; Anselmo Lastra; Wolfgang Heidrich

Collaboration


Dive into David K. McAllister's collaborations.

Top Co-Authors

Anselmo Lastra (University of North Carolina at Chapel Hill)
Chris McCue (University of North Carolina at Chapel Hill)
Lars S. Nyland (University of North Carolina at Chapel Hill)
Gary Bishop (University of North Carolina at Chapel Hill)
Henry Fuchs (University of North Carolina at Chapel Hill)
Manuel M. Oliveira (Universidade Federal do Rio Grande do Sul)
Gopi Meenakshisundaram (University of North Carolina at Chapel Hill)
Matt Cutts (University of North Carolina at Chapel Hill)
Paul Rademacher (University of North Carolina at Chapel Hill)