
Publication


Featured research published by Gabriel Taubin.


International Conference on Computer Graphics and Interactive Techniques | 1995

A signal processing approach to fair surface design

Gabriel Taubin

In this paper we describe a new tool for interactive free-form fair surface design. By generalizing classical discrete Fourier analysis to two-dimensional discrete surface signals (functions defined on polyhedral surfaces of arbitrary topology), we reduce the problem of surface smoothing, or fairing, to low-pass filtering. We describe a very simple surface signal low-pass filter algorithm that applies to surfaces of arbitrary topology. As opposed to other existing optimization-based fairing methods, which are computationally more expensive, this is a linear time and space complexity algorithm. With this algorithm, fairing very large surfaces, such as those obtained from volumetric medical data, becomes affordable. By combining this algorithm with surface subdivision methods we obtain a very effective fair surface design technique. We then extend the analysis, and modify the algorithm accordingly, to accommodate different types of constraints. Some constraints can be imposed without any modification of the algorithm, while others require the solution of a small associated linear system of equations. In particular, vertex location constraints, vertex normal constraints, and surface normal discontinuities across curves embedded in the surface, can be imposed with this technique.
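
As a concrete illustration of the low-pass filtering idea, here is a minimal Python sketch of a two-step Laplacian filter of the kind the paper builds on, alternating a positive and a negative smoothing step so the shape does not shrink. The function name, the umbrella (neighbor-average) weights, and the lam/mu values are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

def taubin_smooth(vertices, neighbors, lam=0.5, mu=-0.53, iterations=10):
    """Alternate a shrinking (lam > 0) and an inflating (mu < 0) Laplacian step.

    vertices:  (n, 3) float array of vertex positions.
    neighbors: list of index lists, one per vertex (its one-ring).
    """
    v = vertices.astype(float).copy()
    for _ in range(iterations):
        for factor in (lam, mu):
            # Umbrella-operator Laplacian: neighbor average minus the vertex.
            delta = np.array([v[nbrs].mean(axis=0) - v[i]
                              for i, nbrs in enumerate(neighbors)])
            v += factor * delta
    return v
```

Choosing mu slightly larger in magnitude than lam is what keeps the passband gain near one and prevents the shrinkage that plain Laplacian smoothing produces.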


IEEE Transactions on Visualization and Computer Graphics | 1999

The ball-pivoting algorithm for surface reconstruction

Fausto Bernardini; Joshua Mittleman; Holly E. Rushmeier; Cláudio T. Silva; Gabriel Taubin

The Ball-Pivoting Algorithm (BPA) computes a triangle mesh interpolating a given point cloud. Typically, the points are surface samples acquired with multiple range scans of an object. The principle of the BPA is very simple: three points form a triangle if a ball of a user-specified radius ρ touches them without containing any other point. Starting with a seed triangle, the ball pivots around an edge (i.e., it revolves around the edge while keeping in contact with the edge's endpoints) until it touches another point, forming another triangle. The process continues until all reachable edges have been tried, and then starts from another seed triangle, until all points have been considered. The process can then be repeated with a ball of larger radius to handle uneven sampling densities. We applied the BPA to datasets of millions of points representing actual scans of complex 3D objects. The relatively small amount of memory required by the BPA, its time efficiency, and the quality of the results obtained compare favorably with existing techniques.
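
The core geometric predicate of the BPA can be sketched in a few lines: a candidate triangle is accepted when some ball of radius ρ touches its three vertices and contains no other point. The sketch below is a simplified illustration under stated assumptions (only one side of the triangle is checked, and the seed search and pivoting loop are omitted); it uses scipy's cKDTree for the emptiness query, and the function names are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def ball_center(a, b, c, rho):
    """Center of a rho-ball touching points a, b, c, or None if impossible."""
    ab, ac = b - a, c - a
    n = np.cross(ab, ac)
    n2 = n.dot(n)
    if n2 == 0.0:
        return None  # degenerate (collinear) triangle
    # Circumcenter of the triangle, then lift it along the normal.
    circ = a + (np.cross(n, ab) * ac.dot(ac)
                + np.cross(ac, n) * ab.dot(ab)) / (2.0 * n2)
    h2 = rho**2 - np.sum((circ - a)**2)
    if h2 < 0:
        return None  # ball of radius rho is too small to touch all three
    return circ + np.sqrt(h2) * n / np.sqrt(n2)

def ball_is_empty(points, tree, i, j, k, rho, eps=1e-9):
    """True if a rho-ball touches points i, j, k and contains no other point."""
    c = ball_center(points[i], points[j], points[k], rho)
    if c is None:
        return False
    inside = tree.query_ball_point(c, rho - eps)
    return all(idx in (i, j, k) for idx in inside)

# Usage: tree = cKDTree(points); ball_is_empty(points, tree, 0, 1, 2, rho)
```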


ACM Transactions on Graphics | 1998

Geometric compression through topological surgery

Gabriel Taubin; Jarek Rossignac

The abundance and importance of complex 3-D databases in major industry segments, the affordability of interactive 3-D rendering for office and consumer use, and the exploitation of the Internet to distribute and share 3-D data have intensified the need for an effective 3-D geometric compression technique that would significantly reduce the time required to transmit 3-D models over digital communication channels, and the amount of memory or disk space required to store the models. Because the prevalent representation of 3-D models for graphics purposes is polyhedral and because polyhedral models are in general triangulated for rendering, this article introduces a new compressed representation for complex triangulated models and simple, yet efficient, compression and decompression algorithms. In this scheme, vertex positions are quantized within the desired accuracy, a vertex spanning tree is used to predict the position of each vertex from 2, 3, or 4 of its ancestors in the tree, and the correction vectors are entropy encoded. Properties, such as normals, colors, and texture coordinates, are compressed in a similar manner. The connectivity is encoded with no loss of information to an average of less than two bits per triangle. The vertex spanning tree and a small set of jump edges are used to split the model into a simple polygon. A triangle spanning tree and a sequence of marching bits are used to encode the triangulation of the polygon. Our approach improves on Michael Deering's pioneering results by exploiting the geometric coherence of several ancestors in the vertex spanning tree, preserving the connectivity with no loss of information, avoiding vertex repetitions, and using about three fewer bits for the connectivity. However, since decompression requires random access to all vertices, this method must be modified for hardware rendering with limited onboard memory. Finally, we demonstrate implementation results for a variety of VRML models with up to two orders of magnitude compression.
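
A rough sketch of the predict-and-correct idea described above, under loose assumptions: positions are already quantized to integers, parent[] encodes a vertex spanning tree (the root has parent -1), and each vertex is predicted from up to two ancestors with placeholder weights. The paper predicts from 2, 3, or 4 ancestors with tuned weights and then entropy codes the residuals; both refinements are omitted here, and the function name is hypothetical.

```python
import numpy as np

def corrections(quantized, parent, weights=(0.75, 0.25)):
    """Residuals after predicting each vertex from up to two tree ancestors.

    quantized: (n, 3) integer array of quantized vertex positions.
    parent:    length-n array; parent[v] is v's ancestor, -1 at the root.
    """
    n = len(quantized)
    res = np.zeros_like(quantized)
    for v in range(n):
        p = parent[v]
        if p < 0:                       # root: no ancestor, store as-is
            res[v] = quantized[v]
            continue
        gp = parent[p]
        pred = weights[0] * quantized[p]
        pred += weights[1] * (quantized[gp] if gp >= 0 else quantized[p])
        res[v] = (quantized[v] - np.round(pred)).astype(int)
    return res  # small-magnitude vectors; these are what get entropy coded
```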


International Conference on Computer Vision | 1995

Estimating the tensor of curvature of a surface from a polyhedral approximation

Gabriel Taubin

Estimating principal curvatures and principal directions of a surface from a polyhedral approximation with a large number of small faces, such as those produced by iso-surface construction algorithms, has become a basic step in many computer vision algorithms, particularly in those targeted at medical applications. We describe a method to estimate the tensor of curvature of a surface at the vertices of a polyhedral approximation. Principal curvatures and principal directions are obtained by computing in closed form the eigenvalues and eigenvectors of certain 3×3 symmetric matrices defined by integral formulas, and closely related to the matrix representation of the tensor of curvature. The resulting algorithm is linear, both in time and in space, as a function of the number of vertices and faces of the polyhedral surface.
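
A minimal sketch of the construction at one vertex, assuming the vertex normal and the one-ring neighbor positions are given, and using uniform weights as a simplification of the paper's area-based weights. The directional-curvature estimate and the recovery of principal curvatures from the two tangent-plane eigenvalues follow the relations given in the paper; the function name is illustrative.

```python
import numpy as np

def principal_curvatures(v, normal, neighbor_pts):
    """Estimate (k1, k2) at vertex v from its one-ring, Taubin-style."""
    k = len(neighbor_pts)
    w = np.full(k, 1.0 / k)                 # simplified uniform weights
    P = np.eye(3) - np.outer(normal, normal)  # tangent-plane projector
    M = np.zeros((3, 3))
    for wi, vi in zip(w, neighbor_pts):
        d = vi - v
        t = P @ (-d)                        # tangent direction toward v
        tn = np.linalg.norm(t)
        if tn < 1e-12:
            continue                        # neighbor along the normal
        t /= tn
        kappa = 2.0 * normal.dot(d) / d.dot(d)  # normal curvature estimate
        M += wi * kappa * np.outer(t, t)
    evals, evecs = np.linalg.eigh(M)
    # Discard the eigenpair aligned with the normal (eigenvalue ~ 0),
    # keep the two tangent-plane eigenvalues m1, m2.
    align = np.abs(evecs.T @ normal)
    m1, m2 = evals[np.argsort(align)[:2]]
    return 3 * m1 - m2, 3 * m2 - m1
```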


International Conference on Computer Vision | 1995

Curve and surface smoothing without shrinkage

Gabriel Taubin

For a number of computational purposes, including visualization of scientific data and registration of multimodal medical data, smooth curves must be approximated by polygonal curves, and surfaces by polyhedral surfaces. An inherent problem of these approximation algorithms is that the resulting curves and surfaces appear faceted. Boundary-following and iso-surface construction algorithms are typical examples. To reduce the apparent faceting, smoothing methods are used. In this paper, we introduce a new method for smoothing piecewise linear shapes of arbitrary dimension and topology. This new method is in fact a linear low-pass filter that removes high-curvature variations, and does not produce shrinkage. Its computational complexity is linear in the number of edges or faces of the shape, and the required storage is linear in the number of vertices.
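
Specialized to a closed polygonal curve, the non-shrinking filter is a few lines. A minimal sketch, following the same two-step pattern as the surface sketch earlier; the lam/mu values are illustrative choices with 0 < lam < -mu, not the paper's.

```python
import numpy as np

def smooth_closed_curve(pts, lam=0.5, mu=-0.53, iterations=50):
    """Two-step non-shrinking smoothing of an (n, 2) closed polygon."""
    p = pts.astype(float).copy()
    for _ in range(iterations):
        for factor in (lam, mu):
            # Discrete Laplacian on a closed curve: neighbor average minus point.
            delta = 0.5 * (np.roll(p, 1, axis=0) + np.roll(p, -1, axis=0)) - p
            p += factor * delta
    return p
```

With mu = 0 this degenerates to plain Laplacian smoothing, which visibly shrinks the curve toward its centroid; the negative second step is what cancels that drift.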


International Conference on Computer Graphics and Interactive Techniques | 1998

Progressive forest split compression

Gabriel Taubin; André Guéziec; William P. Horn; Francis Lazarus

In this paper we introduce the Progressive Forest Split (PFS) representation, a new adaptive refinement scheme for storing and transmitting manifold triangular meshes in progressive and highly compressed form. As in the Progressive Mesh (PM) method of Hoppe, a triangular mesh is represented as a low resolution polygonal model followed by a sequence of refinement operations, each one specifying how to add triangles and vertices to the previous level of detail to obtain a new level. The PFS format shares with PM and other refinement schemes the ability to smoothly interpolate between consecutive levels of detail. However, it achieves much higher compression ratios than PM by using a more complex refinement operation which can, at the expense of reduced granularity, be encoded more efficiently. A forest split operation doubling the number n of triangles of a mesh requires a maximum of approximately 3.5n bits to represent the connectivity changes, as opposed to approximately (5 + log2(n))n bits in PM. We describe algorithms to efficiently encode and decode the PFS format. We also show how any surface simplification algorithm based on edge collapses can be modified to convert single resolution triangular meshes to the PFS format. The modifications are simple and only require two additional topological tests on each candidate edge collapse. We show results obtained by applying these modifications to the Variable Tolerance method of Guéziec.
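
A small worked comparison of the two connectivity costs quoted above, roughly 3.5 bits per triangle for a forest split versus (5 + log2(n)) bits per triangle for PM-style refinements, at a few mesh sizes:

```python
import math

for n in (1_000, 100_000, 1_000_000):
    pfs = 3.5 * n                     # PFS connectivity bits (approximate)
    pm = (5 + math.log2(n)) * n       # PM connectivity bits (approximate)
    print(f"n = {n:>9,}: PFS ~ {pfs / 8 / 1024:8.1f} KiB, "
          f"PM ~ {pm / 8 / 1024:8.1f} KiB, ratio ~ {pm / pfs:.1f}x")
```

The gap widens with n because the PM cost grows with log2(n) per triangle while the PFS cost per triangle stays constant.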


Proceedings of the IEEE | 1998

Geometry coding and VRML

Gabriel Taubin; William P. Horn; Francis Lazarus; Jarek Rossignac

The virtual-reality modeling language (VRML) is rapidly becoming the standard file format for transmitting three-dimensional (3-D) virtual worlds across the Internet. Static and dynamic descriptions of 3-D objects, multimedia content, and a variety of hyperlinks can be represented in VRML files. Both VRML browsers and authoring tools for the creation of VRML files are widely available for several different platforms. In this paper, we describe the topologically assisted geometric compression technology included in our proposal for the VRML compressed binary format. This technology produces significant reduction of file sizes and, subsequently, of the time required for transmission of such files across the Internet. Compression ratios of 50:1 or more are achieved for large models. The proposal also includes a binary encoding to create compact, rapidly parsable binary VRML files. The proposal is currently being evaluated by the Compressed Binary Format Working Group of the VRML consortium as a possible extension of the VRML standard. In the topologically assisted compression scheme, a polyhedron is represented using two interlocking trees: a spanning tree of vertices and a spanning tree of triangles. The connectivity information represented in other compact schemes, such as triangular strips and generalized triangular meshes, can be directly derived from this representation. Connectivity information for large models is compressed with storage requirements approaching one bit per triangle. A variable-length, optionally lossy compression technique is used for vertex positions, normals, colors, and texture coordinates. The format supports all VRML property binding conventions.
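
A minimal sketch of the lossy quantization step for vertex positions, assuming an (n, 3) position array and a fixed number of bits per coordinate; the variable-length coding applied to the quantized values afterwards is omitted, and the function names are illustrative.

```python
import numpy as np

def quantize(positions, bits=10):
    """Uniformly quantize (n, 3) positions to `bits` bits per coordinate."""
    lo = positions.min(axis=0)
    hi = positions.max(axis=0)
    scale = (2**bits - 1) / np.where(hi > lo, hi - lo, 1.0)
    q = np.round((positions - lo) * scale).astype(np.int32)
    return q, lo, scale          # lo and scale are needed to dequantize

def dequantize(q, lo, scale):
    """Recover approximate positions; error is bounded by the cell size."""
    return q / scale + lo
```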


European Conference on Computer Vision | 1996

Optimal Surface Smoothing as Filter Design

Gabriel Taubin; Tong Zhang; Gene H. Golub

Smooth surfaces are approximated by polyhedral surfaces for a number of computational purposes. An inherent problem of these approximation algorithms is that the resulting polyhedral surfaces appear faceted. Within a recently introduced signal processing approach to solving this problem [7, 8], surface smoothing corresponds to low-pass filtering. In this paper we look at the filter design problem in more detail. We analyze the stability properties of the low-pass filter described in [7, 8], and show how to minimize its running time. We show that most classical techniques used to design finite impulse response (FIR) digital filters can also be used to design significantly faster surface smoothing filters. Finally, we describe an algorithm to estimate the power spectrum of a signal, and use it to evaluate the performance of the different filter design techniques described in the paper.
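
A minimal sketch of evaluating a polynomial (FIR) transfer function f(K) on a mesh signal without assembling K as a matrix, where K = I - W is the operator used in the smoothing papers cited above and W averages over the one-ring. The coefficient vector is the caller's choice; the paper's point is that classical FIR design methods supply good ones. Function names are illustrative.

```python
import numpy as np

def K(x, neighbors):
    """Apply K = I - W: each vertex minus the average of its neighbors."""
    return np.array([x[i] - x[nbrs].mean(axis=0)
                     for i, nbrs in enumerate(neighbors)])

def apply_fir(x, neighbors, coeffs):
    """Evaluate f(K) x = sum_i coeffs[i] * K^i x, one operator apply per term."""
    y = coeffs[0] * x
    kx = x
    for c in coeffs[1:]:
        kx = K(kx, neighbors)
        y = y + c * kx
    return y

# e.g. one lambda/mu pass, f(k) = (1 - lam*k)(1 - mu*k):
#   apply_fir(x, neighbors, [1.0, -(lam + mu), lam * mu])
```

Higher-degree coefficient vectors give sharper filters, trading more operator applications per pass for fewer passes overall, which is exactly the running-time question the paper analyzes.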


IEEE Transactions on Pattern Analysis and Machine Intelligence | 1994

Parameterized families of polynomials for bounded algebraic curve and surface fitting

Gabriel Taubin; Fernando Cukierman; Steven Sullivan; Jean Ponce; David J. Kriegman

Interest in algebraic curves and surfaces of high degree as geometric models or shape descriptors for different model-based computer vision tasks has increased in recent years, and although their properties make them a natural choice for object recognition and positioning applications, algebraic curve and surface fitting algorithms often suffer from instability problems. One of the main reasons for these problems is that, while the data sets are always bounded, the resulting algebraic curves or surfaces are, in most cases, unbounded. In this paper, the authors propose to constrain the polynomials to a family with bounded zero sets, and use only members of this family in the fitting process. For every even number d the authors introduce a new parameterized family of polynomials of degree d whose level sets are always bounded, in particular, its zero sets. This family has the same number of degrees of freedom as a general polynomial of the same degree. Three methods for fitting members of this polynomial family to measured data points are introduced. Experimental results of fitting curves to sets of points in R² and surfaces to sets of points in R³ are presented.
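
For contrast with the bounded families introduced in the paper, here is a minimal sketch of the plain, unconstrained algebraic least-squares fit for a conic in R², minimizing the sum of squared polynomial values over unit-norm coefficient vectors. The paper's bounded parameterization is omitted; the function name is illustrative.

```python
import numpy as np

def fit_conic(points):
    """Fit f(x, y) = a x^2 + b xy + c y^2 + d x + e y + f to (n, 2) points,
    minimizing sum f(p)^2 subject to ||coeffs|| = 1 (smallest singular vector)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(A)
    return vt[-1]   # coefficient vector of the best-fit conic, up to sign
```

This baseline happily returns hyperbolas or other unbounded zero sets for bounded data, which is precisely the instability the paper's bounded polynomial families are designed to rule out.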


Eurographics Symposium on Rendering Techniques | 1997

Applying Shape from Lighting Variation to Bump Map Capture

Holly E. Rushmeier; Gabriel Taubin; André Guéziec

We describe a system for capturing bump maps from a series of images of an object taken from the same viewpoint, but with varying, known illumination. Using the illumination information we can reconstruct the surface normals for a variety of, but not all, surface finishes and geometries. The system allows an existing object to be rerendered with new lighting and surface finish without explicitly reconstructing the object geometry.
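
A minimal sketch of the normal-recovery step under simplifying assumptions the paper goes beyond: a Lambertian surface, k >= 3 images under known unit light directions, and no handling of shadowed or specular pixels. The function name is illustrative.

```python
import numpy as np

def normals_from_lighting(I, L):
    """Per-pixel normals and albedo from photometric variation.

    I: (k, h, w) stack of grayscale images, same viewpoint.
    L: (k, 3) unit light directions, one per image.
    """
    k, h, w = I.shape
    b = I.reshape(k, -1)                        # one intensity column per pixel
    g, *_ = np.linalg.lstsq(L, b, rcond=None)   # solve L g = I in least squares
    g = g.T.reshape(h, w, 3)                    # g = albedo * normal
    albedo = np.linalg.norm(g, axis=2)
    n = g / np.maximum(albedo, 1e-12)[..., None]
    return n, albedo
```

The recovered normal map can be stored directly as a bump map and relit with any new light direction, without ever integrating the normals into a height field.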
