
Publication


Featured research published by Guillaume Gilet.


Eurographics | 2005

Point-based rendering of trees

Guillaume Gilet; Alexandre Meyer; Fabrice Neyret

The goal of this paper is the interactive and realistic rendering of 3D trees covering a landscape. The landscape is composed by instantiating one or more blocks of vegetation on the terrain, each block consisting of a single tree or a compact group of trees. For these blocks of vegetation, we propose a new representation based on triangle+point primitives organized into a regular spatial structure (a grid). This structure is defined so as to easily adapt the level of detail (LOD) of each subpart (cell) of the vegetation element. During the rendering process, we determine a global level of detail for each block of vegetation. Then, we refine it for each cell according to the following heuristic: leaves or branches at the rear of a tree or inside the forest are statistically less visible than front ones and can therefore be rendered more coarsely. As a result, our method greatly decreases the number of rendered primitives while preserving realism. This allows large landscapes to be rendered at interactive rates, for camera positions ranging from far away to inside the forest.
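The per-cell refinement heuristic lends itself to a short illustration. Below is a minimal Python sketch, assuming a hypothetical `Cell` record and ad-hoc constants; the paper's actual LOD selection is tied to its triangle+point budgets, which are not reproduced here.

```python
import math
from dataclasses import dataclass

@dataclass
class Cell:
    center: tuple           # cell centre in world space
    depth_in_canopy: float  # 0 = front of the tree/forest, 1 = deep inside

def select_lod(cell, camera_pos, base_levels=6):
    """Pick a level of detail for one grid cell.

    Global term: distance from the camera to the block.
    Local term: cells at the rear or inside the canopy are statistically
    less visible, so they get pushed toward coarser levels (the paper's
    heuristic). Constants here are illustrative only.
    """
    dist = math.dist(cell.center, camera_pos)
    global_lod = min(base_levels - 1, int(math.log2(1.0 + dist / 10.0)))
    local_penalty = int(cell.depth_in_canopy * 2)  # coarsen hidden cells
    return min(base_levels - 1, global_lod + local_penalty)

# Example: a front cell vs. a deep cell at the same distance.
front = Cell(center=(0, 0, 50), depth_in_canopy=0.0)
deep  = Cell(center=(0, 0, 50), depth_in_canopy=1.0)
print(select_lod(front, (0, 0, 0)), select_lod(deep, (0, 0, 0)))
```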


International Conference on Computer Graphics and Interactive Techniques | 2014

Local random-phase noise for procedural texturing

Guillaume Gilet; Basile Sauvage; Kenneth Vanhoey; Jean-Michel Dischler; Djamchid Ghazanfarpour

Local random-phase noise is a noise model for procedural texturing. It is defined on a regular spatial grid by local noises, which are sums of cosines with random phases. Our model is versatile thanks to separate sampling in the spatial and spectral domains; it therefore encompasses both Gabor noise and noise by Fourier series. A stratified spectral sampling allows for a faithful yet compact and efficient reproduction of an arbitrary power spectrum, so noise by example is obtained faster than with state-of-the-art techniques. As a second contribution, we address texture by example and generate not only Gaussian patterns but also the structured features present in the input. This is achieved by fixing the phase on some part of the spectrum. Generated textures are continuous and non-repetitive. Results show unprecedented framerates and a flexible visual result: users can control, with a single parameter, the blending between noise by example and structured texture synthesis.
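The core of the model, local noises as sums of cosines with random phases, can be sketched directly. The Python fragment below evaluates one such local noise for given sampled frequencies; the stratified spectral sampling and the grid blending of the full model are omitted, and the toy spectrum at the end is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_random_phase_noise(x, y, freqs, amps, phases):
    """One 'local noise': a sum of 2D cosines with random phases.

    freqs  : (N, 2) sampled frequencies (the paper samples them to match
             a target power spectrum; here they are given directly)
    amps   : (N,) amplitudes
    phases : (N,) random phases in [0, 2*pi)
    """
    out = np.zeros(np.broadcast(x, y).shape)
    for (fx, fy), a, p in zip(freqs, amps, phases):
        out += a * np.cos(2 * np.pi * (fx * x + fy * y) + p)
    return out

# A toy spectrum: a few frequencies clustered around one band.
N = 16
freqs = rng.normal(loc=4.0, scale=0.5, size=(N, 2))
amps = np.full(N, 1.0 / N)
phases = rng.uniform(0, 2 * np.pi, size=N)

xs, ys = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
img = local_random_phase_noise(xs, ys, freqs, amps, phases)
```

Fixing some of the `phases` instead of drawing them at random is what recovers structured features in the texture-by-example part of the paper.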


Eurographics | 2010

An image-based approach for stochastic volumetric and procedural details

Guillaume Gilet; Jean-Michel Dischler

Noisy volumetric details like clouds, grounds, plaster, bark, roughcast, etc. are frequently encountered in nature and bring an important contribution to the realism of outdoor scenes. We introduce a new interactive approach, easing the creation of procedural representations of "stochastic" volumetric details by using a single example photograph. Instead of attempting to reconstruct an accurate geometric representation from the photograph, we use a stochastic multi-scale approach that fits the parameters of a multi-layered noise-based 3D deformation model, using a multi-resolution filter-bank error metric. Once computed, visually similar details can be applied to arbitrary objects with a high degree of visual realism, since lighting and parallax effects are naturally taken into account. Our approach is inspired by image-based techniques. In practice, the user supplies a photograph of an object covered by noisy details, provides a corresponding coarse approximation of the shape of this object as well as an estimated lighting condition (generally a light source direction). Our system then determines the corresponding noise-based representation as well as diffuse, ambient, specular and semi-transparency reflectance parameters. The resulting details are fully procedural and, as such, have the advantage of extreme compactness, while they can be extended infinitely without repetition in order to cover huge surfaces.
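The fitting objective can be illustrated with a simplified stand-in. The sketch below compares two images through mean/variance statistics of Gaussian-pyramid band-pass responses; this is a generic multi-resolution filter bank, not the paper's exact metric, and `filter_bank_stats` and `fit_error` are hypothetical names.

```python
import numpy as np
from scipy import ndimage

def filter_bank_stats(img, n_levels=4):
    """Mean/variance of band-pass responses over a Gaussian pyramid.

    A stand-in for the paper's multi-resolution filter banks: the real
    metric is richer, but the idea is the same -- compare images through
    band statistics instead of pixel-wise differences.
    """
    stats, current = [], img.astype(float)
    for _ in range(n_levels):
        blurred = ndimage.gaussian_filter(current, sigma=1.0)
        band = current - blurred             # band-pass residual
        stats.extend([band.mean(), band.var()])
        current = blurred[::2, ::2]          # next (coarser) level
    return np.array(stats)

def fit_error(example_photo, rendered_details):
    """L2 distance between band statistics: the quantity a fitting loop
    would minimise over the deformation-model parameters."""
    return np.linalg.norm(filter_bank_stats(example_photo)
                          - filter_bank_stats(rendered_details))
```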


Computers & Graphics | 2015

A multiscale model for rain rendering in real-time

Yoann Weber; Vincent Jolivet; Guillaume Gilet; Djamchid Ghazanfarpour

This paper presents a coherent multiscale model for real-time rain rendering which takes into account local and global properties of rainy scenes. Our aim is to simulate visible rain streaks close to the camera as well as the progressive loss of visibility induced by atmospheric phenomena. Our model proposes to correlate the attenuation of visibility, which is due in part to the extinction phenomenon, with the distribution of raindrops in terms of rainfall intensity and the camera's parameters. Furthermore, this method proposes an original rain-streak generation based on spectral analysis and sparse convolution theory. This allows accurate control of rainfall intensity and streak appearance, improving the global realism of rainy scenes. Highlights: natural phenomena; rain; physically based; real-time rendering; attenuation; extinction.
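The visibility-attenuation side of the model follows the familiar extinction (Beer-Lambert) behaviour, with the extinction coefficient tied to rainfall intensity. A minimal sketch, assuming an illustrative power law and placeholder constants rather than the paper's calibrated values:

```python
import math

def extinction_coefficient(rain_rate_mm_h, a=0.01, b=0.6):
    """Illustrative power law tying extinction to rainfall intensity.

    The paper correlates the loss of visibility with the raindrop
    distribution; the constants a and b here are placeholders, not
    the paper's values.
    """
    return a * rain_rate_mm_h ** b

def attenuated_radiance(radiance, distance_m, rain_rate_mm_h,
                        airlight=0.7):
    """Beer-Lambert attenuation plus in-scattered airlight."""
    beta = extinction_coefficient(rain_rate_mm_h)       # per km
    t = math.exp(-beta * distance_m / 1000.0)           # transmittance
    return radiance * t + airlight * (1.0 - t)

# Distant geometry fades toward the airlight colour as rain intensifies.
for rate in (1, 10, 50):
    print(rate, attenuated_radiance(1.0, 2000, rate))
```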


Computer Graphics Forum | 2012

Multi-scale Assemblage for Procedural Texturing

Guillaume Gilet; Jean-Michel Dischler; Djamchid Ghazanfarpour

A procedural pattern generation process, called multi-scale "assemblage", is introduced. An assemblage is defined as a multi-scale composition of "multi-variate" statistical figures, which can be kernel functions for defining noise-like texture basis functions, or patterns for defining structured procedural textures. This paper presents two main contributions: 1) a new procedural random point distribution function that, unlike point jittering, allows us to take into account some spatial dependencies among figures, and 2) a "multi-variate" approach that, instead of defining finite sets of constant figures, allows us to generate nearly infinite variations of figures on-the-fly. For both, we use a "statistical shape model", which is a representation of shape variations. Thanks to a direct GPU implementation, assemblage textures can be used to generate new classes of procedural textures for real-time rendering while preserving all the characteristics of usual procedural textures, namely: infinity, definition independence (provided the figures are also definition independent) and extreme compactness.
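A statistical shape model expresses each figure as a mean shape plus a weighted sum of deformation modes, which is what allows nearly infinite on-the-fly variations. A minimal Python sketch, with a toy one-mode square standing in for the model (the modes and spreads here are assumptions, not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_figure(mean_shape, modes, stddevs):
    """Draw one figure variation from a statistical shape model.

    shape = mean + sum_i b_i * mode_i, with b_i ~ N(0, stddev_i^2).
    mean_shape : (P, 2) control points of the average figure
    modes      : (K, P, 2) principal deformation modes
    stddevs    : (K,) spread of each mode
    """
    b = rng.normal(0.0, stddevs)
    return mean_shape + np.tensordot(b, modes, axes=1)

# Toy model: a square whose corners stretch along a single mode.
mean_shape = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
modes = np.array([[[-1, 0], [1, 0], [1, 0], [-1, 0]]], float)  # widen/narrow
print(sample_figure(mean_shape, modes, stddevs=np.array([0.1])))
```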


Eurographics | 2016

A phenomenological model for throughfall rendering in real-time

Yoann Weber; Vincent Jolivet; Guillaume Gilet; Kazuki Nanko; Djamchid Ghazanfarpour

This paper aims at rendering, in real time, the visual effects arising from the complex interactions between trees and rain, in order to increase the realism of natural rainy scenes. Such a complex phenomenon involves a great number of physical processes influenced by various interlinked factors, and rendering it is a significant challenge in Computer Graphics. We approach this problem by introducing an original method to render drops dripping from leaves after the interception of raindrops by foliage. Our method introduces a new hydrological model representing interactions between rain and foliage through a phenomenological approach. Our model reduces the complexity of the phenomenon by representing multiple dripping drops with a new fully functional form evaluated per pixel on-the-fly, providing improved control over density and physical properties. Furthermore, an efficient real-time rendering scheme, taking full advantage of the latest GPU hardware capabilities, allows the rendering of a large number of dripping drops even for complex scenes.
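As a rough intuition for the phenomenological approach, interception models in hydrology let canopy storage fill with intercepted rain and release the excess as drip. The toy step function below is a hand-made stand-in to convey that idea only; it is not the paper's model, and all constants are invented:

```python
def drip_step(storage, rain_input, capacity, k=0.5, dt=1.0):
    """One step of a toy interception/drip model.

    Canopy storage fills with intercepted rain; once it exceeds its
    capacity, the excess drips at a rate controlled by k. This is a
    phenomenological stand-in, not the paper's hydrological model.
    """
    storage = storage + rain_input * dt
    excess = max(0.0, storage - capacity)
    drip = k * excess * dt
    return storage - drip, drip

storage = 0.0
for step in range(5):
    storage, drip = drip_step(storage, rain_input=0.8, capacity=2.0)
    print(f"step {step}: storage={storage:.2f}, drip={drip:.2f}")
```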


Interactive 3D Graphics and Games | 2010

Procedural texture particles

Guillaume Gilet; Jean-Michel Dischler

We introduce procedural texture particles, a new texture model midway between procedural textures and example-based texture synthesis. As in example-based texture synthesis, we use an input example to produce similar-looking textures. But instead of creating texture images (pixel arrays), our textures are defined in the form of procedural distributions of interchangeable visual elements called particles. As with classical example-based synthesis, our method guarantees a certain visual resemblance with the example, but the resulting textures are compact and defined on the entire infinite 2D plane.
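The "procedural distribution" part can be sketched with a deterministic per-cell generator: because each grid cell reproduces the same particles on demand, the texture covers the infinite plane without storing any pixels. All names and parameters below are illustrative, not the paper's API:

```python
import math, random

def cell_particles(ix, iy, elements, per_cell=3, seed=42):
    """Deterministically place particles in grid cell (ix, iy).

    The same cell always yields the same particles, so the texture is
    defined over the whole infinite plane. 'elements' stands in for the
    interchangeable visual elements extracted from the input example.
    """
    rng = random.Random(hash((ix, iy, seed)))
    return [(ix + rng.random(), iy + rng.random(),
             rng.choice(elements)) for _ in range(per_cell)]

def gather(x, y, elements, radius=1):
    """Particles whose cells are near (x, y): the evaluation footprint."""
    cx, cy = math.floor(x), math.floor(y)
    out = []
    for ix in range(cx - radius, cx + radius + 1):
        for iy in range(cy - radius, cy + radius + 1):
            out.extend(cell_particles(ix, iy, elements))
    return out

print(gather(12.3, -4.7, elements=["leaf", "blob", "stroke"])[:3])
```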


CGVC '16 Proceedings of the Conference on Computer Graphics & Visual Computing | 2016

Volumetric spot noise for procedural 3D shell texture synthesis

Nicolas Pavie; Guillaume Gilet; Jean-Michel Dischler; Eric Galin; Djamchid Ghazanfarpour

In this paper, we present an extension of the Locally Controlled Spot Noise and a visualization pipeline for volumetric fuzzy details synthesis. We extend the noise model to author volumetric fuzzy details using filtered 3D quadratic kernel functions convolved with a projective non-uniform 2D distribution of impulses. We propose a new method based on order independent splatting to compute a fast view dependent approximation of shell noise at interactive rates. Our method outperforms ray marching techniques and avoids aliasing artifacts, thus improving interactive content authoring feedback. Moreover, generated surface details share the same properties as procedural noise: they extend on potentially infinite surfaces, are defined in an extremely compact way, are non-repetitive, continuous (no discrete voxel-artifacts when zooming) and independent of the definition of the underlying surface (no surface parameterization is required).
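Spot noise in general is a sum of kernels centred at random impulses, and the quadratic-kernel flavour can be sketched compactly. The version below uses a uniform 3D impulse distribution for simplicity, whereas the paper filters the kernels and uses a projective non-uniform 2D distribution:

```python
import numpy as np

rng = np.random.default_rng(2)

def quadratic_kernel(r2, radius=1.0):
    """Compactly supported quadratic falloff (1 - (r/R)^2)^2 inside R."""
    t = np.clip(1.0 - r2 / radius**2, 0.0, None)
    return t * t

def spot_noise_3d(p, impulses, weights, radius=1.0):
    """Sum of kernels centred at random impulses: spot noise at point p."""
    d = impulses - p                        # (N, 3) offsets
    r2 = np.einsum('ij,ij->i', d, d)        # squared distances
    return float(np.dot(weights, quadratic_kernel(r2, radius)))

# A toy volume: impulses scattered in a unit cube with signed weights.
N = 200
impulses = rng.uniform(0, 1, size=(N, 3))
weights = rng.uniform(-1, 1, size=N)
print(spot_noise_3d(np.array([0.5, 0.5, 0.5]), impulses, weights, 0.3))
```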


Computer Graphics Forum | 2009

A Framework for Interactive Hypertexture Modelling

Guillaume Gilet; Jean-Michel Dischler

Hypertexturing can be a powerful way of adding rich geometric details to surfaces at low memory cost by using a procedural three-dimensional (3D) space distortion. However, this special kind of texturing technique still raises a major problem: the efficient control of the visual result. In this paper, we introduce a framework for interactive hypertexture modelling. This framework is based on two contributions. First, we propose a reformulation of the density modulation function. Our density modulation is based on the notion of a shape transfer function. This function, which can be easily edited by users, allows us to control in an intuitive way the visual appearance of the geometric details resulting from the space distortion. Second, we propose to use a hybrid surface- and volume-point-based representation in order to be able to dynamically hypertexture arbitrary objects at interactive frame rates. The rendering consists of a combined splat- and raycasting-based direct volume rendering technique. The splats are used to model the volumetric object while raycasting allows us to add the details. A user study shows that our approach improves the design of hypertextures while preserving their procedural nature.
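The reformulated density modulation can be pictured as: distort the base object density with noise, then reshape the result through a user-editable transfer curve. A minimal sketch, with a stand-in noise and a fixed smoothstep ramp in place of an interactively edited transfer function:

```python
import math

def noise(p):
    """Cheap stand-in for a procedural 3D noise (not Perlin noise)."""
    x, y, z = p
    return 0.5 * (math.sin(12.9898 * x + 78.233 * y + 37.719 * z) + 1.0)

def shape_transfer(d):
    """User-editable curve mapping raw density to final density.

    In the paper this function is edited interactively; here it is a
    fixed smoothstep-like ramp for illustration.
    """
    d = min(max(d, 0.0), 1.0)
    return d * d * (3.0 - 2.0 * d)

def hypertexture_density(p, object_density, amplitude=0.4):
    """Base object density distorted by noise, then reshaped."""
    return shape_transfer(object_density(p) + amplitude * (noise(p) - 0.5))

# Example: a soft sphere of radius 1 around the origin.
sphere = lambda p: max(0.0, 1.0 - math.dist(p, (0.0, 0.0, 0.0)))
print(hypertexture_density((0.2, 0.1, 0.3), sphere))
```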


Eurographics | 2010

Procedural Descriptions of Anisotropic Noisy Textures by Example

Guillaume Gilet; Jean-Michel Dischler; Luc Soler

Collaboration


Dive into Guillaume Gilet's collaboration.

Top Co-Authors
Basile Sauvage

University of Strasbourg
