Publication


Featured research published by Christian Jacquemin.


International Conference on Human-Computer Interaction | 2009

Follow My Finger Navigation

Rami Ajaj; Frédéric Vernier; Christian Jacquemin

This paper presents a novel interaction technique called Follow my Finger (FmF) for navigation in 3D virtual environments using a 2D interactive view on a table-top device. FmF consists of moving a camera icon that represents the 2D subjective position and orientation of a viewpoint in the 3D world. Planar, tactile, and direct manipulation of the camera icon facilitates navigation in the 3D environment. From the user's perspective, the camera icon follows her/his finger trajectory to interactively modify the horizontal location, inclination, and orientation of the 3D point of view.
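The 2D-to-3D mapping FmF relies on can be sketched as follows, purely as an illustration: the table plane is assumed to coincide with the 3D world's ground plane, and the icon's in-plane rotation becomes the camera yaw. The function name, the fixed eye height, and the coordinate convention are assumptions for this sketch, not details from the paper.

```python
import math

def icon_to_camera_pose(icon_x, icon_y, icon_angle_deg, eye_height=1.7):
    """Map a 2D camera-icon pose on the table-top to a 3D viewpoint.

    The table plane (x, y) is taken as the world ground plane (x, z);
    the icon's rotation becomes the camera yaw. eye_height is an assumed
    constant vertical offset for the viewpoint.
    """
    yaw = math.radians(icon_angle_deg)
    position = (icon_x, eye_height, icon_y)        # world x, up, world z
    forward = (math.sin(yaw), 0.0, math.cos(yaw))  # horizontal view direction
    return position, forward
```

Dragging the icon then updates `position` continuously, and rotating it updates `forward`, which is the direct-manipulation loop the paper describes.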


Tests and Proofs | 2013

Cueing multimedia search with audiovisual blur

Tifanie Bouchara; Christian Jacquemin; Brian F. G. Katz

Situated in the context of multimedia browsing, this study concerns the perceptual processes involved in searching for an audiovisual object displayed among several distractors. The aim of the study is to increase the perceptual saliency of the target in order to enhance the search process. As blurring the distractors while keeping the target sharp has proved to greatly facilitate visual search, we propose combining visual blur with an audio blur analogue to improve multimodal search. Three perceptual experiments were performed in which participants had to retrieve an audiovisual object from a set of six competing stimuli. The first two experiments explored the effect of blur level on unimodal search tasks. A third experiment investigated the influence of combining the audio and visual modalities, with both modalities cued, on an audiovisual search task. Results showed that both visual and audio blurs rendered distractor stimuli less prominent and thus helped users focus on a sharp target more easily. Performance was also faster and more accurate in the bimodal condition than in either unimodal search task, auditory or visual. Our work suggests that audio and audiovisual interfaces dedicated to multimedia search could benefit from different uses of blur in their presentation strategies.
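As an illustration of the blur analogue, here is a minimal sketch in which visual blur is a 1-D box filter and its audio counterpart is a one-pole low-pass filter that attenuates high-frequency detail. Both function names and parameter values are hypothetical; the paper's actual stimuli and signal processing are not specified here.

```python
def box_blur(pixels, radius):
    """1-D box blur: each sample becomes the mean of its neighbourhood.
    Applied to distractor images, it lowers their visual salience."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        out.append(sum(pixels[lo:hi]) / (hi - lo))
    return out

def audio_blur(samples, alpha=0.2):
    """One-pole low-pass filter as an audio analogue of visual blur:
    high frequencies are attenuated, making a distractor sound 'soft'.
    alpha in (0, 1]; smaller alpha means stronger blur."""
    out, y = [], 0.0
    for x in samples:
        y = y + alpha * (x - y)  # exponential smoothing step
        out.append(y)
    return out
```

Cueing then amounts to leaving the target unprocessed while routing every distractor through the two filters above.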


European Conference on Computer Vision | 2014

Photometric Compensation to Dynamic Surfaces in a Projector-Camera System

Panagiotis-Alexandros Bokaris; Michèle Gouiffès; Christian Jacquemin; Jean Marc Chomaz

In this paper, a novel approach that allows color-compensated projection on an arbitrary surface is presented. Assuming that the geometry of the surface is known, this method can be used in dynamic environments where the surface color is not static. A simple calibration process is performed offline, and a single input image under reference illumination is sufficient for estimating the compensation. The system can recover the reflectance of the surface pixel-wise and provide an accurate photometric compensation that minimizes the visibility of the projection surface. The color matching between the desired appearance of the projected image and the projection on the surface is performed in the device-independent color space CIE 1931 XYZ. The results of the evaluation confirm that this method provides a robust and accurate compensation even for surfaces with saturated colors and high-spatial-frequency patterns. This promising method can be the cornerstone of a real-time projector-camera system for dynamic scenes.
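Setting aside the paper's full device characterization and colorimetric pipeline, the core idea of dividing the desired appearance by the recovered surface reflectance can be sketched as follows. The function name and the channel-independent simplification are assumptions of this sketch; the actual method performs the matching in CIE 1931 XYZ with a characterized projector and camera.

```python
def compensate(desired, reflectance, max_level=1.0):
    """Per-pixel photometric compensation sketch (channel-independent).

    desired and reflectance are lists of (X, Y, Z) triples in a linear
    colour space; reflectance is the surface response recovered from a
    single input image under reference illumination. The projected value
    is the desired appearance divided by the surface reflectance, clipped
    to the projector's output range.
    """
    out = []
    for want, refl in zip(desired, reflectance):
        out.append(tuple(
            min(max_level, w / r) if r > 1e-6 else max_level
            for w, r in zip(want, refl)
        ))
    return out
```

The clipping step is where compensation fails visibly in practice: a strongly saturated surface patch may demand more light than the projector can emit, which is why robustness on saturated colors is a selling point of the paper.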


International Journal of Creative Interfaces and Computer Graphics | 2010

Organ Augmented Reality: Audio-Graphical Augmentation of a Classical Instrument

Christian Jacquemin; Rami Ajaj; Sylvain Le Beux; Christophe d'Alessandro; Markus Noisternig; Brian F. G. Katz; Bertrand Planes

This paper discusses the Organ Augmented Reality (ORA) project, which considers an audio and visual augmentation of a historical church organ to enhance the understanding and perception of the instrument through intuitive and familiar mappings and outputs. ORA has been presented to public audiences at two immersive concerts. The visual part of the installation was based on a spectral analysis of the music: the visuals were projections of LED-bar VU-meters onto the organ pipes. The audio part was an immersive periphonic sound field, created from the live capture of the organ sounds, so that listeners had the impression of being inside the augmented instrument. The graphical architecture of the installation is based on acoustic analysis, mapping from sound levels to synchronous graphics through visual calibration, and real-time multi-layer graphical composition and animation. The ORA project is a new approach to musical instrument augmentation that combines enhanced instrument legibility with enhanced artistic content.


International Conference on Image Processing | 2015

One-frame delay for dynamic photometric compensation in a projector-camera system

Panagiotis-Alexandros Bokaris; Michèle Gouiffès; Christian Jacquemin; Jean-Marc Chomaz; Alain Trémeau

One of the main challenges in a projector-camera system is to project on an arbitrary surface and compensate for color changes due to the color of the projection surface. This task becomes dramatically more difficult when the projection surface changes dynamically. Even a slight displacement of the surface breaks the photometric compensation, and visible artifacts may occur. An approach that can compensate for a dynamic scene is of great importance, since it expands the application range of such systems. In this paper, a method that requires only a single frame to adapt its photometric compensation to the new surface is demonstrated. It is compared with another state-of-the-art method that claims one-frame-delay compensation. In addition, the complex characterization of a DLP projector is addressed. The results show that not only does our approach outperform the pre-existing method, but it is also robust against challenging surfaces with sharp and saturated color patches, such as those found in a real environment.
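The one-frame adaptation can be illustrated with a deliberately simplified sketch: since the frame being projected is known, a single captured frame suffices to re-estimate the per-pixel surface reflectance, and the next frame's compensation uses the fresh estimate. This ignores projector/camera characterization and geometric alignment, both of which the paper handles; the function name and scalar-intensity model are assumptions.

```python
def update_reflectance(captured, projected, eps=1e-6):
    """One-frame reflectance update sketch.

    Because the projected frame is known, one captured frame is enough
    to re-estimate the surface reflectance per pixel as captured /
    projected (in linear intensity). Feeding the new estimate into the
    next frame's compensation gives a one-frame delay when the surface
    moves or changes colour. Pixels receiving no light (projected ~ 0)
    fall back to a neutral reflectance of 1.0.
    """
    return [c / p if p > eps else 1.0 for c, p in zip(captured, projected)]
```

In a running system this would be called once per captured frame, with the result passed to the photometric compensation stage for the following frame.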


Proceedings of the 2012 Conference on Ergonomie et Interaction homme-machine | 2012

Guidage attentionnel à base de flou audiovisuel pour la conception d'interfaces multimodales (Attentional guidance based on audiovisual blur for the design of multimodal interfaces)

Tifanie Bouchara; Brian F. G. Katz; Christian Jacquemin

To improve presentation strategies in multimedia browsing systems, we investigated the perceptual processes involved in the search for a specific audiovisual object displayed among several distractors. The aim was to enhance the perceptual salience of the target to facilitate and speed up the search process. As visual blur was previously shown to be an effective means of guiding users towards a sharp item in a set of blurred distractors, we propose extending visual blur to audio and audiovisual blurs to improve multimodal search. A perceptual experiment was carried out to evaluate the effect of audio and visual blurs as well as their audiovisual combinations. Participants had to retrieve an audiovisual target among six concurrent audiovisual objects. Results showed that both visual and audio blurs rendered distractors less prominent and helped users focus on the sharp target. Furthermore, search-task response times were even faster and accuracy even greater with a redundant combination of audio and visual blurs.


Smart Graphics | 2011

Palliating visual artifacts through audio rendering

Hui Ding; Christian Jacquemin

In this paper, we present a pipeline that combines graphical rendering through an impostor-based level-of-detail (LOD) technique with audio rendering of an environment sound at different LODs. Two experiments were designed to investigate how the parameters used to control the impostors, together with an additional audio modality, impact the visual detection of artifacts produced by the impostor-based LOD rendering technique. Results show that, in general, simple stereo sound hardly impacts the perception of image artifacts such as graphical discontinuities.
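A minimal sketch of the kind of combined visual/audio level-of-detail switch studied here (the threshold value, naming, and the audio policy are illustrative assumptions, not the paper's implementation):

```python
def choose_lod(distance, impostor_threshold=20.0):
    """Distance-driven level-of-detail selection sketch.

    Beyond the threshold the object is drawn as an impostor (a flat
    textured billboard standing in for full geometry) and its sound is
    mixed down to a simple stereo source; closer in, full geometry and
    full audio rendering are used.
    """
    if distance > impostor_threshold:
        return {"visual": "impostor", "audio": "stereo_mixdown"}
    return {"visual": "full_geometry", "audio": "full_render"}
```

The experiments then ask whether artifacts at the switch distance (e.g. discontinuities when an impostor replaces geometry) become more or less noticeable depending on the accompanying audio rendering.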


Archive | 2005

A Study of Spatial Cognition in an Immersive Virtual Audio Environment: Comparing Blind and Blindfolded Individuals

Amandine Afonso; Alan Blum; Christian Jacquemin; Michel Denis; Brian F. G. Katz


Archive | 2010

Audio-Visual Renderings for Multimedia Navigation

Tifanie Bouchara; Brian F. G. Katz; Christian Jacquemin; Catherine Guastavino


AVSP | 2007

A 3D Audio-Visual Animated Agent for Expressive Conversational Question Answering

Jean-Claude Martin; Christian Jacquemin; Laurent Pointal; Brian F. G. Katz; Christophe d'Alessandro; Aurélien Max; Matthieu Courgeon

Collaboration


Dive into Christian Jacquemin's collaborations.

Top Co-Authors

Amandine Afonso (Centre national de la recherche scientifique)

Christophe d'Alessandro (Centre national de la recherche scientifique)

Matthieu Courgeon (Centre national de la recherche scientifique)