Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Jean-Rémy Chardonnet is active.

Publication


Featured research published by Jean-Rémy Chardonnet.


International Journal of Human-Computer Interaction | 2014

Automatic Stress Classification With Pupil Diameter Analysis

Marco Pedrotti; Mohammad Ali Mirzaei; Adrien Tedesco; Jean-Rémy Chardonnet; Frédéric Merienne; Simone Benedetto; Thierry Baccino

This article proposes a method based on wavelet transform and neural networks for relating pupillary behavior to psychological stress. The proposed method was tested by recording pupil diameter and electrodermal activity during a simulated driving task. Self-report measures were also collected. Participants performed a baseline run with the driving task only, followed by three stress runs where they were required to perform the driving task along with sound alerts, the presence of two human evaluators, and both. Self-reports and pupil diameter successfully indexed the stress manipulation, and significant correlations were found between these measures. However, electrodermal activity did not vary accordingly. After training, the four-way parallel neural network classifier identified which of the four experimental trials a given unknown pupil diameter signal came from with 79.2% precision. The present study shows that the pupil diameter signal has good discriminating power for stress detection.
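The pipeline outlined above, wavelet features feeding a classifier, can be sketched in miniature. This is purely illustrative and not the authors' method: it substitutes a Haar decomposition for the paper's wavelet choice and a nearest-centroid rule for the four-way neural network, and runs on synthetic pupil-diameter traces rather than recorded data.

```python
import numpy as np

def haar_features(signal, levels=3):
    """Decompose a 1-D signal with a Haar wavelet and return the
    energy of each detail band as a compact feature vector."""
    approx = np.asarray(signal, dtype=float)
    feats = []
    for _ in range(levels):
        if len(approx) % 2:                       # pad to even length
            approx = np.append(approx, approx[-1])
        pairs = approx.reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)
        feats.append(float(np.sum(detail ** 2)))  # energy in this band
    return np.array(feats)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 256)

def trial(stress):
    """Synthetic pupil-diameter trace (mm): slow drift plus, under
    stress, an extra fast oscillatory component."""
    base = 4.0 + 0.2 * np.sin(0.5 * t)
    extra = 0.3 * np.sin(8.0 * t) if stress else 0.0
    return base + extra + 0.02 * rng.standard_normal(t.size)

# train a nearest-centroid classifier on band-energy features
train = [(haar_features(trial(s)), s) for s in (0, 1) for _ in range(10)]
centroids = {s: np.mean([f for f, lab in train if lab == s], axis=0)
             for s in (0, 1)}

def classify(sig):
    f = haar_features(sig)
    return min(centroids, key=lambda s: np.linalg.norm(f - centroids[s]))
```

The fast oscillation added under "stress" concentrates energy in the detail bands, which is what makes the toy two-class separation work; the study's four experimental conditions and real pupil signals are of course far richer.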


Robotics and Biomimetics | 2006

Dynamic simulator for humanoids using constraint-based method with static friction

Jean-Rémy Chardonnet; Sylvain Miossec; Abderrahmane Kheddar; Hitoshi Arisumi; Hirohisa Hirukawa; François Pierrot; Kazuhito Yokoi

A dynamic simulator using a constraint-based method is proposed. It extends the formalism previously introduced by Ruspini and Khatib by including static and dynamic friction without friction cone discretization. The main contribution of the paper is in efficiently combining the operational space formulation of multi-body dynamics in the contact space with solving for the contact forces, including friction, using an iterative Gauss-Seidel approach. Compared to existing work in this domain, we illustrate our method with scenarios involving a humanoid in manipulation tasks while contacting the environment; an experiment validates our results. Technical details that allow an efficient implementation, and open problems whose resolution would improve the simulator, are also discussed. This work aims to be a potential module of the next OpenHRP simulator generation.
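The contact-solving step named in the abstract, an iterative Gauss-Seidel sweep over contact impulses with a friction-cone projection, can be illustrated in a planar toy setting. This is a generic projected Gauss-Seidel sketch, not the paper's solver; the Delassus operator `A`, free velocity `b`, and friction coefficient below are made-up numbers.

```python
import numpy as np

def pgs_contact(A, b, mu, iters=200):
    """Projected Gauss-Seidel solver for frictional contact impulses.
    Unknowns are ordered (normal, tangent) per contact. A is the
    (2c x 2c) Delassus operator, b the free relative contact velocity.
    Enforces f_n >= 0 and |f_t| <= mu * f_n (Coulomb friction; in this
    planar sketch the cone needs no discretization)."""
    c = len(b) // 2
    f = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        for i in range(c):
            n, t = 2 * i, 2 * i + 1
            # normal impulse: projected onto the non-negative half-line
            f[n] = max(0.0, f[n] - (A[n] @ f + b[n]) / A[n, n])
            # tangential impulse: relaxed, then clamped to the cone
            f[t] -= (A[t] @ f + b[t]) / A[t, t]
            f[t] = np.clip(f[t], -mu * f[n], mu * f[n])
    return f

# single contact with illustrative dynamics: the normal gap is closing
A = np.array([[2.0, 0.0],
              [0.0, 2.0]])      # hypothetical Delassus operator
b = np.array([-1.0, 0.5])      # free (normal, tangential) velocity
f = pgs_contact(A, b, mu=0.5)  # -> f[0] = 0.5, f[1] = -0.25 (sliding)
```

In the paper's 3-D setting the isotropic Coulomb cone is handled without discretization; the scalar clamp here is only its simplest planar analogue.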


IEEE-RAS International Conference on Humanoid Robots | 2008

Study of an external passive shock-absorbing mechanism for walking robots

Anthony David; Jean-Rémy Chardonnet; Abderrahmane Kheddar; Kenji Kaneko; Kazuhito Yokoi

This paper proposes a compliant sole as an external shock-absorbing mechanism and investigates its effect relative to an ankle-located flexible-joint mechanism. The proposed mechanism is mounted under the feet of the HRP-2 humanoid, in simulation only. The comparative evaluation was conducted for contacts resulting from walking using the HRP-2 embedded pattern generator. The characteristics of the sole material, Young's modulus and Poisson's ratio, are set following an ad-hoc minimization of their influence on the vertical acceleration and lateral inclination. Preliminary results suggest that the proposed solution is worth considering further and developing for real applications.


International Conference on Spatial Cognition | 2014

Virtual Distance Estimation in a CAVE

William Eric Marsh; Jean-Rémy Chardonnet; Frédéric Merienne

Past studies have shown consistent underestimation of distances in virtual reality, though the exact causes remain unclear. Many virtual distance cues have been investigated, but past work has failed to account for the possible addition of cues from the physical environment. We describe two studies that assess users’ performance and strategies when judging horizontal and vertical distances in a CAVE. Results indicate that users attempt to leverage cues from the physical environment when available and, if allowed, use a locomotion interface to move the virtual viewpoint to facilitate this.


Virtual Reality International Conference | 2014

Navigation and interaction in a real-scale digital mock-up using natural language and user gesture

Mohammad Ali Mirzaei; Jean-Rémy Chardonnet; Frédéric Merienne; Ariane Genty

This paper presents a new real-scale 3D system and summarizes first results concerning multi-modal navigation and interaction interfaces. This work is part of the CALLISTO-SARI collaborative project, which aims at constructing an immersive room and developing a set of software tools and navigation/interaction interfaces. Two sets of interfaces are introduced here: 1) interaction devices, and 2) natural language (speech processing) and user gesture. A survey on this system using subjective observation (the Simulator Sickness Questionnaire, SSQ) and objective measurements (Center of Gravity, COG) shows that natural-language and gesture-based interfaces induced less cybersickness than device-based interfaces, and are therefore more efficient.


DETC2010 - ASME 2010 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference | 2010

Design of an Immersive Peripheral for Object Grasping

Jean-Rémy Chardonnet; Jean-Claude Léon

During product development processes, simulations involving users' grasping operations are of increasing interest to incorporate more quantitative information in DFA (Design For Assembly) or immersive simulations. We present several prototypes of an immersive peripheral device for controlling a virtual hand with fine dexterity. These prototypes are derived from the analysis of a grasping action to define the structure and main features of the device. The prototypes, as easy to manipulate as a computer mouse, enable the simultaneous control of a large number of degrees of freedom (dofs). The design issues, where physical phenomena, physiological behavior and device structure are all tightly combined and significantly influence the overall interaction, are reviewed. These issues include the generation of dofs, monitoring kinematics, force reduction during virtual hand and finger movements, and the influence of device design, sensor types and their placement on the interaction and on the range of configurations that can be achieved for grasping tasks, dexterity, and performance. Examples of grasping tasks show the effect of these immersive devices in reaching user-friendly and efficient interactions with objects, bringing new insight to the interaction with virtual products.

Figure 1: Example of manipulation tasks that can be achieved with the HandNavigator, an immersive peripheral device for grasping virtual objects. Several prototypes using different sensor technologies are presented.


International Journal of Human-Computer Interaction | 2017

Features of the Postural Sway Signal as Indicators to Estimate and Predict Visually Induced Motion Sickness in Virtual Reality

Jean-Rémy Chardonnet; Mohammad Ali Mirzaei; Frédéric Merienne

Navigation in a 3D immersive virtual environment is known to be prone to visually induced motion sickness (VIMS). Several psychophysiological and behavioral methods have been used to measure the level of sickness of a user, among which is postural instability. This study investigates the features that can be extracted from the body postural sway, namely the area of the projection of the center of gravity (the measure mainly considered in past studies), its shape, and the frequency components of the signal's spectrum, in order to estimate and predict the occurrence of sickness in a typical virtual reality (VR) application. After modeling and simulation of the body postural sway, an experiment on 17 subjects identified a relation between the level of sickness and the variation, in both the time and frequency domains, of the body sway signal. The results support and extend the findings of past studies using postural instability as an efficient indicator of sickness, giving insight to better monitor VIMS in a VR application.
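As a rough illustration of the kind of sway features the abstract mentions (projection area and spectral components of the COG signal), here is a minimal sketch on synthetic data. The exact features, thresholds, and units used in the study are not reproduced; the 95% confidence-ellipse area is one common stand-in for "area of the COG projection".

```python
import numpy as np

def sway_features(cog_xy, fs=100.0):
    """Time- and frequency-domain features of a COG sway trajectory
    (N x 2 array of antero-posterior / medio-lateral positions in m,
    sampled at fs Hz)."""
    cog = np.asarray(cog_xy, dtype=float)
    cog = cog - cog.mean(axis=0)
    # time domain: area of the 95% confidence ellipse of the projection
    eigvals = np.linalg.eigvalsh(np.cov(cog.T))
    area = float(np.pi * 5.991 * np.sqrt(np.prod(eigvals)))  # chi^2, 2 dof
    # frequency domain: dominant sway frequency and low-band power ratio
    spec = np.abs(np.fft.rfft(cog[:, 0])) ** 2
    freqs = np.fft.rfftfreq(cog.shape[0], d=1.0 / fs)
    dominant = freqs[1:][np.argmax(spec[1:])]                # skip the DC bin
    low = spec[(freqs > 0) & (freqs <= 0.3)].sum() / spec[1:].sum()
    return {"area": area, "dominant_hz": float(dominant),
            "low_band_ratio": float(low)}

t = np.arange(0.0, 10.0, 0.01)                 # 10 s at 100 Hz
x = 0.010 * np.sin(2 * np.pi * 0.2 * t)        # slow antero-posterior sway
y = 0.005 * np.sin(2 * np.pi * 0.25 * t)       # medio-lateral sway
feats = sway_features(np.column_stack([x, y]))
```

A dilation of the ellipse area or a shift in the dominant frequency over the course of a VR session is the sort of variation the study relates to sickness level.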


International Conference on Artificial Reality and Telexistence | 2015

Visually induced motion sickness estimation and prediction in virtual reality using frequency components analysis of postural sway signal

Jean-Rémy Chardonnet; Mohammad Ali Mirzaei; Frédéric Merienne

The paper proposes a method for estimating and predicting visually induced motion sickness (VIMS) occurring in a navigation task in a 3D immersive virtual environment, by extracting features from the body postural sway signals in both the time and frequency domains. Past research showed that the change in the body postural sway may be an element for characterizing VIMS. Therefore, we conducted experiments in a 3D virtual environment where the task was simply a translational movement with different navigation speeds. By measuring the evolution of the body's center of gravity (COG), the analysis of the sway signals in the time domain showed a dilation of the COG's area, as well as a change in the shape of the area. Frequency Components Analysis (FCA) of the sway signal gave an efficient feature to estimate and predict the level of VIMS. The results provide promising insight to better monitor sickness in a virtual reality application.


Proceedings of SPIE | 2013

Nomad devices for interactions in immersive virtual environments

Paul George; Andras Kemeny; Frédéric Merienne; Jean-Rémy Chardonnet; Indira Thouvenin; Javier Posselt; Emmanuel Icart

Renault is currently setting up a new CAVE™, a five-wall rear-projected virtual reality room with a combined 3D resolution of 100 Mpixels, distributed over sixteen 4K projectors and two 2K projectors, as well as an additional 3D HD collaborative powerwall. Renault's CAVE™ aims at answering the needs of the various vehicle conception steps [1]. Starting from vehicle design, through the subsequent engineering steps, ergonomic evaluation and perceived quality control, Renault has built up a list of use cases and carried out an early software evaluation in the four-sided CAVE™ of Institut Image, called MOVE. One goal of the project is to study interactions in a CAVE™, especially with nomad devices such as the iPhone or iPad, to manipulate virtual objects and to develop visualization possibilities. Inspired by current uses of nomad devices (multi-touch gestures, the iPhone UI look and feel, and AR applications), we have implemented an early feature set taking advantage of these popular input devices. In this paper, we present its performance through measurement data collected in our test platform, a four-sided homemade low-cost virtual reality room, powered by ultra-short-range and standard HD home projectors.


Advanced Video and Signal Based Surveillance | 2013

Improvement of the real-time gesture analysis by a new mother wavelet and the application for the navigation inside a scale-one 3D system

M. Ali Mirzaei; Jean-Rémy Chardonnet; Frédéric Merienne; Christian Pere

This paper proposes a navigation technique for traveling inside a real-scale 3D model based on human gesture analysis. In a first step, a simple threshold is used as the criterion to analyze gestures. In a second step, a more complex criterion is imposed on the analysis to improve the navigation technique. Walking produces a periodic signal in which the up-and-down movement of the feet is the repeated part. A mother wavelet is allocated to the selected pattern, and the position of the pattern is recognized by applying Multi-Resolution Analysis (MRA). A movement command is then generated and sent to the graphics renderer. Analytical results show very high precision in the presence of noise, scale variation and superposition of other signals. Practical experiments also verify the same promising results.
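The core pattern-matching idea, locating a reference "mother wavelet" representing one foot-lift cycle in the incoming gesture signal, can be sketched with a plain normalized cross-correlation. The paper's actual mother wavelet and MRA implementation are not reproduced here, and all signals below are synthetic.

```python
import numpy as np

def detect_steps(signal, pattern, threshold=0.8):
    """Locate occurrences of a reference pattern (one foot-lift cycle)
    in a foot-height signal via normalized cross-correlation, keeping
    local maxima above a threshold so that each detected step can
    trigger one movement command."""
    sig = np.asarray(signal, dtype=float)
    pat = np.asarray(pattern, dtype=float)
    pat = pat - pat.mean()
    pat /= np.linalg.norm(pat) + 1e-12
    n = len(pat)
    scores = np.zeros(len(sig) - n + 1)
    for i in range(len(scores)):
        win = sig[i:i + n] - sig[i:i + n].mean()
        norm = np.linalg.norm(win)
        scores[i] = win @ pat / norm if norm > 1e-9 else 0.0
    return [i for i in range(1, len(scores) - 1)
            if scores[i] >= threshold
            and scores[i] >= scores[i - 1] and scores[i] > scores[i + 1]]

rng = np.random.default_rng(1)
pattern = np.sin(np.linspace(0.0, np.pi, 20))   # one synthetic foot lift
signal = 0.01 * rng.standard_normal(200)        # sensor noise floor
signal[50:70] += pattern                        # two simulated steps
signal[120:140] += pattern
hits = detect_steps(signal, pattern)
```

The wavelet MRA used in the paper goes further, giving robustness to scale variation that a single fixed-length template like this one does not have.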

Collaboration


Dive into Jean-Rémy Chardonnet's collaborations.

Top Co-Authors

Christian Pere, Arts et Métiers ParisTech
M. Ali Mirzaei, Arts et Métiers ParisTech
Jeremy Plouzeau, Centre national de la recherche scientifique
José Dorado, University of the Andes