Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Maud Marchal is active.

Publication


Featured research published by Maud Marchal.


Medical Image Computing and Computer-Assisted Intervention | 2008

Interactive Simulation of Embolization Coils: Modeling and Experimental Validation

Jérémie Dequidt; Maud Marchal; Christian Duriez; Erwan Kerrien; Stéphane Cotin

Coil embolization offers a new approach to treating aneurysms. This medical procedure is notably less invasive than open surgery, as it relies on the deployment of very thin platinum-based wires within the aneurysm through the arteries. When performed intracranially, this procedure must be particularly accurate and therefore carefully planned and performed by experienced radiologists. A simulator of coil deployment represents an interesting and helpful tool for the physician by providing information on the coil's behavior. In this paper, an original model is proposed to obtain interactive and accurate simulations of coil deployment. The model takes into account geometric nonlinearities and uses a shape memory formulation to describe the coil's complex geometry. An experimental validation is performed in a contact-free environment to identify the mechanical properties of the coil and to quantitatively compare the simulation with real data. Computational performance is also measured to ensure an interactive simulation.
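
As a rough illustration of the shape-memory idea described above, here is a minimal sketch in which each joint of a discretized wire stores a rest angle encoding the coil's manufactured shape. The 2D simplification, the function names, and the stiffness value are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Minimal 2D illustration of a shape-memory wire model (hypothetical,
# not the paper's formulation): each joint of a polyline stores a rest
# angle encoding the coil's manufactured shape; an elastic torque pulls
# the current bend angle back toward it.

def bend_angles(points):
    """Signed bend angle at each interior joint of a 2D polyline."""
    d = np.diff(points, axis=0)               # segment vectors
    headings = np.arctan2(d[:, 1], d[:, 0])   # segment directions
    return np.diff(headings)                  # angle change per joint

def restoring_torques(points, rest_angles, stiffness=1.0):
    """Elastic torque driving each joint toward its memorized rest angle."""
    return -stiffness * (bend_angles(points) - rest_angles)

# A coil whose rest shape is a circular arc (constant rest curvature),
# currently held straight: every joint feels a torque curling it back.
n = 10
straight = np.column_stack([np.linspace(0, 1, n), np.zeros(n)])
rest = np.full(n - 2, 0.3)  # memorized bend of ~0.3 rad per joint
print(restoring_torques(straight, rest))
```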


Symposium on 3D User Interfaces | 2011

Joyman: A human-scale joystick for navigating in virtual worlds

Maud Marchal; Julien Pettré; Anatole Lécuyer

In this paper, we propose a novel interface called Joyman, designed for immersive locomotion in virtual environments. Whereas many previous interfaces preserve or stimulate the users proprioception, the Joyman aims at preserving equilibrioception in order to improve the feeling of immersion during virtual locomotion tasks. The proposed interface is based on the metaphor of a human-scale joystick. The device has a simple mechanical design that allows a user to indicate his virtual navigation intentions by leaning accordingly. We also propose a control law inspired by the biomechanics of the human locomotion to transform the measured leaning angle into a walking direction and speed - i.e., a virtual velocity vector. A preliminary evaluation was conducted in order to evaluate the advantages and drawbacks of the proposed interface and to better draw the future expectations of such a device.
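
A minimal sketch of what such a leaning-based control law could look like, assuming a dead zone, a saturation tilt, and a maximum speed; neither these parameters nor the linear mapping come from the paper.

```python
import numpy as np

# Hypothetical control law in the spirit of the Joyman abstract: the
# measured lean (tilt magnitude + azimuth) is mapped to a virtual
# velocity vector. The dead zone, saturation angle, and max speed are
# illustrative parameters, not values from the paper.

def lean_to_velocity(tilt_rad, azimuth_rad,
                     dead_zone=np.radians(2.0),
                     max_tilt=np.radians(15.0),
                     max_speed=1.5):  # m/s, a comfortable walking speed
    """Map a leaning angle to a 2D walking velocity (vx, vy)."""
    if tilt_rad < dead_zone:
        return np.zeros(2)                      # small sway is ignored
    # Normalize tilt into [0, 1] and saturate beyond max_tilt.
    t = min((tilt_rad - dead_zone) / (max_tilt - dead_zone), 1.0)
    speed = max_speed * t
    return speed * np.array([np.cos(azimuth_rad), np.sin(azimuth_rad)])

print(lean_to_velocity(np.radians(8.0), np.radians(30.0)))
```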


Medical Image Computing and Computer-Assisted Intervention | 2009

Interactive Simulation of Flexible Needle Insertions Based on Constraint Models

Christian Duriez; Christophe Guébert; Maud Marchal; Stéphane Cotin; Laurent Grisoni

This paper presents a new modeling method for the insertion of needles and, more generally, thin and flexible medical devices into soft tissues. Several medical procedures, such as biopsy, brachytherapy, and deep-brain stimulation, rely on the insertion of slender medical devices. In this paper, the interactions between soft tissues and flexible instruments are reproduced using a set of dedicated complementarity constraints. Each constraint is positioned and applied to the deformable models without requiring any remeshing. Our method allows for the 3D simulation of different physical phenomena, such as puncture, cutting, and static and dynamic friction, at interactive frame rates. To obtain realistic simulations, the model can be parametrized using experimental data. Our method is validated through a series of typical simulation examples and new, more complex scenarios.
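
As a loose illustration of a complementarity constraint of this kind, here is a toy 1D stick/slip friction rule; the function and values are hypothetical and far simpler than the constraint solver the paper describes.

```python
# Toy 1D illustration of the complementarity idea in the abstract:
# along the needle shaft, each constraint point either sticks (zero
# relative motion, friction force within a bound) or slips (force at
# the bound, opposing motion). Parameters are illustrative.

def friction_constraint(relative_velocity, trial_force, max_friction):
    """Return (force, state) satisfying a Coulomb-like complementarity:
    stick -> |force| <= max_friction and relative motion is cancelled,
    slip  -> force = max_friction, opposing the sliding direction."""
    if abs(trial_force) <= max_friction:
        return trial_force, "stick"        # force can cancel the motion
    sign = 1.0 if relative_velocity >= 0 else -1.0
    return -sign * max_friction, "slip"    # saturated, opposes sliding

# The trial force needed to cancel motion exceeds the bound -> slip.
print(friction_constraint(relative_velocity=0.02,
                          trial_force=-5.0, max_friction=2.0))
```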


IEEE Transactions on Visualization and Computer Graphics | 2013

Real-Time Simulation of Brittle Fracture Using Modal Analysis

Loeiz Glondu; Maud Marchal; Georges Dumont

We present a novel physically based approach for simulating realistic brittle fracture of impacting bodies in real time. Our method is mainly composed of two novel parts: 1) a fracture initiation method based on modal analysis, and 2) a fast energy-based fracture propagation algorithm. We propose a way to compute the contact durations and the contact forces between stiff bodies to simulate the damped deformation wave that is responsible for fracture initiation. As a consequence, our method naturally takes into account the damping properties of the bodies as well as the contact properties to simulate the fracture. To obtain a complete fracture pipeline, we present an efficient way to generate the fragments and their geometric surfaces. These surfaces are sampled on the edges of the physical mesh to visually represent the actual fracture surface computed. As shown in our results, the computational performance and realism of our method are well suited for physically based interactive applications.
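
A toy sketch of modal-analysis-based fracture initiation, under the assumption that a contact force is projected onto the body's vibration modes and a stress proxy is compared against a threshold; the matrices, the proxy, and the threshold are illustrative, not the paper's method.

```python
import numpy as np

# Hypothetical sketch of modal-analysis fracture initiation: decompose
# a contact force onto the body's vibration modes, estimate each
# mode's excitation, and trigger fracture when a modal stress proxy
# exceeds a material threshold. All values are toy data.

def fracture_initiated(K, M, contact_force, threshold):
    """Check a modal stress proxy against a fracture threshold."""
    # Generalized eigenproblem K v = w^2 M v (small dense toy system).
    w2, modes = np.linalg.eig(np.linalg.solve(M, K))
    amplitudes = modes.T @ contact_force          # modal excitations
    # Stress proxy: excitation scaled by the modal frequency sqrt(w^2).
    stress = np.abs(amplitudes) * np.sqrt(np.abs(w2))
    return stress.max() > threshold, stress

K = np.diag([4.0, 9.0, 16.0])     # toy stiffness matrix
M = np.eye(3)                     # toy mass matrix
hit = np.array([0.0, 2.0, 0.5])   # sampled contact force
print(fracture_initiated(K, M, hit, threshold=5.0))
```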


IEEE Transactions on Visualization and Computer Graphics | 2011

Six Degrees-of-Freedom Haptic Interaction with Fluids

Gabriel Cirio; Maud Marchal; Sébastien Hillaire; Anatole Lécuyer

We often interact with fluids in our daily life, either through tools, as when holding a glass of water, or directly with our body, when we swim or wash our hands. Multimodal interaction with virtual fluids would greatly improve the realism of simulations, particularly through haptic interaction. However, achieving realistic, stable, and real-time force feedback from fluids is particularly challenging. In this work, we propose a novel approach that allows real-time six degrees-of-freedom (6DoF) haptic interaction with fluids of variable viscosity. Our haptic rendering technique, based on a Smoothed-Particle Hydrodynamics physical model, provides realistic haptic feedback through physically based forces. 6DoF haptic interaction with fluids is made possible by a new coupling scheme and a unified particle model, allowing the use of arbitrarily shaped rigid bodies. In particular, fluid containers can be created to hold fluid and hence transmit to the user force feedback coming from fluid stirring, pouring, shaking, and scooping, to name a few. Moreover, we adapted an existing visual rendering algorithm to meet the frame rate requirements of the haptic algorithms. We evaluate and illustrate the main features of our approach through different scenarios, highlighting the 6DoF haptic feedback and the use of containers.
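
For context, a minimal sketch of a standard SPH viscosity force, in the style of the Müller et al. 2003 kernels commonly used in SPH fluid solvers; it illustrates the family of physical model named above, not the authors' haptic coupling scheme.

```python
import numpy as np

# Generic SPH viscosity force using the standard viscosity kernel
# (Laplacian of W = 45 / (pi h^6) * (h - r) within support radius h).
# Brute-force pairwise loop; real solvers use spatial hashing.

def viscosity_forces(pos, vel, mass, rho, mu, h):
    """Per-particle viscosity force for a set of SPH particles."""
    n = len(pos)
    forces = np.zeros_like(pos)
    lapl_coef = 45.0 / (np.pi * h**6)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r = np.linalg.norm(pos[i] - pos[j])
            if r < h:
                lapl = lapl_coef * (h - r)
                forces[i] += mu * mass[j] * (vel[j] - vel[i]) / rho[j] * lapl
    return forces

# Two close particles moving apart: viscosity pulls their velocities together.
pos = np.array([[0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])
vel = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(viscosity_forces(pos, vel, mass=np.ones(2), rho=np.ones(2),
                       mu=0.1, h=0.1))
```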


Symposium on 3D User Interfaces | 2014

The Virtual Mitten: A novel interaction paradigm for visuo-haptic manipulation of objects using grip force

Merwan Achibet; Maud Marchal; Ferran Argelaguet; Anatole Lécuyer

In this paper, we propose a novel visuo-haptic interaction paradigm called the "Virtual Mitten" for simulating the 3D manipulation of objects. Our approach introduces an elastic handheld device that provides passive haptic feedback through the fingers, together with a mitten interaction metaphor that enables users to grasp and manipulate objects. The grasping performed by the mitten is directly correlated with the grip force applied to the elastic device, and a supplementary pseudo-haptic feedback modulates the visual feedback of the interaction in order to simulate different haptic perceptions. The Virtual Mitten allows natural interaction and grants users extended freedom of movement compared with rigid devices that have limited workspaces. Our approach was evaluated in two experiments focusing on both subjective appreciation and perception. Our results show that, thanks to our pseudo-haptic approach, participants were able to clearly perceive different levels of effort during basic manipulation tasks. They could also quickly work out how to achieve different actions with the Virtual Mitten, such as opening a drawer or pulling a lever. Taken together, our results suggest that our novel interaction paradigm could be used in a wide range of applications involving one- or two-handed haptic manipulation, such as virtual prototyping, virtual training, or video games.
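
A minimal sketch of the pseudo-haptic principle, assuming the displayed motion is scaled by the ratio of measured grip force to the effort the simulated action requires; the mapping and numbers are assumptions, not the paper's calibration.

```python
# Hypothetical illustration of pseudo-haptic feedback: the visual
# opening of a virtual drawer is scaled by how close the measured grip
# force is to the effort the simulated action demands, so heavier
# actions "feel" harder even with purely passive haptics.

def visual_displacement(hand_displacement, grip_force, required_effort):
    """Scale the displayed motion by the grip-to-effort ratio,
    clamped to 1 once the required effort is reached."""
    gain = min(grip_force / required_effort, 1.0)
    return hand_displacement * gain

# Same hand motion, two drawers: the stiff one visually moves less
# until the user squeezes the elastic device harder.
print(visual_displacement(0.10, grip_force=5.0, required_effort=4.0))   # light
print(visual_displacement(0.10, grip_force=5.0, required_effort=20.0))  # stiff
```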


IEEE Virtual Reality Conference | 2014

The Mind-Mirror: See your brain in action in your head using EEG and augmented reality

Jonathan Mercier-Ganady; Fabien Lotte; Emilie Loup-Escande; Maud Marchal; Anatole Lécuyer

Imagine you are facing a mirror, seeing at the same time both your real body and a virtual display of your brain in activity, perfectly superimposed on your real image "inside your real skull". In this paper, we introduce a novel augmented reality paradigm called "Mind-Mirror" which enables the experience of seeing "through your own head", visualizing your brain "in action and in situ". Our approach relies on the use of a semi-transparent mirror positioned in front of a computer screen. A virtual brain is displayed on screen and automatically follows the head movements using an optical face-tracking system. The brain activity is extracted and processed in real time with the help of an electroencephalography (EEG) cap worn by the user. A rear view is also proposed thanks to an additional webcam recording the rear of the user's head. The use of EEG classification techniques enables the testing of a neurofeedback scenario in which the user can train and progressively learn how to control different mental states, such as "concentrated" versus "relaxed". The results of a user study comparing a standard visualization used in neurofeedback to our approach showed that the Mind-Mirror could be used successfully and that the participants particularly appreciated its innovation and originality. We believe that, in addition to applications in neurofeedback and brain-computer interfaces, the Mind-Mirror could also be used as a novel visualization tool for education, training, or entertainment applications.
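
As a hypothetical illustration of the "relaxed versus concentrated" distinction, here is a crude alpha-band-power rule, a classic neurofeedback feature; the actual Mind-Mirror pipeline (electrodes, features, classifier) is not specified here.

```python
import numpy as np

# Toy neurofeedback feature: alpha-band (8-12 Hz) power of one EEG
# channel, thresholded into "relaxed" vs. "concentrated". This is a
# generic illustration, not the Mind-Mirror's classification method.

def alpha_band_power(signal, fs):
    """Mean spectral power of a 1D EEG channel in the 8-12 Hz band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return spectrum[band].mean()

def mental_state(signal, fs, threshold):
    """Crude rule: high alpha power is associated with relaxation."""
    return "relaxed" if alpha_band_power(signal, fs) > threshold else "concentrated"

fs = 256                               # Hz, a common EEG sampling rate
t = np.arange(fs * 2) / fs             # two seconds of synthetic data
relaxed = np.sin(2 * np.pi * 10 * t)   # strong 10 Hz alpha rhythm
print(mental_state(relaxed, fs, threshold=100.0))
```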


Symposium on 3D User Interfaces | 2010

Walking up and down in immersive virtual worlds: Novel interactive techniques based on visual feedback

Maud Marchal; Anatole Lécuyer; Gabriel Cirio; Laurent Bonnet; Mathieu Emily

We introduce novel interactive techniques to simulate the sensation of walking up and down in immersive virtual worlds based on visual feedback. Our method consists of modifying the motion of the virtual subjective camera while the user is actually walking in an immersive virtual environment. The modification of the virtual viewpoint is a function of the variations in the height of the virtual ground. Three effects are proposed: (1) a straightforward modification of the camera's height, (2) a modification of the camera's navigation velocity, and (3) a modification of the camera's orientation. They were tested in an immersive virtual reality setup in which the user really walks. A desktop configuration, where the user is seated and controls input devices, was also tested and compared to the real walking configuration. Experimental results show that our visual techniques are very efficient for the simulation of two canonical shapes: bumps and holes located on the ground. Interestingly, a strong "orientation-height illusion" is found, as changes in pitch viewing orientation produce a perception of height changes (although the camera's height remains strictly the same in this case). Our visual effects could be applied in various virtual reality applications such as urban or architectural project reviews or training, as well as in video games, in order to provide the sensation of walking on uneven ground.
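
A minimal sketch of the first effect, assuming the camera height simply follows the virtual ground height under the user's position while they walk on a physically flat floor; the smoothing gain is an illustrative assumption.

```python
# Hypothetical version of effect (1) from the abstract: camera height
# tracks the virtual terrain height at the user's position.

def camera_height(eye_height, ground_height_at, x, smoothing=1.0):
    """Camera height = user's eye height + virtual terrain height.
    smoothing < 1 attenuates the effect, > 1 exaggerates it."""
    return eye_height + smoothing * ground_height_at(x)

# A 30 cm bump on otherwise flat virtual ground, centered at x = 2 m.
bump = lambda x: max(0.0, 0.3 - abs(x - 2.0))
for x in (0.0, 2.0, 4.0):
    print(x, camera_height(1.70, bump, x))
```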


ISBMS '08: Proceedings of the 4th International Symposium on Biomedical Simulation | 2008

Towards a Framework for Assessing Deformable Models in Medical Simulation

Maud Marchal; Jérémie Allard; Christian Duriez; Stéphane Cotin

Computational techniques for the analysis of mechanical problems have recently moved from traditional engineering disciplines to biomedical simulation. Thus, the number of complex models describing the mechanical behavior of medical environments has increased in recent years. While the development of advanced computational tools has led to interesting modeling algorithms, the relevance of these models is often criticized due to incomplete model verification and validation. The objective of this paper is to propose a framework and a methodology for assessing deformable models. This proposal aims at providing tools for testing the behavior of new modeling algorithms proposed in the context of medical simulation. Initial validation results comparing different modeling methods are reported as a first step towards a more complete validation framework and methodology.
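
As an illustration of the kind of quantitative comparison such a framework needs, here is a sketch of a relative RMS node-position error between a simulated and a reference deformation; the metric choice is a common one, not necessarily the paper's.

```python
import numpy as np

# Toy validation metric: RMS distance between matched node positions
# of a simulated deformable model and a reference deformation
# (experimental or high-fidelity), normalized by the reference spread.

def relative_rms_error(simulated, reference):
    """Relative RMS node-position error between two deformations."""
    rms = np.sqrt(((simulated - reference) ** 2).sum(axis=1).mean())
    spread = np.sqrt(
        ((reference - reference.mean(axis=0)) ** 2).sum(axis=1).mean())
    return rms / max(spread, 1e-12)

reference = np.array([[0.0, 0.0], [1.0, 0.10], [2.0, 0.40]])
model_a   = np.array([[0.0, 0.0], [1.0, 0.12], [2.0, 0.35]])
print(f"model A error: {relative_rms_error(model_a, reference):.3f}")
```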


Symposium on 3D User Interfaces | 2012

The King-Kong Effects: Improving sensation of walking in VR with visual and tactile vibrations at each step

Léo Terziman; Maud Marchal; Franck Multon; Bruno Arnaldi; Anatole Lécuyer

In this paper we present novel sensory feedbacks named "King-Kong Effects" to enhance the sensation of walking in virtual environments. King-Kong Effects are inspired by special effects in movies in which the approach of a gigantic creature is suggested by adding visual vibrations/pulses to the camera at each of its steps. In this paper, we propose to add artificial visual or tactile vibrations (King-Kong Effects, or KKE) at each footstep detected (or simulated) during the user's virtual walk. The user can be seated, and our system uses vibrotactile tiles located under their feet for tactile rendering, in addition to the visual display. We have designed different kinds of KKE based on vertical or lateral oscillations, physical or metaphorical patterns, and one or two peaks for heel-toe contact simulation. We have conducted different experiments to evaluate the preferences of users navigating with or without the various KKE. Taken together, our results identify the best choices for future uses of visual and tactile KKE, and they suggest a preference for multisensory combinations. Our King-Kong Effects could be used in a variety of VR applications targeting the immersion of a user walking in a 3D virtual scene.
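
A hypothetical sketch of a step-synchronized vibration pattern with one or two decaying peaks (heel then toe); the frequency, decay, and heel-toe delay are illustrative values, not the paper's parameters.

```python
import numpy as np

# Toy generator for step-synchronized vibration patterns of the kind
# the abstract describes: a short decaying oscillation per footstep,
# with one peak or two peaks (heel strike, then a weaker toe strike).

def step_pulse(t, freq=40.0, decay=20.0):
    """One decaying oscillation starting at t = 0 (t in seconds)."""
    return np.where(t >= 0,
                    np.exp(-decay * t) * np.sin(2 * np.pi * freq * t),
                    0.0)

def kke_signal(t, two_peaks=False, heel_toe_delay=0.12):
    """Vibration sample(s) for one footstep, heel + optional toe peak."""
    s = step_pulse(t)
    if two_peaks:
        s = s + 0.7 * step_pulse(t - heel_toe_delay)  # weaker toe strike
    return s

t = np.linspace(0.0, 0.3, 7)   # a few samples over 300 ms
print(kke_signal(t, two_peaks=True))
```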

Collaboration


Dive into Maud Marchal's collaborations.

Top Co-Authors

Loeiz Glondu (École normale supérieure de Cachan)
Bruno Arnaldi (Institut de Recherche en Informatique et Systèmes Aléatoires)
Valérie Gouranton (Centre national de la recherche scientifique)
Miguel A. Otaduy (King Juan Carlos University)
Hiroyuki Kajimoto (University of Electro-Communications)