Publication


Featured research published by Paul Richard.


Virtual Reality | 2006

Multi-modal virtual environments for education with haptic and olfactory feedback

Emmanuelle Richard; Angèle Tijou; Paul Richard; Jean-Louis Ferrier

It has been suggested that immersive virtual reality (VR) technology enables knowledge-building experiences and in this way provides an alternative educational process. Key features of constructivist, computer-based educational environments for science teaching and learning include interaction, size, transduction, and reification. Indeed, multi-sensory VR technology suits the needs of sciences that require a high level of visualization and interaction. Haptics, which refers to physical interaction with virtual environments (VEs), may be coupled with other sensory modalities such as vision and audition, but is hardly ever associated with other feedback channels such as olfaction. A survey of theory and of existing VEs that include haptic or olfactory feedback, especially in the field of education, is provided. Our multi-modal, human-scale VE VIREPSE (virtual reality platform for simulation and experimentation), which provides haptic interaction using a string-based interface called SPIDAR (space interface device for artificial reality) as well as olfactory and auditory feedback, is described. An application that allows students to experience the abstract concept of the Bohr atomic model and the quantization of the energy levels has been developed. Different configurations that support interaction, size, and reification through the use of immersive and multi-modal (visual, haptic, auditory, and olfactory) feedback are proposed for further evaluation. Haptic interaction is achieved using different techniques, ranging from desktop pseudo-haptic feedback to human-scale haptic interaction. Olfactory information is provided using different fan-based olfactory displays (ODs). The significance of developing such multi-modal VEs for education is discussed.


Virtual Rehabilitation | 2007

Augmented Reality for Rehabilitation of Cognitive Disabled Children: A Preliminary Study

Emmanuelle Richard; Valérie Billaudeau; Paul Richard; Gilles Gaudin

We have designed a non-immersive recreational and educational augmented reality application (ARVe - Augmented Reality applied to Vegetal field) that allows young children to handle 2D and 3D plant entities in a simple and intuitive way. The application involves a pairing task and provides visual, olfactory, or auditory cues to help children in decision making. 93 children from a French elementary school (including 11 cognitively disabled ones) participated in a preliminary study. The objectives of this study were: (1) to investigate children's performance and behaviour when using AR techniques, and (2) to examine the specific attitudes of cognitively disabled children confronted with such techniques. We observed that the disabled children were very enthusiastic when using the application and showed high motivation compared to most other pupils. Moreover, children with autism and trisomy 21 were able to express positive emotions when confronted with the application. These very encouraging results promote a widespread use of such tools for cognitively disabled children.


International Conference on E-Learning and Games | 2006

Using olfactive virtual environments for learning organic molecules

Angèle Tijou; Emmanuelle Richard; Paul Richard

A multi-modal virtual reality application that aims to investigate the effect of olfaction on the learning, retention, and recall of complex 3D structures, such as organic molecules, is presented. Students interact with molecules in either a desktop or an immersive configuration. In the latter case, visual immersion is achieved through the use of a large rear-projected stereoscopic screen. In both configurations, motion parallax is provided using a camera-based head-tracking technique. Both desktop and large-scale fan-based devices that allow real-time smell diffusion are used.


IEEE International Conference on Fuzzy Systems | 2012

Emotion assessment for affective computing based on physiological responses

Hamza Hamdi; Paul Richard; Aymeric Suteau; Philippe Allain

Information about a user's emotional state is a very important aspect of affective interaction with embodied conversational agents. Most research work aims at identifying emotions through speech or facial expressions. However, facial expressions and speech are not continuously available. Furthermore, in some cases, bio-signal data are also required in order to fully assess a user's emotional state. We aimed to recognize the six basic emotions proposed by Ekman, using a widely available and low-cost brain-computer interface (BCI) and a biofeedback sensor that measures heart rate. We exposed participants to sets of 10 IAPS images that had been partially validated through a subjective rating protocol. Results showed that the collected signals allowed us to identify users' emotional states. In addition, a partial correlation between objective and subjective data was observed.


Presence: Teleoperators and Virtual Environments | 2012

A dual-modal virtual reality kitchen for (re)learning of everyday cooking activities in Alzheimer's disease

Takehiko Yamaguchi; Déborah Foloppe; Paul Richard; Emmanuelle Richard; Philippe Allain

Everyday action impairment is one of the diagnostic criteria of Alzheimer's disease and is associated with many serious consequences, including loss of functional autonomy and independence. It has been shown that the (re)learning of everyday activities is possible in Alzheimer's disease by using error-reduction teaching approaches in naturalistic clinical settings. The purpose of this study is to develop a dual-modal virtual reality platform for training in everyday cooking activities in Alzheimer's disease and to establish its value as a training tool for everyday activities in these patients. Two everyday tasks and two error-reduction learning methods were implemented within a virtual kitchen. Two patients with Alzheimer's disease and two healthy elderly controls were tested. All subjects were trained in two learning sessions on two comparable cooking tasks. Within each group (i.e., patients and controls), the order of the training methods was counterbalanced. Repeated-measures analysis before and after learning was performed. A presence questionnaire and a verbal interview were used to obtain information about the subjective responses of the participants to the VR experience. The results in terms of errors, omissions, and perseverations (i.e., repetitive behaviors) indicate that the patients performed worse than the controls before learning, but that they reached a level of performance similar to that of the controls after a short learning session, regardless of the learning method employed. This finding provides preliminary support for the value of the dual-modal virtual reality platform for training in everyday cooking activities in Alzheimer's disease. However, further work is needed before it is ready for clinical application.


Neuropsychological Rehabilitation | 2018

The potential of virtual reality-based training to enhance the functional autonomy of Alzheimer's disease patients in cooking activities: A single case study

Déborah Foloppe; Paul Richard; Takehiko Yamaguchi; Frédérique Etcharry-Bouyx; Philippe Allain

Impairments in performing activities of daily living occur early in the course of Alzheimer's disease (AD). There is a great need to develop non-pharmacological therapeutic interventions likely to reduce dependency in everyday activities in AD patients. This study investigated whether it was possible to increase autonomy in these patients in cooking activities using interventions based on errorless learning, vanishing-cue, and virtual reality techniques. We recruited a 79-year-old woman who met NINCDS-ADRDA criteria for probable AD. She was trained in four cooking tasks for four days per task, one hour per day, in both virtual and real conditions. Outcome measures included subjective data concerning the therapeutic intervention and the experience of virtual reality, repeated assessments of training activities, neuropsychological scores, and self-esteem and quality-of-life measures. The results indicated that our patient could relearn some cooking activities using virtual reality techniques. Transfer to real life was also observed. Improvement of task performance remained stable over time. This case report supports the value of a non-immersive virtual kitchen to help people with AD relearn cooking activities.


International Conference on E-Learning and Games | 2006

Multi-modal virtual environments for education: from illusion to immersion

Emmanuelle Richard; Angèle Tijou; Paul Richard

A multi-modal virtual environment that provides human-scale haptic feedback is described. Users' immersion is achieved using a large rear-projected stereoscopic screen, a 5.1 audio system, and fan-based olfactory displays. An educational virtual reality application that allows students to experiment with the electron bound state in the Bohr atom model and the quantization of the energy levels through the haptic channel is presented. This application can be run in both immersive and desktop configurations, including haptic interaction ranging from pseudo-haptic illusion to human-scale haptic immersion.


Applied Neuropsychology | 2016

Virtual reality and neuropsychological assessment: The reliability of a virtual kitchen to assess daily-life activities in victims of traumatic brain injury

Jérémy Besnard; Paul Richard; Frédéric Banville; Pierre Nolin; Ghislaine Aubin; Didier Le Gall; Isabelle Richard; Phillippe Allain

Traumatic brain injury (TBI) causes impairments affecting instrumental activities of daily living (IADL). However, few studies have considered virtual reality as an ecologically valid tool for the assessment of IADL in patients who have sustained a TBI. The main objective of the present study was to examine the use of the Nonimmersive Virtual Coffee Task (NI-VCT) for IADL assessment in patients with TBI. We analyzed the performance of 19 adults suffering from TBI and 19 healthy controls (HCs) in the real and virtual tasks of making coffee with a coffee machine, as well as in global IQ and executive functions. Patients performed worse than HCs on both real and virtual tasks and on all tests of executive functions. Correlation analyses revealed that NI-VCT scores were related to scores on the real task. Moreover, regression analyses demonstrated that performance on NI-VCT matched real-task performance. Our results support the idea that the virtual kitchen is a valid tool for IADL assessment in patients who have sustained a TBI.


International Journal of Human-Computer Interaction | 2014

Does Touch Matter?: The Effects of Haptic Visualization on Human Performance, Behavior and Perception

Chang S. Nam; Paul Richard; Takehiko Yamaguchi; Sangwoo Bahn

Chang S. Nam, Edward P. Fitts Department of Industrial and Systems Engineering, North Carolina State University, Raleigh, North Carolina, USA; Paul Richard, Laboratoire Angevin de Recherche en Ingénierie des Systèmes (LARIS), Université d’Angers, Angers, France; Takehiko Yamaguchi, Department of Applied Electronics, Tokyo University of Science, Tokyo, Japan; Sangwoo Bahn, Department of Industrial and Management Engineering, Myongji University, Yongin, Korea


IEEE Haptics Symposium | 2010

Haptic guides in cooperative virtual environments: Design and human performance evaluation

Sehat Ullah; Paul Richard; Samir Otmane; Mickael Naud; Malik Mallem

In this paper, we simulate the use of two string-based parallel robots in a cooperative manipulation task. Two users standing in front of a large screen each operate one robot. We propose two haptic guide models and investigate their effects on cooperation, co-presence, and users' performance. We also examine the effect of object-based force feedback in cooperative work. Ten volunteer subjects cooperatively performed a peg-in-hole task. Results revealed that haptic guides have a significant effect on task execution: they not only increase users' performance but also enhance the sense of co-presence and awareness. Our investigations will help in the development of VR systems for cooperative assembly, surgical training, and rehabilitation.

Collaboration


Dive into Paul Richard's collaborations.

Top Co-Authors

Takehiko Yamaguchi (Tokyo University of Science)
Philippe Allain (Université du Québec à Trois-Rivières)
Sehat Ullah (University of Malakand)