Benoît Penelle
Université libre de Bruxelles
Publications
Featured research published by Benoît Penelle.
Journal of Manual & Manipulative Therapy | 2017
Dominique Mouraux; Eric Brassinne; Stéphane Sobczak; Antoine Nonclercq; Nadine Warzée; Phillip S. Sizer; Turgay Tuna; Benoît Penelle
Objective: We assessed whether pain relief could be achieved with a new system that combines a 3D augmented reality system (3DARS) with the principles of mirror visual feedback. Methods: Twenty-two patients between 18 and 75 years of age who suffered from chronic neuropathic pain were included. Each patient performed five 20-minute 3DARS treatment sessions spread over a period of one week. The following pain parameters were assessed: (1) a visual analogue scale (VAS) after each treatment session; (2) the McGill Pain Questionnaire and the DN4 questionnaire, completed before the first session and 24 h after the last session. Results: The mean improvement in VAS per session was 29% (p < 0.001). There was an immediate session effect demonstrating a systematic improvement in pain between the beginning and the end of each session. We noted that this pain reduction was partially preserved until the next session. Comparing the pain level at baseline and 24 h after the last session, there was a significant decrease in pain of 37% (p < 0.001). There was also a significant decrease on the McGill Pain Questionnaire (p < 0.001) and the DN4 questionnaire (p < 0.01). Conclusion: Our results indicate that 3DARS induced a significant pain decrease in patients with chronic neuropathic pain in a unilateral upper extremity. While further research is necessary before definitive conclusions can be drawn, clinicians could implement the approach as a preparatory adjunct for providing temporary pain relief aimed at enhancing chronic pain patients’ tolerance of manual therapy and exercise intervention. Level of Evidence: 4.
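The per-session VAS improvement reported above is the relative change between pre- and post-session scores. A minimal sketch of that calculation with made-up numbers (the variable names and data are illustrative only; the study's actual analysis code is not published):

```python
import numpy as np

# Hypothetical example: VAS scores (0-10) before and after each of the
# five 3DARS sessions for one patient. Real values come from the study data.
vas_pre  = np.array([7.0, 6.5, 6.0, 5.5, 5.0])
vas_post = np.array([5.0, 4.5, 4.0, 3.5, 3.0])

# Relative improvement per session, as a percentage of the pre-session score.
improvement_pct = 100.0 * (vas_pre - vas_post) / vas_pre

# Mean per-session improvement for this patient (the study reports a mean
# of 29% across all patients and sessions).
print(improvement_pct.mean())
```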
Virtual Reality International Conference | 2014
Benoît Penelle; Olivier Debeir
Often presented as competing products on the market of low-cost 3D sensors, the Kinect™ and the Leap Motion™ (LM) can actually be complementary in some scenarios. In this paper, we promote the fusion of data acquired by both the LM and Kinect sensors to improve hand tracking performance. The sensor fusion is applied to an existing augmented reality system targeting the treatment of phantom limb pain (PLP) in upper limb amputees. With the Kinect we acquire 3D images of the patient in real time. These images are post-processed to apply a mirror effect along the sagittal plane of the body before being displayed back to the patient in 3D, giving them the illusion of having two arms. The patient uses the virtual reconstructed arm to perform given tasks involving interactions with virtual objects. Thanks to the plasticity of the brain, the restored visual feedback of the missing arm makes it possible, in some cases, to reduce the pain intensity. The Leap Motion brings to the system the ability to perform accurate motion tracking of the hand, including the fingers. By registering the position and orientation of the LM in the frame of reference of the Kinect, we enable our system to accurately detect interactions of the hand and fingers with virtual objects, which greatly improves the user experience. We also show that the sensor fusion nicely extends the tracking domain by supplying finger positions even when the Kinect sensor fails to acquire the depth values for the hand.
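The registration step described above amounts to expressing Leap Motion measurements in the Kinect's frame of reference through a rigid transform, after which the same sagittal mirror can be applied to the tracked points. A minimal sketch, assuming the rotation R and translation t of the LM relative to the Kinect have already been estimated (all names and values here are illustrative, not the authors' implementation):

```python
import numpy as np

def to_kinect_frame(points_lm, R, t):
    """Map Nx3 Leap Motion points into the Kinect frame using a
    previously estimated rigid transform (rotation R, translation t)."""
    return points_lm @ R.T + t

def mirror_sagittal(points, plane_point, plane_normal):
    """Reflect Nx3 points across the body's sagittal plane, defined by a
    point on the plane and the plane's normal vector."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = (points - plane_point) @ n          # signed distance to the plane
    return points - 2.0 * d[:, None] * n    # reflection across the plane

# Illustrative values: identity registration and a sagittal plane through
# the origin with its normal along the x axis.
R = np.eye(3)
t = np.zeros(3)
fingertips_lm = np.array([[0.10, 0.05, 0.30]])           # metres, LM frame
fingertips_kinect = to_kinect_frame(fingertips_lm, R, t)
mirrored = mirror_sagittal(fingertips_kinect,
                           plane_point=np.zeros(3),
                           plane_normal=np.array([1.0, 0.0, 0.0]))
```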
International Conference on 3D Imaging | 2011
Benoît Penelle; Arnaud Schenkel; Nadine Warzée
An RGB-D image combines, for each pixel, the classical three color channels with a fourth channel providing depth information. Devices that produce RGB-D images in real time with rather good resolution are currently available on the market. With this type of device, it is possible to acquire and process, in real time, 3D textured information, paving the way for numerous applications in the field of computer imaging and vision. In this paper, we analyse the accuracy of a low-cost system and show how this kind of device and the RGB-D images it produces allow us to acquire 3D models of real objects. A first application is presented that combines multiple RGB-D images of a static scene, taken from different viewpoints, in order to reconstruct a complete 3D model of the scene. A second application combines on-the-fly RGB-D images coming from multiple devices, generating a 3D model in which the problems of occlusion inherent in monocular observations are drastically reduced.
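Reconstructing 3D models from RGB-D images relies on back-projecting each pixel's depth value through the camera intrinsics to obtain a 3D point that can then be paired with the pixel's color. A minimal sketch using the standard pinhole model (the intrinsic values and depth map below are placeholders, not those of any particular sensor):

```python
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Convert a depth map (metres, HxW) into an HxWx3 array of 3D points
    in the camera frame using the pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.dstack((x, y, depth))

# Placeholder intrinsics and a tiny synthetic depth map for illustration.
depth = np.full((4, 4), 1.5)                    # every pixel 1.5 m away
points = backproject(depth, fx=525.0, fy=525.0, cx=2.0, cy=2.0)

# Pairing each 3D point with the color of the corresponding RGB pixel
# yields the textured point cloud used for model reconstruction.
```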
ICDVRAT 2012 - 9th International Conference on Disability, Virtual Reality and Associated Technologies | 2012
Benoît Penelle; Dominique Mouraux; Eric Brassinne; Antoine Nonclercq; Nadine Warzée
Archive | 2014
Benoît Penelle; Nadine Warzée; Olivier Debeir
1st Symposium on Serious Gaming Technology as Clinical Tool in Rehabilitation | 2014
Dominique Mouraux; Benoît Penelle; Eric Brassinne; Stéphane Sobczak; Antoine Nonclercq; Turgay Tuna; Nadine Warzée
Motion in Games | 2013
Benoît Penelle; Olivier Debeir
20th Symposium Advances in Prosthetics and Surgical Reconstructions for Hand/Upper Extremity Amputees | 2012
Benoît Penelle; Dominique Mouraux; Eric Brassinne; Antoine Nonclercq; Nadine Warzée
11th Belgian Day on Biomedical Engineering | 2012
Benoît Penelle; Dominique Mouraux; Eric Brassinne; Antoine Nonclercq; Nadine Warzée
Archive | 2011
Arnaud Schenkel; Pierre Malarme; Thierry Leloup; David Wikler; Benoît Penelle; Nadine Warzée