Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Antoine Widmer is active.

Publication


Featured research published by Antoine Widmer.


Systems, Man and Cybernetics | 2010

Effects of the Alignment Between a Haptic Device and Visual Display on the Perception of Object Softness

Antoine Widmer; Yaoping Hu

Virtual reality (VR) has been gaining popularity in surgical planning and simulation. Most VR surgical simulation systems provide haptic (pertaining to the sense of touch) and visual information simultaneously, using certain alignments between a haptic device and visual display. A critical aspect of such VR surgical systems is to represent both haptic and visual information accurately to avoid perceptual illusions (e.g., when distinguishing the softness of organs/tissues). This study compared three different alignments (same-location alignment, vertical alignment, and horizontal alignment) between a haptic device and visual display that are widely used in VR systems. We conducted three experiments to study the influence of each alignment on the perception of object softness. In each experiment, we tested 15 different human subjects with varying availability of haptic and visual information. During each trial, the task of the subject was to discriminate object softness between two deformable balls shown at different viewing angles. We analyzed the following dependent measurements: subjective perception of object softness and objective measurements of maximum force and maximum pressing depth. The analysis results reveal that all three alignments (independent variables) have a similar effect on the subjective perception of object softness within the interval of viewing angles from -7.5° to +7.5°. The viewing angle does not affect the objective measurements. The same-location alignment requires less physical effort than the other two alignments. These observations have implications for creating accurate simulation and interaction in VR surgical systems.
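
A minimal sketch of how the within-subject analysis described above could be set up in Python with statsmodels. The column names (subject, alignment, viewing_angle, max_force) and the CSV file are hypothetical placeholders for illustration, not the authors' actual data or pipeline.

```python
# Hypothetical sketch of a repeated-measures (within-subject) ANOVA like the
# one described above. Column names and the CSV file are placeholders.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One row per subject x alignment x viewing angle, holding the averaged measurement.
df = pd.read_csv("softness_trials.csv")  # columns: subject, alignment, viewing_angle, max_force

# Within-subject ANOVA: do alignment or viewing angle affect maximum force?
result = AnovaRM(
    data=df,
    depvar="max_force",
    subject="subject",
    within=["alignment", "viewing_angle"],
).fit()
print(result)
```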


International Conference of the IEEE Engineering in Medicine and Biology Society | 2014

Facilitating medical information search using Google Glass connected to a content-based medical image retrieval system

Antoine Widmer; Roger Schaer; Dimitrios Markonis; Henning Müller

Wearable computing devices are starting to change the way users interact with computers and the Internet. Among them, Google Glass includes a small screen located in front of the right eye, a camera filming in front of the user, and a small computing unit. Google Glass has the advantage of providing online services while leaving the user's hands free to perform tasks. These augmented glasses enable many useful applications, including in the medical domain. For example, Google Glass can easily provide video conferencing between medical doctors to discuss a live case. Such glasses can also facilitate medical information search by allowing access to a large number of annotated medical cases during a consultation, in a non-disruptive fashion for medical staff. In this paper, we present a Google Glass application able to take a photo and send it to a medical image retrieval system along with keywords in order to retrieve similar cases. As a preliminary assessment of the usability of the application, we tested it under three conditions (images of the skin; printed CT scans and MRI images; and CT and MRI images acquired directly from an LCD screen) to explore whether using Google Glass affects the accuracy of the results returned by the medical image retrieval system. The preliminary results show that, despite minor problems due to the relative stability of the Google Glass, images can be sent to and processed by the medical image retrieval system and similar images are returned to the user, potentially helping in the decision-making process.
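
A minimal sketch of the client side of such a setup: posting a captured photo plus keywords to a content-based image retrieval endpoint over HTTP. The URL, form field names, and response format are assumptions for illustration, not the actual interface of the retrieval system used in the paper.

```python
# Hypothetical client sketch: send a photo and keywords to a medical image
# retrieval service and print the returned similar cases. The endpoint URL,
# form field names, and JSON layout are illustrative assumptions.
import requests

RETRIEVAL_URL = "https://example.org/cbir/search"  # placeholder endpoint

def search_similar_cases(image_path, keywords):
    with open(image_path, "rb") as f:
        response = requests.post(
            RETRIEVAL_URL,
            files={"image": f},
            data={"keywords": " ".join(keywords)},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()  # assumed to be a list of similar cases

if __name__ == "__main__":
    cases = search_similar_cases("skin_photo.jpg", ["dermatology", "lesion"])
    for case in cases:
        print(case)
```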


International Conference of the IEEE Engineering in Medicine and Biology Society | 2015

Live ECG readings using Google Glass in emergency situations

Roger Schaer; Fanny Salamin; Oscar Alfonso Jiménez del Toro; Manfredo Atzori; Henning Müller; Antoine Widmer

Most sudden cardiac problems require rapid treatment to preserve life. In this regard, electrocardiograms (ECG) shown on vital parameter monitoring systems help medical staff to detect problems. In some situations, such monitoring systems may display information in a less than convenient way for medical staff. For example, vital parameters are displayed on large screens outside the field of view of a surgeon during cardiac surgery. This may lead to lost time and to mistakes when problems occur during cardiac operations. In this paper, we present a novel approach to displaying vital parameters, such as the second derivative of the ECG rhythm and the heart rate, close to the field of view of a surgeon using Google Glass. As a preliminary assessment, we ran an experimental study to verify whether medical staff can identify abnormal ECG rhythms from Google Glass. This study compares readings of 6 ECG rhythms from a 13.3-inch laptop screen and from the prism of Google Glass. Seven medical residents in internal medicine participated in the study. The preliminary results show that there is no difference between identifying these 6 ECG rhythms on the laptop screen and on Google Glass; both allow close to perfect identification of the 6 common ECG rhythms. This shows the potential of connected glasses such as Google Glass to be useful in selected medical applications.
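
A minimal sketch of the kind of signal processing mentioned above (second derivative of the ECG trace and heart rate from R-peak intervals), using NumPy and SciPy. The sampling rate and peak-detection settings are illustrative assumptions, not parameters from the paper.

```python
# Hypothetical sketch: second derivative of an ECG trace and heart rate from
# R-peak intervals. Sampling rate and peak-detection settings are assumptions.
import numpy as np
from scipy.signal import find_peaks

FS = 250  # assumed sampling rate in Hz

def second_derivative(ecg):
    """Second time derivative of the ECG signal."""
    dt = 1.0 / FS
    return np.gradient(np.gradient(ecg, dt), dt)

def heart_rate_bpm(ecg):
    """Estimate heart rate from R-peak intervals (very rough sketch)."""
    # Require peaks to be at least 0.4 s apart and reasonably prominent.
    peaks, _ = find_peaks(ecg, distance=int(0.4 * FS), prominence=0.5)
    if len(peaks) < 2:
        return float("nan")
    rr_intervals = np.diff(peaks) / FS  # seconds between consecutive R-peaks
    return 60.0 / np.mean(rr_intervals)
```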


Systems, Man and Cybernetics | 2011

An evaluation method for real-time soft-tissue model used for multi-vertex palpation

Antoine Widmer; Yaoping Hu

Soft-tissue palpation plays an important role in diagnosing various diseases. Palpation skills are tedious to learn due to the difficulty of describing the sense of touch. Because of its interactive nature, a virtual reality (VR) training system embedding real-time soft-tissue models may be helpful for teaching such skills to medical residents. Studies show that such a VR system impacts human perception during palpation at various levels, largely due to the real-time models. We therefore propose a formal method for evaluating real-time models that takes human perception into account. Based upon surface (multi-vertex) contact with 4 force distributions, the evaluation compared a real-time model with a Finite Element Method (FEM) model featuring physical parameters. The comparison consisted of two statistical approaches, ANOVA and Bland-Altman agreement, to assess both visual displacement and force feedback. A case study demonstrated the advantages provided by this evaluation method.
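
A minimal sketch of the Bland-Altman agreement computation referenced above, applied to paired measurements from the two models (for example, per-vertex displacements from the real-time model and the FEM model). The variable names and example values are placeholders.

```python
# Hypothetical sketch: Bland-Altman agreement between paired measurements from
# the real-time model and the FEM model (e.g., per-vertex displacements).
import numpy as np

def bland_altman(realtime, fem):
    """Return mean difference (bias) and 95% limits of agreement."""
    realtime = np.asarray(realtime, dtype=float)
    fem = np.asarray(fem, dtype=float)
    diff = realtime - fem
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Example with made-up displacement values (millimetres):
rt = [1.02, 0.98, 1.10, 0.95, 1.05]
fe = [1.00, 1.01, 1.08, 0.97, 1.03]
bias, limits = bland_altman(rt, fe)
print(f"bias = {bias:.3f} mm, 95% limits of agreement = {limits}")
```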


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

A viscoelastic model of a breast phantom for real-time palpation

Antoine Widmer; Yaoping Hu

Palpation of soft tissues helps to diagnose various diseases within the tissues. The current method of training palpation on a physical phantom provides little feedback to the trainee. Similar to a robot-assisted surgical system, a virtual reality (VR) system could be suitable for such training due to its interactive nature. In such a VR system, studies have shown that human perception of objects is insensitive to subtle discrepancies in a simulation. Based upon this observation, we propose a real-time viscoelastic model of a breast phantom (as soft tissue). The model consists of a surface membrane and an inside gel. We evaluate this model through a comparison with a Finite Element Method (FEM) model featuring physical parameters and different force contacts. The results show that the model can handle multi-vertex force contact at an arbitrary location and yields reasonably accurate deformation compared to the FEM model.
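
The abstract does not give the equations of the membrane-plus-gel model, but a common building block for real-time viscoelastic simulation is the Kelvin-Voigt element (a spring and a damper in parallel). The sketch below integrates one such element explicitly; the stiffness, damping, mass, and time-step values are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical sketch of a single Kelvin-Voigt element (spring + damper in
# parallel), a common building block of real-time viscoelastic models.
# Stiffness, damping, mass, and time step are illustrative values only.

def simulate_kelvin_voigt(force, steps=1000, dt=1e-3, k=200.0, c=5.0, m=0.01):
    """Explicit integration of m*x'' = force - k*x - c*x'. Returns displacements."""
    x, v = 0.0, 0.0
    trajectory = []
    for _ in range(steps):
        a = (force - k * x - c * v) / m
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return trajectory

# Press with a constant 1 N force; the quasi-static depth approaches
# force / k = 5 mm for these made-up parameters.
depths = simulate_kelvin_voigt(force=1.0)
print(f"final depth ≈ {depths[-1] * 1000:.2f} mm")
```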


Canadian Conference on Electrical and Computer Engineering | 2010

Statistical comparison between a real-time model and a FEM counterpart for visualization of breast phantom deformation during palpation

Antoine Widmer; Yaoping Hu

In developing a virtual reality simulation for learning breast palpation, one of the critical aspects is real-time visualization of breast phantom deformation during palpation. Available models are either offline ones based on Finite Element Method (FEM) analysis that take into account some material parameters of deformable objects, or real-time ones that struggle to balance this consideration against realistic visualization. For visual perception of breast phantom deformation, we used a real-time model with an internal pressure that keeps the volume of the breast phantom constant. On a meshed breast phantom, we compared the displacements of vertices governed by the real-time model with those governed by its FEM counterpart for four different distributions of contact force. To assess visual perception of breast phantom deformation, we examined the comparison using the statistical methods of ANOVA and Bland-Altman agreement. The results revealed that the displacements of vertices governed by the real-time model are in agreement with those given by its FEM counterpart for each distribution of contact force. This observation indicates the potential of our real-time model for visualizing breast phantom deformation during palpation.
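
A minimal sketch of the "internal pressure keeps the volume constant" idea mentioned above: compute the enclosed volume of a closed triangle mesh, then push vertices outward along their face normals with a force proportional to the volume deficit. The pressure gain and mesh layout are assumptions for illustration, not the authors' formulation.

```python
# Hypothetical sketch of a constant-volume pressure term for a closed triangle
# mesh: vertices pushed along face normals in proportion to the volume deficit.
# The pressure gain and mesh layout are illustrative assumptions.
import numpy as np

def mesh_volume(vertices, triangles):
    """Signed volume of a closed triangle mesh via the divergence theorem."""
    v0 = vertices[triangles[:, 0]]
    v1 = vertices[triangles[:, 1]]
    v2 = vertices[triangles[:, 2]]
    return np.sum(np.einsum("ij,ij->i", v0, np.cross(v1, v2))) / 6.0

def pressure_forces(vertices, triangles, rest_volume, gain=50.0):
    """Outward per-vertex forces that resist any loss of enclosed volume."""
    deficit = rest_volume - mesh_volume(vertices, triangles)
    pressure = gain * deficit                  # simple proportional law
    forces = np.zeros_like(vertices)
    for tri in triangles:
        a, b, c = vertices[tri]
        area_normal = np.cross(b - a, c - a) / 2.0   # face normal scaled by area
        forces[tri] += pressure * area_normal / 3.0  # spread over the 3 vertices
    return forces
```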


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Difference of perceiving object softness during palpation through single-node and multi-node contacts

Antoine Widmer; Yaoping Hu

Virtual reality (VR) simulators can offer alternatives for training procedures in the medical field. Most current VR simulators consider single-node contact for interacting with an object to convey displacement and force on a discrete mesh. However, a single-node contact does not closely simulate palpation, which requires a contact surface made of multiple nodes to touch a soft object. We therefore hypothesized that the softness of a deformable object (such as a virtual breast phantom) palpated through a single-node contact would be perceived differently from that of the same phantom palpated through a multi-node contact with various force arrays. We conducted a study to investigate this hypothesis. Using a co-located VR setup that aligns visual and haptic stimuli at the same spatial location, we tested 15 human participants under conditions of both visual and haptic stimuli available and of only the visual (or haptic) stimulus available. In each trial, a participant palpated and discriminated two virtual breast phantoms of the same softness through different contacts with varying force arrays. The results revealed that virtual breast phantoms palpated through a single-node contact were consistently perceived as harder than their counterparts palpated through a multi-node contact with varying force arrays when visual stimuli were available. These results imply a constraint for developing a VR system for training palpation.
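
A minimal sketch of the difference between the two contact types discussed above: a single-node contact assigns the full palpation force to the nearest mesh vertex, while a multi-node contact spreads it over all vertices inside a contact radius, here with a Gaussian falloff. The radius and falloff are illustrative assumptions, not the force arrays used in the study.

```python
# Hypothetical sketch contrasting single-node and multi-node contact: the total
# palpation force is either given to the closest vertex or spread, with a
# Gaussian falloff, over all vertices within an assumed contact radius.
import numpy as np

def single_node_forces(vertices, contact_point, total_force):
    forces = np.zeros_like(vertices)
    nearest = np.argmin(np.linalg.norm(vertices - contact_point, axis=1))
    forces[nearest] = total_force
    return forces

def multi_node_forces(vertices, contact_point, total_force, radius=0.01):
    distances = np.linalg.norm(vertices - contact_point, axis=1)
    weights = np.exp(-(distances / radius) ** 2)
    weights[distances > radius] = 0.0
    if weights.sum() == 0.0:            # fall back to single-node contact
        return single_node_forces(vertices, contact_point, total_force)
    weights /= weights.sum()            # weights sum to 1, total force is conserved
    return weights[:, None] * total_force
```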


IEEE Wireless Health (WH) | 2016

Using smart glasses in medical emergency situations, a qualitative pilot study

Roger Schaer; Thomaz Melly; Henning Müller; Antoine Widmer

Medical emergency situations occurring outside a hospital require a wide range of competencies, from the safe transportation of a patient to his/her medical stabilization before transport. Paramedics are trained to face such situations and can handle most of them very well. Some situations, however, require precise skills and knowledge that are very common in a hospital setting but less so in prehospital settings. Currently, paramedics have to work mostly disconnected from hospital skills and knowledge. This may lead to delays in patient care and to loss of information between the accident site and the hospital. In this paper, we present a pilot study assessing a new communication platform for prehospital care. With this platform, paramedics can access medical knowledge from hospital specialists directly at the accident site via video conferencing using smart glasses. The platform also transmits the patient's vital parameters without delay, so the specialist can follow the patient remotely and advise the paramedics simultaneously. The preliminary results show that although the platform adds workload for the paramedics, it can add value for patient care: the emergency physician was more confident in giving advice when he/she could see the video and the vital parameters sent directly from the accident site. Furthermore, the emergency physician saw added value in the capacity to prepare for the arrival of the patient at the hospital, improving the continuity of care.
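
A minimal sketch of the kind of vital-parameter transmission such a platform performs: the paramedic side periodically posts the latest readings as JSON to a server that the remote specialist watches. The endpoint, field names, and update interval are illustrative assumptions, not the platform's actual protocol.

```python
# Hypothetical sketch: periodically push the latest vital parameters as JSON to
# a remote endpoint. URL, field names, and interval are illustrative only.
import time
import requests

VITALS_URL = "https://example.org/prehospital/vitals"  # placeholder endpoint

def push_vitals(heart_rate, spo2, systolic_bp, diastolic_bp):
    payload = {
        "timestamp": time.time(),
        "heart_rate_bpm": heart_rate,
        "spo2_percent": spo2,
        "blood_pressure_mmhg": [systolic_bp, diastolic_bp],
    }
    response = requests.post(VITALS_URL, json=payload, timeout=5)
    response.raise_for_status()

# Example loop: send a reading every 2 seconds (values would come from the monitor).
# while True:
#     push_vitals(heart_rate=82, spo2=97, systolic_bp=124, diastolic_bp=78)
#     time.sleep(2)
```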


IEEE Virtual Reality Conference | 2009

Subjective Perception and Objective Measurements in Perceiving Object Softness for VR Surgical Systems

Antoine Widmer; Yaoping Hu

A critical issue for virtual reality (VR) surgical systems is to correctly represent both haptic and visual information for distinguishing the softness of organs/tissues. We investigated the relationship between the subjective perception of object softness and objective measurements of haptic and visual information. On a co-located VR setup, human subjects pressed deformable balls (simulating organs/tissues) under the conditions of both haptic and visual information available and of only haptic (or visual) information available. We recorded and analyzed the subjects' selection (subjective perception) of the harder object between two balls, together with the objective measurements of maximum force (haptic) and pressing depth (visual). The preliminary results indicated that subjective perception behaves differently from objective measurements in perceiving object softness. This has implications for creating accurate simulation in VR surgical systems.


Canadian Conference on Electrical and Computer Engineering | 2007

Integration of the Senses of Vision and Touch in Perceiving Object Softness

Antoine Widmer; Yaoping Hu

Based upon virtual reality (VR) technologies, a challenging issue for surgical planning is to permit surgeons intuitive and accurate interaction using their senses of vision and touch (e.g., to distinguish the softness of tissues). Since the viewing angle (VA) can influence the apparent visual deformation of objects, we hypothesized that the VA would affect the perception of object softness. We conducted an experiment to test this hypothesis and to investigate the mechanisms of integrating the senses of vision and touch. Using a desktop VR setup, we tested 15 human participants on perceiving object softness under 3 conditions: (a) both visual and touch (haptic) information available; (b) only haptic information available; and (c) only visual information available. In each trial, participants had to select the harder object between two deformable balls of the same softness but placed at different VAs. Our results showed that the VA affected the perception of object softness (within-subject ANOVA; F = 8.62, p < 0.001): the larger the VA, the harder the ball was perceived to be. When two VAs differed by at least 15°, we found a significant difference in perceiving object softness (post-hoc Tukey test, p < 0.05). We applied the method of maximum likelihood estimation to compute the individual and combined weights of visual and haptic information during the perception of object softness. The computation revealed that visual information was predominant when the VA was at -15°, whereas haptic information prevailed when the VA was at +15°. We also found that the variance of the combined visual and haptic information lay between the individual variances of the visual and haptic information, indicating a dependency between the two. In conclusion, the VA should be kept within 15° to avoid perceptual illusions, and the visual and haptic information depend upon each other, contradicting the assumption of independence made in earlier studies.
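
The maximum-likelihood cue-combination model referenced above has a standard closed form when the cues are assumed independent and Gaussian: each cue is weighted by its inverse variance, and the combined variance is predicted to be lower than either single-cue variance. The sketch below computes those weights from single-cue discrimination variances; the numbers are made up, not values from the paper. The paper's finding that the combined variance fell between the single-cue variances is what argues against the independence assumption.

```python
# Standard maximum-likelihood cue combination under the independence assumption:
# each cue is weighted by its inverse variance (reliability). The numbers below
# are made-up single-cue variances, not values from the paper.
def mle_cue_weights(var_visual, var_haptic):
    w_visual = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_haptic)
    w_haptic = 1.0 - w_visual
    # Variance predicted for the combined estimate if the cues were independent.
    var_combined = (var_visual * var_haptic) / (var_visual + var_haptic)
    return w_visual, w_haptic, var_combined

w_v, w_h, var_c = mle_cue_weights(var_visual=0.8, var_haptic=1.6)
print(f"visual weight = {w_v:.2f}, haptic weight = {w_h:.2f}")
print(f"predicted combined variance = {var_c:.2f} (below both single-cue variances)")
```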

Collaboration


Dive into Antoine Widmer's collaborations.

Top Co-Authors

Roger Schaer
University of Applied Sciences Western Switzerland

Dimitrios Markonis
University of Applied Sciences Western Switzerland

Manfredo Atzori
University of Applied Sciences Western Switzerland