Ali Sengül
École Polytechnique Fédérale de Lausanne
Publications
Featured research published by Ali Sengül.
PLOS ONE | 2012
Ali Sengül; Michiel van Elk; Giulio Rognini; Jane E. Aspell; Hannes Bleuler; Olaf Blanke
The effects of real-world tool use on body or space representations are relatively well established in cognitive neuroscience. Several studies have shown, for example, that active tool use results in a facilitated integration of multisensory information in peripersonal space, i.e. the space directly surrounding the body. However, it remains unknown to what extent similar mechanisms apply to the use of virtual-robotic tools, such as those used in the field of surgical robotics, in which a surgeon may use bimanual haptic interfaces to control a surgery robot at a remote location. This paper presents two experiments in which participants used a haptic handle, originally designed for a commercial surgery robot, to control a virtual tool. The integration of multisensory information related to the virtual-robotic tool was assessed by means of the crossmodal congruency task, in which subjects responded to tactile vibrations applied to their fingers while ignoring visual distractors superimposed on the tip of the virtual-robotic tool. Our results show that active virtual-robotic tool use changes the spatial modulation of the crossmodal congruency effects, comparable to changes in the representation of peripersonal space observed during real-world tool use. Moreover, when the virtual-robotic tools were held in a crossed position, the visual distractors interfered strongly with tactile stimuli delivered to the hand connected to the tool, reflecting a remapping of peripersonal space. Such remapping was observed not only when the virtual-robotic tools were actively used (Experiment 1), but also when they were held passively (Experiment 2). The present study extends earlier findings on the extension of peripersonal space from physical and pointing tools to virtual-robotic tools using techniques from haptics and virtual reality. We discuss our data with respect to learning and human factors in the field of surgical robotics and with respect to the use of new technologies in the field of cognitive neuroscience.
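As a rough illustration of how the crossmodal congruency effect (CCE) used above is typically quantified, the sketch below computes it as the reaction-time cost of spatially incongruent visual distractors relative to congruent ones. The trial fields and values are hypothetical and not taken from the paper's analysis code.

```python
# Minimal sketch of a crossmodal congruency effect (CCE) computation:
# the reaction-time cost of incongruent visual distractors relative to
# congruent ones. Field names and values are illustrative placeholders.
from statistics import mean

def crossmodal_congruency_effect(trials):
    """trials: iterable of dicts with 'congruent' (bool), 'rt' (s), 'correct' (bool)."""
    # Only correct responses enter the reaction-time analysis.
    rts_congruent = [t["rt"] for t in trials if t["correct"] and t["congruent"]]
    rts_incongruent = [t["rt"] for t in trials if t["correct"] and not t["congruent"]]
    # A larger positive difference indicates stronger visuo-tactile interference,
    # i.e. stronger integration of the (virtual) tool into peripersonal space.
    return mean(rts_incongruent) - mean(rts_congruent)

example_trials = [
    {"congruent": True, "rt": 0.52, "correct": True},
    {"congruent": False, "rt": 0.61, "correct": True},
    {"congruent": True, "rt": 0.55, "correct": True},
    {"congruent": False, "rt": 0.66, "correct": True},
]
print(f"CCE = {crossmodal_congruency_effect(example_trials) * 1000:.0f} ms")
```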
European Journal of Neuroscience | 2013
Giulio Rognini; Ali Sengül; Jane E. Aspell; Roy Salomon; Hannes Bleuler; Olaf Blanke
Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self‐generated hand movements affect such multisensory integration. Visuo‐tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo‐tactile integration by measuring cross‐modal congruency effects (CCEs) in different bimanual hand movement conditions with the use of a robotic platform. We found that classical CCEs also occurred during bimanual self‐generated hand movements, and that such movements lowered the magnitude of visuo‐tactile CCEs as compared to static conditions. Visuo‐tactile integration, body ownership and the sense of agency were decreased by adding a temporal visuo‐motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when decoupled from predictive motor information. The results suggest that current models of visuo‐tactile integration need to be extended to account for multisensory integration in dynamic conditions.
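The temporal visuo-motor delay manipulation described above can be implemented by buffering hand-tracking samples and releasing them to the visual renderer only after a fixed latency. The outline below is an illustrative sketch under that assumption, not the authors' platform code.

```python
# Illustrative sketch of a fixed visuo-motor delay: hand poses are queued and
# only shown once they are at least `delay_s` seconds old.
from collections import deque

class DelayedFeedback:
    def __init__(self, delay_s):
        self.delay_s = delay_s
        self.buffer = deque()  # (timestamp, hand_pose) pairs

    def push(self, timestamp, hand_pose):
        self.buffer.append((timestamp, hand_pose))

    def pop_for_display(self, now):
        """Return the newest pose that is at least `delay_s` old, or None."""
        pose = None
        while self.buffer and now - self.buffer[0][0] >= self.delay_s:
            _, pose = self.buffer.popleft()
        return pose

fb = DelayedFeedback(delay_s=0.5)
fb.push(0.0, "pose@t0")
print(fb.pop_for_display(now=0.6))  # -> "pose@t0", rendered 0.5 s late
```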
Applied Bionics and Biomechanics | 2010
Laura Santos-Carreras; Ricardo Beira; Ali Sengül; Roger Gassert; Hannes Bleuler
The introduction of Minimally Invasive Surgery (MIS) has revolutionised surgical care, considerably improving the quality of many surgical procedures. Technological advances, particularly in robotic surgery systems, have reduced the complexity of such an approach, paving the way for even less invasive surgical trends. However, the fact that haptic feedback has been progressively lost through this transition is an issue that to date has not been solved. Whereas traditional open surgery provides full haptic feedback, the introduction of MIS has eliminated the possibility of direct palpation and tactile exploration. Nevertheless, these procedures still provide a certain amount of force feedback through the rigid laparoscopic tool. Many of the current telemanipulated robotic surgical systems, in turn, do not provide full haptic feedback, which to a certain extent can be explained by the requirement of force sensors integrated into the tools of the slave robot and actuators in the surgeon's master console. In view of the increased complexity and cost, the benefit of haptic feedback is open to dispute. Nevertheless, studies have shown the importance of haptic feedback, especially when visual feedback is unreliable or absent. In order to explore the importance of haptic feedback for the surgeon's master console of a novel teleoperated robotic surgical system, we identified a typical surgical task where performance could potentially be improved by haptic feedback and investigated performance with and without this feedback. Two rounds of experiments were performed with 10 subjects, six of them with a medical background. Results show that feedback conditions, including force feedback, significantly improve task performance independently of the operator's suturing experience. There is, however, no further significant improvement when torque feedback is added. Consequently, it is deduced that force feedback in translations improves subjects' dexterity, while torque feedback might not further benefit such a task.
Experimental Brain Research | 2013
Ali Sengül; Giulio Rognini; Michiel van Elk; Jane E. Aspell; Hannes Bleuler; Olaf Blanke
The present study investigated the effects of force feedback in relation to tool use on the multisensory integration of visuo-tactile information. Participants learned to control a robotic tool through a surgical robotic interface. Following tool-use training, participants performed a crossmodal congruency task, by responding to tactile vibrations applied to their hands, while ignoring visual distractors superimposed on the robotic tools. In the first experiment it was found that tool-use training with force feedback facilitates multisensory integration of signals from the tool, as reflected in a stronger crossmodal congruency effect with the force feedback training compared to training without force feedback and to no training. The second experiment extends these findings by showing that training with realistic online force feedback resulted in a stronger crossmodal congruency effect compared to training in which force feedback was delayed. The present study highlights the importance of haptic information for multisensory integration and extends findings from classical tool-use studies to the domain of robotic tools. We argue that such crossmodal congruency effects are an objective measure of robotic tool integration and propose some potential applications in surgical robotics, robotic tools, and human–tool interaction.
international conference on human system interactions | 2011
Laura Santos-Carreras; Ali Sengül; Marc Vollenweider; Hannes Bleuler
The coming years will see a rapid development of telemanipulators for robotic surgery. These systems will open up new possibilities in surgery. Patient safety and acceptance among surgeons are two key issues in this context. We believe that haptic feedback at the console will be an essential feature for both increased safety and acceptance among surgeons. We aim to give the surgeon the feeling that he or she is directly operating with his/her own hands on the patient's body. For this, multimodal haptic feedback will be necessary, i.e. the merging of force feedback, temperature sensing, pressure sensing, and texture rendering, in addition to an ergonomic design of the console itself. We will present some preliminary designs of such systems, which are currently being developed in the framework of several projects.
international conference on control, automation and systems | 2010
Mathieu Stephan; Giulio Rognini; Ali Sengül; Ricardo Beira; Laura Santos-Carreras; Hannes Bleuler
This paper reports the design of a Minimally Invasive Surgery (MIS) gripper with four-degree-of-freedom force-sensing capabilities. It will be used to provide force feedback during surgical interventions in which the surgeon remotely manipulates surgical instruments through a robotic arm inserted directly into the patient's insufflated abdominal cavity. Suturing, dissection and ablation instruments will be attached to this 8 mm × 9 mm × 3 mm MIS gripper. Finite Element Analysis is used to model the gripper and determine the deformation matrix coefficients. The gripping force and the forces applied along the X, Y and Z Cartesian directions can be measured with a resolution of 0.1 N up to a maximum force of 10 N. However, a significant difference is found between the values predicted by the Finite Element model and those obtained in the characterization of the force sensor. This divergence is due to misalignments of the strain gauges located on the blades of the gripper. Future work will focus on reducing the misalignment of the force sensors as well as other error sources.
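The calibration step implied above, recovering the gripping force and the X/Y/Z force components from strain-gauge readings through a linear deformation matrix, can be sketched as follows. The fitting procedure, dimensions, and synthetic data are illustrative assumptions, not the coefficients reported in the paper.

```python
# Hedged sketch: map strain-gauge readings to [F_grip, F_x, F_y, F_z] via a
# linear calibration (deformation) matrix fitted by least squares.
import numpy as np

def calibrate(strains, forces):
    """Fit C such that forces ≈ strains @ C.
    strains: (n_samples, n_gauges), forces: (n_samples, 4)."""
    C, *_ = np.linalg.lstsq(strains, forces, rcond=None)
    return C

def estimate_forces(C, strain_reading):
    """Map one strain-gauge reading (n_gauges,) to the four force components in N."""
    return strain_reading @ C

# Hypothetical example: 4 gauges, synthetic calibration data.
rng = np.random.default_rng(0)
true_C = rng.normal(size=(4, 4))
strains = rng.normal(size=(100, 4))
forces = strains @ true_C
C = calibrate(strains, forces)
print(estimate_forces(C, strains[0]))  # ≈ forces[0]
```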
intelligent autonomous systems | 2013
Ali Sengül; Attila Barsi; David Ribeiro; Hannes Bleuler
In this study, the role of different 3D vision systems in patient safety in the context of robotic surgery was investigated. Safety is clearly of foremost importance in all surgical procedures. It is well studied for clinical surgical procedures, but the role of different 3D vision systems in patient safety is hardly ever addressed. The quality of the 3D images and the role of force feedback were assessed with two distinct methods (spatial estimation and depth perception) in two different vision systems (holographic and stereovision). The main idea of this study is to quantitatively investigate the role of the vision system in patient safety.
international conference on human haptic sensing and touch enabled computer applications | 2018
Ali Sengül; Michiel van Elk; Olaf Blanke; Hannes Bleuler
Effective tool use relies on the integration of multisensory signals related to one’s body and the tool. It has been shown that active tool use results in an extension of peripersonal space, i.e., the space directly surrounding the human body. In the present studies we investigated whether the mere observation of a virtual tool that could be manipulated via a haptic robotic interface would also affect the perception of peripersonal space. Participants passively observed a tool being used (Study 1) and received simple visuotactile feedback related to the tool (Study 2). We assessed the extension of peripersonal space by using the crossmodal congruency task, which measures the interference of observed visual distractors presented at the tool on judgments about tactile stimuli presented to the fingers. We found that passive observation of tool use resulted in a crossmodal congruency effect for both crossed and uncrossed arm/tool use postures (Study 1). This effect was even more pronounced when participants were presented with simple visuo-tactile feedback during the observation phase (Study 2). These findings suggest that additional visuotactile feedback enhances the integration of the tools into the body schema. We discuss the relevance of these findings for the development of surgical robotics, virtual tool use and for motor rehabilitation.
international conference on human haptic sensing and touch enabled computer applications | 2018
Mohssen Hosseini; Yudha Pane; Ali Sengül; Joris De Schutter; Herman Bruyninckx
A compact and lightweight wearable haptic glove (ExoTen-Glove) based on a Twisted String Actuation (TSA) system is presented in this paper. The proposed system uses two actuators with small DC motors and an integrated force sensor based on optoelectronic components. The ExoTen-Glove can provide force feedback to the thumb on one side and to the other fingers, grouped together, on the other side. This configuration was selected to provide the user with force feedback during the execution of grasping tasks in a virtual reality environment, allowing the stiffness of different objects to be felt. Thus, for this first evaluation of the ExoTen-Glove, we focus only on feedback to the thumb and index finger. The paper reports the design of the haptic glove, the description of the actuation system, the embedded controller, and the preliminary experimental evaluation of the device. The ExoTen-Glove was evaluated by means of a simple experiment in a virtual environment involving 2-DOF grasping of a rigid and a compliant virtual object (spring) with the thumb and index finger, to show the applicability of the system for rehabilitation and haptic feedback purposes. Results of the experiments showed that the ExoTen-Glove significantly improved stiffness evaluation for both high and low spring stiffness, and users were able to distinguish differences in virtual spring stiffness easily and with high accuracy.
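For context, the force transmission of a twisted string actuator such as the one driving a glove like the ExoTen-Glove is commonly described by the textbook helix model sketched below; this is a generic model, not necessarily the one used by the authors, and the geometry values in the example are placeholders rather than the glove's actual parameters.

```python
# Sketch of standard twisted-string-actuation (TSA) kinematics: twisting a
# string pair shortens it, converting motor rotation into linear contraction.
import math

def tsa_contraction(theta, L, r):
    """Axial contraction (m) of a string of untwisted length L (m) and
    radius r (m) after the motor has turned by theta (rad)."""
    return L - math.sqrt(L**2 - (theta * r) ** 2)

def tsa_motor_torque(theta, L, r, f_tendon):
    """Motor torque (Nm) needed to hold tendon tension f_tendon (N),
    neglecting friction: torque = tension * d(contraction)/d(theta)."""
    return f_tendon * (theta * r**2) / math.sqrt(L**2 - (theta * r) ** 2)

# Placeholder example: 0.2 m string, 0.5 mm radius, 30 motor turns, 10 N tension.
theta = 30 * 2 * math.pi
print(f"contraction = {tsa_contraction(theta, 0.2, 0.0005) * 1000:.1f} mm")
print(f"motor torque = {tsa_motor_torque(theta, 0.2, 0.0005, 10.0) * 1000:.2f} mNm")
```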
Studies in health technology and informatics | 2012
Hannes Bleuler; Laura Santos-Carreras; Ali Sengül; Giulio Rognini