Sebastien Grange
École Polytechnique Fédérale de Lausanne
Publications
Featured research published by Sebastien Grange.
International Conference on Robotics and Automation | 2000
Sebastien Grange; Terrence Fong; Charles Baur
Our goal is to make vehicle teleoperation accessible to all users. To do this, we develop easy-to-use yet capable Web tools which enable efficient, robust teleoperation in unknown and unstructured environments. Web-based teleoperation, however, raises many research issues and prohibits the use of traditional approaches. Thus, it is essential to develop new methods which minimize bandwidth usage, which provide sensor fusion displays, and which optimize human-computer interaction. We believe that existing systems do not adequately address these issues and have severely limited capability and performance as a result. In this paper we present a system design for safe and reliable Web-based vehicle teleoperation, describe an active and dynamic user interface, and explain how our approach differs from existing systems.
Computer Assisted Radiology and Surgery | 2003
Gaëtan Marti; Patrice Rouiller; Sebastien Grange; Charles Baur
Abstract The VRAI group at EPFL is conducting research in the fields of virtual reality and haptics (force-feedback) for medical applications. In particular, we have developed visualization techniques for medical images from various sources, and a high-performance haptic interface. In this paper, we present a technique that combines visualization with haptic rendering to provide real-time assistance to medical gestures. To demonstrate this technique, we have developed the BiopsyNavigator, a system that provides haptic feedback to the surgeon using patient-specific data. Before the biopsy, it provides the surgeon with the ability to simulate the intervention. During the biopsy, haptic feedback is used first to help the surgeon find the target and define the optimal trajectory, then to physically guide the surgical gesture along the chosen path. Finally, haptic information is used to indicate that the target has been reached. Future developments will include real-time update of the patient model from various sources, including C-arm mounted CT and ultrasonic probes.
Intelligent Robots and Systems | 2011
Andreas Tobergte; Patrick Helmer; Ulrich Hagn; Patrice Rouiller; Sophie Thielmann; Sebastien Grange; Alin Albu-Schäffer; Francois Conti; Gerd Hirzinger
This paper presents the design and control of the sigma.7 haptic device and the new surgical console of the MiroSurge robotic system. The console and the haptic devices are designed with respect to the requirements of minimally invasive robotic surgery. Dedicated left- and right-handed devices are integrated in an operator console in an ergonomic configuration. The height of the whole console is adjustable, allowing the surgeon to operate either seated or standing. Each of the devices is fully actuated in seven degrees of freedom (DoF). A parallel mechanism with 3 DoF actuates the translational motion, and an attached wrist with 3 intersecting axes drives the rotations of the grasping unit. This advantageous design leads to inherently decoupled kinematics and dynamics. Cartesian forces reach 20 N within the translational workspace, which is a sphere of about 120 mm diameter for each device. The rotational wrist of the device covers the whole workspace of the human hand and provides maximum torques of about 0.4 Nm. The grasping unit can display forces up to 8 N. An integrated force/torque sensor is used to increase the transparency of the devices by reducing inertia and friction. It is theoretically shown that the non-linear closed-loop system behaves like a passive system, and experimental results validate the approach. The sigma.7 haptic devices were designed by Force Dimension in cooperation with the German Aerospace Center (DLR). DLR designed the surgical console and integrated the haptic devices in the MiroSurge system.
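The force and torque limits quoted in the abstract (20 N translation, 0.4 Nm wrist torque, 8 N grasp) suggest a simple illustration of how commanded forces might be kept within device capability. The following is a minimal sketch under that assumption; the function name and the direction-preserving saturation strategy are illustrative, not the paper's controller.

```python
import numpy as np

# Device limits as stated in the abstract for the sigma.7.
MAX_FORCE_N = 20.0      # Cartesian translational force limit
MAX_TORQUE_NM = 0.4     # wrist torque limit
MAX_GRASP_N = 8.0       # grasping-unit force limit

def saturate_command(force, torque, grasp):
    """Scale the translational force and wrist torque vectors so their
    magnitudes stay within the device limits (preserving direction),
    and clamp the scalar grasp force."""
    force = np.asarray(force, dtype=float)
    torque = np.asarray(torque, dtype=float)
    f_norm = np.linalg.norm(force)
    if f_norm > MAX_FORCE_N:
        force = force * (MAX_FORCE_N / f_norm)
    t_norm = np.linalg.norm(torque)
    if t_norm > MAX_TORQUE_NM:
        torque = torque * (MAX_TORQUE_NM / t_norm)
    grasp = max(0.0, min(grasp, MAX_GRASP_N))
    return force, torque, grasp
```

Scaling the whole vector, rather than clamping each axis independently, keeps the displayed force direction consistent with the command.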
Review of Scientific Instruments | 2005
Marc Jobin; Raphael Foschia; Sebastien Grange; Charles Baur; Gérard Gremaud; Kyumin Lee; L. Forro; A. Kulik
A nanoscale manipulation system has been designed and built through the integration of a force–feedback haptic device and a commercial atomic force microscope. The force–feedback interaction provides a very intuitive, efficient and reliable way for quick manipulation of nanoscale objects. Unlike other nanomanipulators, ours allows the user to feel the actual tip–sample interaction during the manipulation process. Various modes of manipulation have been implemented and evaluated. As a proof of concept, we show a contact-mode nanomanipulation of a carbon nanotube and a noncontact manipulation of silicon beads. In addition to nanomanipulation itself, all relevant signals can be recorded during the manipulation process, which allows quantitative interpretation of nanomechanics experiments.
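Coupling a haptic device to an AFM requires mapping between two very different scales: hand motions in millimetres become tip motions in nanometres, and tip–sample forces in nanonewtons are amplified to forces the user can feel. A minimal sketch of such a bilateral scaling follows; the scale factors are arbitrary assumptions for illustration, not values from the paper.

```python
# Hypothetical scale factors: 1 mm of hand motion -> 1 nm of tip motion,
# 1 nN of tip force -> 1 N at the haptic handle.
POSITION_SCALE = 1e-6   # tip metres per handle metre
FORCE_SCALE = 1e9       # handle newtons per tip newton

def hand_to_tip(hand_pos_m):
    """Convert handle displacement (m) to commanded tip displacement (m)."""
    return hand_pos_m * POSITION_SCALE

def tip_force_to_hand(tip_force_n):
    """Convert measured tip-sample force (N) to displayed force (N)."""
    return tip_force_n * FORCE_SCALE
```

The ratio of the two scale factors fixes the apparent stiffness of the nanoscale world at the handle, so both must be chosen together for stable, informative feedback.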
Intelligent Robots and Systems | 2002
Sebastien Grange; Emilio Casanova; Terrence Fong; Charles Baur
This paper describes the development of efficient computer vision techniques for human-computer interaction. Our approach combines range and color information to achieve efficient, robust tracking using consumer-level computer hardware and cameras. In this paper, we present the design of the Human Oriented Tracking (HOT) library, report our initial results, and describe our current efforts to improve HOT's performance through model-based tracking.
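The core idea of fusing range and color can be sketched in a few lines: a pixel is retained only if its depth lies in an expected band and its color matches a target model, and the surviving pixels vote for the target location. The thresholds, the hue-based color model, and the function names below are placeholder assumptions, not the HOT library's API.

```python
import numpy as np

def fused_mask(depth, hue, depth_range=(0.5, 1.5), hue_range=(0.0, 0.1)):
    """Boolean mask of pixels passing both the depth cue and the color cue.
    depth: per-pixel range in metres; hue: per-pixel hue in [0, 1]."""
    depth_ok = (depth >= depth_range[0]) & (depth <= depth_range[1])
    hue_ok = (hue >= hue_range[0]) & (hue <= hue_range[1])
    return depth_ok & hue_ok

def target_centroid(mask):
    """Centroid (row, col) of the fused mask, or None if no pixel passes."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

Requiring both cues to agree is what makes the tracker cheap and robust: color alone is confused by skin-like backgrounds, and depth alone by bystanders at the same range.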
International Conference on Multimodal Interfaces | 2004
Sebastien Grange; Terrence Fong; Charles Baur
We propose an architecture for a real-time multimodal system, which provides non-contact, adaptive user interfacing for Computer-Assisted Surgery (CAS). The system, called M/ORIS (for Medical/Operating Room Interaction System), combines gesture interpretation as an explicit interaction modality with continuous, real-time monitoring of the surgical activity in order to automatically address the surgeon's needs. Such a system will help reduce the surgeon's workload and operation time. This paper focuses on the proposed activity monitoring aspect of M/ORIS. We analyze the issues of Human-Computer Interaction in an OR based on real-world case studies. We then describe how we intend to address these issues by combining a surgical procedure description with parameters gathered from vision-based surgeon tracking and other OR sensors (e.g. tool trackers). We call this approach Scenario-based Activity Monitoring (SAM). We finally present preliminary results, including a non-contact mouse interface for surgical navigation systems.
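One way to picture scenario-based monitoring is as a procedure description matched step by step against observed sensor events, with the monitor advancing once the events expected for the current step have all been seen. The sketch below assumes that shape; the class, step names, and event strings are invented for illustration and do not reflect SAM's actual internals.

```python
class ScenarioMonitor:
    """Toy scenario matcher: the procedure is an ordered list of
    (step_name, required_events) pairs; the monitor advances to the
    next step once every required event has been observed."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.seen = set()

    def current_step(self):
        """Name of the active step, or None when the scenario is done."""
        if self.index < len(self.steps):
            return self.steps[self.index][0]
        return None

    def observe(self, event):
        """Feed one sensor event; advance when the step is complete."""
        if self.index >= len(self.steps):
            return
        self.seen.add(event)
        _, required = self.steps[self.index]
        if required <= self.seen:
            self.index += 1
            self.seen = set()
```

Knowing the active step is what lets the system anticipate needs, e.g. switching the navigation display before the surgeon asks for it.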
Intelligent Systems and Advanced Manufacturing | 2001
Sebastien Grange; Francois Conti; Patrick Helmer; Patrice Rouiller; Charles Baur
At the EPFL, we have developed a force-feedback device and control architecture for high-end research and industrial applications. The Delta Haptic Device (DHD) consists of a 6 degrees-of-freedom (DOF) mechatronic device driven by a PC. Several experiments have been carried out in the fields of manipulation and simulation to assess the dramatic improvement haptic information brings to manipulation. This system is particularly well suited for scaled manipulation such as micro-, nano- and biomanipulation. Not only can it perform geometric and force scaling, but it can also include fairly complex physical models into the control loop to assist manipulation and enhance human understanding of the environment. To demonstrate this ability, we are currently interfacing our DHD with an atomic force microscope (AFM). In a first stage, we will be able to feel in real-time the topology of a given sample while visualizing it in 3D. The aim of the project is to make manipulation of carbon nanotubes possible by including physical models of such nanotubes' behavior into the control loop, thus allowing humans to control complex structures. In this paper, we give a brief description of our device and present preliminary results of its interfacing with the AFM.
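Including a physical model in the control loop is often realized as a "virtual coupling": a simulated spring–damper between the haptic handle and the model state, so the user feels the model's resistance in addition to any scaled sensor forces. A per-axis sketch follows; the gains are arbitrary placeholder values, not those of the DHD controller.

```python
def coupling_force(x_handle, v_handle, x_object, v_object, k=200.0, b=5.0):
    """Spring-damper coupling force (N) displayed to the user on one axis.
    x_*: positions (m), v_*: velocities (m/s); k (N/m) and b (N*s/m) are
    illustrative stiffness and damping gains."""
    return k * (x_object - x_handle) + b * (v_object - v_handle)
```

The same structure works whether the "object" is a rigid-body simulation or a nanotube model; stiffer couplings feel crisper but demand higher control rates for stability.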
Robot and Human Interactive Communication | 2001
Terrence Fong; Sebastien Grange; Charles E. Thorpe; Charles Baur
Multi-robot remote driving has traditionally been a difficult problem. When an operator is forced to divide his limited resources (attention, decision making, etc.) among multiple robots, control becomes complicated and performance quickly deteriorates as a result. To remedy this, we need to find ways to make command generation and coordination efficient, so that human-robot interaction is transparent and tasks are easy to perform. We discuss the use of collaboration, human-robot dialogue and waypoint-based driving for vehicle teleoperation. We then describe how these techniques can enable a single operator to effectively control multiple mobile robots.
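Waypoint-based driving reduces the operator's load because each robot only needs a sparse list of goals and steers itself between them. A minimal sketch of that loop is below; the bearing-based steering and the arrival tolerance are assumptions for illustration, not the paper's controller.

```python
import math

def heading_to(pose, waypoint):
    """Bearing (rad) from the vehicle position (x, y) to a waypoint (x, y)."""
    return math.atan2(waypoint[1] - pose[1], waypoint[0] - pose[0])

def update_waypoints(pose, waypoints, tol=0.2):
    """Drop leading waypoints already within `tol` metres of the vehicle,
    returning the remaining list; the first entry is the active goal."""
    while waypoints and math.dist(pose, waypoints[0]) < tol:
        waypoints = waypoints[1:]
    return waypoints
```

Because the robot handles moment-to-moment steering, the operator's attention is only needed when a new waypoint list must be issued, which is what makes single-operator, multi-robot control feasible.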
Mobile Robots XV and Telemanipulator and Telepresence Technologies VII | 2001
Terrence Fong; Francois Conti; Sebastien Grange; Charles Baur
Remote driving is a difficult task. Not only do operators have problems perceiving and evaluating the remote environment, but they frequently make incorrect or sub-optimal control decisions. Thus, there is a need to develop alternative approaches which make remote driving easier and more productive. To address this need, we have developed three novel user interfaces: GestureDriver, HapticDriver and PdaDriver. In this paper, we present the motivation for and design of each interface. We also discuss research issues related to the use of gesture, haptics, and palm-size computers for remote driving. Finally, we describe lessons learned, potential applications and planned extensions for each interface.
International Conference of the IEEE Engineering in Medicine and Biology Society | 2004
E. Hagmann; Patrice Rouiller; Patrick Helmer; Sebastien Grange; Charles Baur
We have developed a new navigation approach for computer-assisted interventional radiology. Our system combines a virtual reality display with high-fidelity haptic rendering to provide assistance and guidance of the medical gesture. Specifically, the system is designed to improve the accuracy of blind needle placement within tissues. The proposed technique actively helps the surgeon while keeping him in control of the procedure. We have recently developed an experimental setup for CT-guided biopsy. The setup features a high-precision haptic device connected to the biopsy needle, combined with a graphical interface. The haptic system guides the surgeon's hand to the target tissue based on CT data, whereas a real-time, graphical visualization of the tool trajectory provides navigation information. The setup requires rigid registration of the patient with respect to the haptic interface. Tests have been performed in the presence of radiologists to validate the proposed concept, and early results show that the system is easy to use and requires little training. We are planning to conduct clinical testing in the near future to quantitatively assess system performance.
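Haptic guidance along a planned insertion path is commonly implemented as a "virtual fixture": the needle tip is pulled toward its projection on the entry-to-target line, so motion along the path is free while lateral deviation is resisted. The sketch below assumes that scheme; the stiffness value and function name are illustrative, not taken from the paper.

```python
import numpy as np

def guidance_force(tip, entry, target, k=100.0):
    """Force (N) pulling the needle tip toward the entry->target line.
    Zero for motion along the planned path; proportional to lateral
    deviation with illustrative stiffness k (N/m)."""
    tip, entry, target = (np.asarray(p, dtype=float) for p in (tip, entry, target))
    axis = target - entry
    axis = axis / np.linalg.norm(axis)
    proj = entry + np.dot(tip - entry, axis) * axis   # closest point on the line
    return k * (proj - tip)
```

Because the fixture never pushes along the path, the surgeon retains control of insertion depth, matching the abstract's goal of assisting without taking over the gesture.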