Tim Beyl
Karlsruhe Institute of Technology
Publication
Featured research published by Tim Beyl.
international conference on advanced robotics | 2013
Tim Beyl; Philip Nicolai; Jörg Raczkowsky; Heinz Wörn; Mirko Daniele Comparetti; Elena De Momi
Microsoft Kinect cameras are widely used in robotics. They can be mounted on the robot itself (in the case of mobile robotics) or placed where they have a good view of robots and/or humans. Using cameras in the surgical operating room adds complexity to camera placement and requires coping with a highly uncontrolled environment with occlusions and unknown objects. In this paper we present an approach that accurately detects humans using multiple Kinect cameras. Experiments show that our approach is robust to interference, noise and occlusions, and that it provides a good detection and identification rate of the user, which is crucial for safe human-robot cooperation.
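The multi-camera idea behind this detection approach can be illustrated with a minimal sketch: detections from each Kinect are mapped into a common world frame via known extrinsics and nearby estimates are merged. The function names, the greedy nearest-centroid merge and the 0.3 m radius are illustrative assumptions, not the paper's actual pipeline.

```python
import math

def transform(T, p):
    """Apply a 4x4 homogeneous transform (row-major nested lists) to a 3D point."""
    x, y, z = p
    return tuple(T[i][0] * x + T[i][1] * y + T[i][2] * z + T[i][3] for i in range(3))

def fuse_detections(per_camera, extrinsics, radius=0.3):
    """Map each camera's person detections into the world frame and greedily
    merge detections that lie within `radius` metres of a cluster centroid."""
    world = [transform(T, p)
             for cam, T in zip(per_camera, extrinsics) for p in cam]
    clusters = []
    for p in world:
        for c in clusters:
            centroid = tuple(sum(q[i] for q in c) / len(c) for i in range(3))
            if math.dist(centroid, p) < radius:
                c.append(p)
                break
        else:
            clusters.append([p])
    # One averaged position per detected person
    return [tuple(sum(q[i] for q in c) / len(c) for i in range(3)) for c in clusters]
```

With two cameras reporting slightly different estimates of the same person, the sketch collapses them to a single fused detection; estimates farther apart than the radius remain separate people.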
robotics and biomimetics | 2011
Holger Mönnich; Philip Nicolai; Tim Beyl; Jörg Raczkowsky; Heinz Wörn
This paper introduces the OP:Sense system, which is able to track objects and humans. To this end, a complete surgical robotic system has been built that can be used for telemanipulation as well as for autonomous tasks, e.g. cutting or needle insertion. Two KUKA lightweight robots, which feature seven DOF and allow variable stiffness and damping thanks to an integrated impedance controller, are used as actuators. The system includes two haptic input devices that provide haptic feedback in telemanipulation mode, as well as virtual fixtures to guide the surgeon during telemanipulation. The supervision system consists of a marker-based optical tracking system, Photonic Mixer Device (PMD) cameras and RGB-D cameras (Microsoft Kinect). A simulation environment is constantly updated with the model of the environment, the model of the robots and tracked objects, the occupied space, and tracked models of humans.
Robot Operating System (ROS) - The Complete Reference (Volume 1). Ed.: A. Koubaa | 2016
Andreas Bihlmaier; Tim Beyl; Philip Nicolai; Mirko Kunze; Julien Mintenbeck; Luzie Schreiter; Thorsten Brennecke; Jessica Hutzl; Jörg Raczkowsky; Heinz Wörn
The case study at hand describes our ROS-based setup for robot-assisted (minimally-invasive) surgery. The system includes different perception components (Kinects, Time-of-Flight Cameras, Endoscopic Cameras, Marker-based Trackers, Ultrasound), input devices (Force Dimension Haptic Input Devices), robots (KUKA LWRs, Universal Robots UR5, ViKY Endoscope Holder), surgical instruments and augmented reality displays. Apart from bringing together the individual components in a modular and flexible setup, many subsystems have been developed based on combinations of the single components. These subsystems include a bimanual telemanipulator, multiple Kinect people tracking, knowledge-based endoscope guidance and ultrasound tomography. The platform is not a research project in itself, but a basic infrastructure used for various research projects. We want to show how to build a large robotics platform, in fact a complete lab setup, based on ROS. It is flexible and modular enough to do research on different robotics related questions concurrently. The whole setup is running on ROS Indigo and Ubuntu Trusty (14.04). A repository of already open sourced components is available at https://github.com/KITmedical.
intelligent robots and systems | 2015
Stefan Escaida Navarro; Franz Heger; Felix Putze; Tim Beyl; Tanja Schultz; Björn Hein
In this paper we present and evaluate the design of a novel telemanipulation system that maps proximity values, acquired inside a gripper, to forces the user can feel through a haptic input device. The command console is complemented by input devices that give the user intuitive control over parameters relevant to the system. Furthermore, the proximity sensors enable autonomous alignment/centering of the gripper to objects in user-selected DoFs, with the potential of aiding the user and lowering the workload. We evaluate our approach in a user study, which shows that the telemanipulation system benefits from the supplementary proximity information and that the workload can indeed be reduced when the system operates with partial autonomy.
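The core proximity-to-force mapping can be sketched as a simple repulsive profile rendered on the haptic device. The activation distance and maximum force below are illustrative assumptions, not the values used in the paper.

```python
def proximity_to_force(distance_m, activation_m=0.05, f_max_n=3.0):
    """Render a proximity reading as a repulsive force on the haptic device:
    zero beyond the activation distance, rising linearly to f_max_n at contact."""
    if distance_m >= activation_m:
        return 0.0
    return f_max_n * (1.0 - distance_m / activation_m)
```

A linear ramp like this lets the operator "feel" an object inside the gripper before contact; any monotonic profile (e.g. quadratic) would serve the same purpose.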
International Journal of Advanced Robotic Systems | 2014
Mirko Daniele Comparetti; Elena De Momi; Tim Beyl; Mirko Kunze; Jörg Raczkowsky; Giancarlo Ferrigno
In surgical procedures, robots can accurately position and orient surgical instruments. Intraoperatively, external sensors can localize the instrument and compute the targeting movement of the robot, based on the transformation between the coordinate frame of the robot and the sensor. This paper addresses the assessment of the robustness of an iterative targeting algorithm in perturbed conditions. Numerical simulations and experiments (with a robot with seven degrees of freedom and an optical tracking system) were performed for computing the maximum error of the rotational part of the calibration matrix, which allows for convergence, as well as the number of required iterations. The algorithm converges up to 50 degrees of error within a large working space. The study confirms the clinical relevance of the method because it can be applied on commercially available robots without modifying the internal controller, thus improving the targeting accuracy and meeting surgical accuracy requirements.
robotics and biomimetics | 2011
Philip Nicolai; Tim Beyl; Holger Mönnich; Jörg Raczkowsky; Heinz Wörn
In this video we show the capabilities of the OP:Sense system. OP:Sense is an integrated rapid application development environment for robot-assisted surgery. It mainly targets MIRS and open head neurosurgery, as OP:Sense is developed for the EU projects FP7 SAFROS and FP7 ACTIVE, which address these use cases. Besides the framework, OP:Sense also integrates applications. It is thus not only a framework but also a system that demonstrates how robots can be used for surgical interventions. The core of the system is the ACE TAO framework [1] [2], which implements Real-time CORBA for communication between distributed systems. We built CORBA-based interfaces for use in Matlab and Simulink, as well as modules for 3D Slicer and applications for the control of devices such as robots or surgical tools. As Matlab is a powerful tool for rapid application development, applications can be developed faster than with C++ or similar programming languages. We use Matlab for setting up our environment and for tasks and computations that do not need to run in real time. For real-time tasks such as telemanipulation, we use Simulink models.
Studies in health technology and informatics | 2012
Tim Beyl; Philip Nicolai; Holger Mönnich; Jörg Raczkowsky; Heinz Wörn
In current research, haptic feedback in robot-assisted interventions plays an important role. However, most approaches to haptic feedback only consider mapping the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from preoperatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest and helps prevent damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
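In its simplest form, a "forbidden region" virtual fixture of the kind described reduces to a stiff spring pushing the tool tip back out of a plane fitted to the segmented structure. The plane representation and stiffness value below are illustrative assumptions for a sketch, not OP:Sense's actual haptic renderer.

```python
def virtual_wall_force(pos, normal, offset, stiffness=500.0):
    """Forbidden-region fixture as a stiff spring: for the plane n.x = offset
    (unit normal pointing into the allowed half-space), return a restoring
    force proportional to the penetration depth, or zero outside the wall."""
    penetration = offset - sum(n * p for n, p in zip(normal, pos))
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)
    return tuple(stiffness * penetration * n for n in normal)
```

Rendered on the haptic input device, such a force overlays the real instrument forces, so the surgeon feels a wall at the boundary of the critical structure.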
Studies in health technology and informatics | 2016
Tim Beyl; Luzie Schreiter; Philip Nicolai; Jörg Raczkowsky; Heinz Wörn
3D perception technologies have been explored in various fields. This paper explores the application of such technologies in surgical operating theatres. Clinical applications can be found in workflow detection, tracking and analysis, collision avoidance with medical robots, perception of the interaction between participants of the operation, training of the operating room crew, patient calibration and many more. In this paper a complete perception solution for the operating room is presented. The system is based on the time-of-flight (ToF) technology integrated into the Microsoft Kinect One and implements a multi-camera approach. Special emphasis is placed on the tracking of personnel and on the evaluation of the system's performance and accuracy.
International Journal of Computer Assisted Radiology and Surgery | 2013
Philip Nicolai; Thorsten Brennecke; Mirko Kunze; Luzie Schreiter; Tim Beyl; Yaokun Zhang; Julien Mintenbeck; Jörg Raczkowsky; Heinz Wörn
computer assisted radiology and surgery | 2016
Tim Beyl; Philip Nicolai; Mirko Daniele Comparetti; Jörg Raczkowsky; Elena De Momi; Heinz Wörn