
Publication


Featured research published by Ruben Smits.


The International Journal of Robotics Research | 2007

Constraint-based Task Specification and Estimation for Sensor-Based Robot Systems in the Presence of Geometric Uncertainty

Joris De Schutter; Tinne De Laet; Johan Rutgeerts; Wilm Decré; Ruben Smits; Erwin Aertbeliën; Kasper Claes; Herman Bruyninckx

This paper introduces a systematic constraint-based approach to specify complex tasks of general sensor-based robot systems consisting of rigid links and joints. The approach integrates both instantaneous task specification and estimation of geometric uncertainty in a unified framework. Major components are the use of feature coordinates, defined with respect to object and feature frames, which facilitate the task specification, and the introduction of uncertainty coordinates to model geometric uncertainty. While the focus of the paper is on task specification, an existing velocity-based control scheme is reformulated in terms of these feature and uncertainty coordinates. This control scheme compensates for the effect of time-varying uncertainty coordinates. Constraint weighting results in an invariant robot behavior in the case of conflicting constraints with heterogeneous units. The approach applies to a large variety of robot systems (mobile robots, multiple robot systems, dynamic human-robot interaction, etc.), various sensor systems, and different robot tasks. Ample simulation and experimental results are presented.
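
As a reading aid, here is a minimal numerical sketch (not the paper's implementation) of how a weighted least-squares resolution of conflicting constraints can produce joint velocities; the function name and the toy Jacobian are illustrative only.

```python
import numpy as np

# Hypothetical sketch: given a task Jacobian J mapping joint velocities to
# constraint-space velocities and desired constraint velocities yd, solve
# min_qd || W^(1/2) (J qd - yd) ||^2. The weighting matrix W is what makes
# the compromise between constraints with heterogeneous units well defined.

def resolve_constraints(J, yd, weights):
    """Weighted least-squares joint velocities for possibly conflicting constraints."""
    W_sqrt = np.diag(np.sqrt(weights))
    # Solve the weighted problem with a pseudoinverse (damping omitted for brevity).
    qd, *_ = np.linalg.lstsq(W_sqrt @ J, W_sqrt @ yd, rcond=None)
    return qd

# Two conflicting 1-D constraints on a 1-DOF system: one asks for +1, one for -1.
J = np.array([[1.0], [1.0]])
yd = np.array([1.0, -1.0])
print(resolve_constraints(J, yd, weights=[1.0, 1.0]))   # ~0: equal compromise
print(resolve_constraints(J, yd, weights=[10.0, 1.0]))  # biased toward the first constraint
```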


International Conference on Robotics and Automation | 2009

Extending iTaSC to support inequality constraints and non-instantaneous task specification

Wilm Decré; Ruben Smits; Herman Bruyninckx; Joris De Schutter

In [1], we presented our constraint-based programming approach, iTaSC, which formulates instantaneous sensor-based robot tasks as constraint sets, and subsequently solves a corresponding least-squares problem to obtain control set points, such as desired joint velocities or joint torques. This paper further extends this approach, (i) by explicitly supporting the inclusion of inequality constraints in the task and (ii) by supporting a broader class of objective functions for translating the task constraints into robot motion. These extensions are made while retaining a tractable mathematical problem structure (a convex program). Furthermore, first results on extending the approach to non-instantaneous tasks are presented. As illustrated in the paper, the power of the approach lies (i) in its versatility to specify a wide range of robot behaviors and the ease of making task adjustments, and (ii) in its generic nature, which permits using systematic procedures to derive the underlying control equations.
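
To illustrate the inequality extension, here is a minimal sketch (not the authors' solver) of adding an inequality constraint to the least-squares problem while keeping it a convex program; it uses SciPy's general-purpose SLSQP solver, and all matrices are illustrative toy values.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical sketch: least-squares tracking of a desired constraint velocity,
# extended with an inequality constraint (e.g. a bound on one joint velocity),
# posed as a small convex program over joint velocities qd.

J = np.array([[1.0, 0.5]])        # task: J qd should track yd
yd = np.array([1.0])
G = np.array([[0.0, 1.0]])        # inequality: G qd <= h
h = np.array([0.2])

objective = lambda qd: 0.5 * np.sum((J @ qd - yd) ** 2)
constraints = [{"type": "ineq", "fun": lambda qd: h - G @ qd}]  # h - G qd >= 0

res = minimize(objective, x0=np.zeros(2), constraints=constraints, method="SLSQP")
print(res.x)  # joint velocities satisfying G qd <= h while tracking yd
```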


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2008

iTASC: a tool for multi-sensor integration in robot manipulation

Ruben Smits; Tinne De Laet; Kasper Claes; Herman Bruyninckx; Joris De Schutter

iTASC (acronym for "instantaneous task specification and control") by J. De Schutter (2007) is a systematic constraint-based approach to specify complex tasks of general sensor-based robot systems. iTASC integrates both instantaneous task specification and estimation of geometric uncertainty in a unified framework. Automatic derivation of controller and estimator equations follows from a geometric task model that is obtained using a systematic task modeling procedure. The approach applies to a large variety of robot systems (mobile robots, multiple robot systems, dynamic human-robot interaction, etc.), various sensor systems, and different robot tasks. Using an example task, this paper shows that iTASC is a powerful tool for multi-sensor integration in robot manipulation. The example task includes multiple sensors: encoders, a force sensor, cameras, a laser distance sensor and a laser scanner. The paper details the systematic modeling procedure for the example task and elaborates on the task-specific choice of two types of task coordinates: feature coordinates, defined with respect to object and feature frames, which facilitate the task specification, and uncertainty coordinates to model geometric uncertainty. Experimental results for the example task are presented.
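
As an illustration of feature coordinates, here is a small sketch (assumed, not from the paper) that extracts the relative translation between two feature frames given their world poses as homogeneous transforms:

```python
import numpy as np

# Hypothetical sketch of feature coordinates: given the poses of a feature
# frame on each of two objects (as 4x4 homogeneous transforms in a common
# world frame), the relative transform between the feature frames yields the
# coordinates in which the task is naturally specified (here: translation).

def homogeneous(R, p):
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

def feature_coordinates(T_w_f1, T_w_f2):
    """Translation of feature frame f2 expressed in feature frame f1."""
    T_f1_f2 = np.linalg.inv(T_w_f1) @ T_w_f2
    return T_f1_f2[:3, 3]

T_w_f1 = homogeneous(np.eye(3), np.array([0.0, 0.0, 0.0]))  # e.g. a tool tip
T_w_f2 = homogeneous(np.eye(3), np.array([0.1, 0.0, 0.3]))  # e.g. a surface point
print(feature_coordinates(T_w_f1, T_w_f2))  # task constraints can act on these
```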


IEEE Robotics & Automation Magazine | 2013

Geometric Relations Between Rigid Bodies (Part 1): Semantics for Standardization

Tinne De Laet; Steven Bellens; Ruben Smits; Erwin Aertbeliën; Herman Bruyninckx; Joris De Schutter

This tutorial explicitly states the semantics of all coordinate-invariant properties and operations, and, more importantly, all the choices that are made in coordinate representations of these geometric relations. This results in a set of concrete suggestions for standardizing terminology and notation, allowing programmers to write fully unambiguous software interfaces, including automatic checks for semantic correctness of all geometric operations on rigid-body coordinate representations. A concrete proposal for community-driven standardization via the Robot Engineering Task Force [4] is accepted as a Robotics Request for Comment.
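
A minimal sketch of the semantic-checking idea, assuming a hypothetical Position type; the fields and the check are illustrative, not the tutorial's standardized interface:

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical sketch of semantic checking: a position carries not only
# coordinates but also its semantics (target, reference, coordinate frame),
# so that ill-posed compositions are rejected at run time.

@dataclass(frozen=True)
class Position:
    target: str        # point whose position this is
    reference: str     # point it is defined with respect to
    frame: str         # coordinate frame the numbers are expressed in
    coords: np.ndarray

def compose(p_ab: Position, p_bc: Position) -> Position:
    """Position of c w.r.t. a, valid only if references chain and frames match."""
    if p_ab.target != p_bc.reference or p_ab.frame != p_bc.frame:
        raise ValueError("semantically invalid composition")
    return Position(p_bc.target, p_ab.reference, p_ab.frame,
                    p_ab.coords + p_bc.coords)

p1 = Position("b", "a", "w", np.array([1.0, 0.0, 0.0]))
p2 = Position("c", "b", "w", np.array([0.0, 2.0, 0.0]))
print(compose(p1, p2).coords)  # position of c w.r.t. a, expressed in w
```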


International Conference on Robotics and Automation | 2011

Composition of complex robot applications via data flow integration

Ruben Smits; Herman Bruyninckx

Modern and future robotic applications will have to integrate many software components. Inevitably, the components will be provided by different vendors, will not have been designed together, and in many cases will not have been implemented from the same source code repository. This paper presents a discussion on how to deal with this multi-component, multi-vendor situation. This includes the least favourable case, in which most or all of the components are "legacy", in the sense that they have to be integrated as binary pieces of code, without the possibility of changing the components for the sole purpose of facilitating system-level integration. The presented approach is illustrated by an application that integrates (i) two KUKA Light-Weight Robot FRI control components, (ii) a set of ROS components to control a Willow Garage PR2 robot, (iii) a Blender component for on-line visualization, and (iv) a set of Orocos/RTT components that take care of the data flow communication and some simple coordination between all components.
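
A toy sketch of the port-based data flow pattern the paper builds on; this is deliberately generic Python, not the Orocos/RTT API, and all class and port names are invented for illustration:

```python
# Hypothetical sketch (not the Orocos/RTT API): components exchange data only
# through named ports, and a separate integration layer wires the ports
# together, so components need no knowledge of each other or of the transport.

class Component:
    def __init__(self, name):
        self.name = name
        self.out_connections = {}   # output port name -> list of (component, port)

    def connect(self, out_port, other, in_port):
        self.out_connections.setdefault(out_port, []).append((other, in_port))

    def write(self, out_port, data):
        for comp, port in self.out_connections.get(out_port, []):
            comp.on_data(port, data)

    def on_data(self, in_port, data):  # overridden by concrete components
        print(f"{self.name} received {data!r} on {in_port}")

# Wiring is done by the integrator, not inside the components themselves.
controller, visualizer = Component("controller"), Component("visualizer")
controller.connect("joint_state", visualizer, "pose_in")
controller.write("joint_state", [0.1, 0.2, 0.3])
```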


Intelligent Robots and Systems | 2011

Reusable hybrid force-velocity controlled motion specifications with executable Domain Specific Languages

Markus Klotzbücher; Ruben Smits; Herman Bruyninckx; Joris De Schutter

Most of today's robotic task descriptions are designed for a single software and hardware platform and thus cannot be reused without modifications. This work follows the meta-model approach of Model Driven Engineering (MDE) to introduce the concepts of Domain Specific Languages (DSL) and of Model Transformations to the domain of hybrid force-velocity controlled robot tasks, as expressed in (i) the Task Frame Formalism (TFF), and (ii) a Statechart model representing the discrete coordination between TFF tasks. The result is a representation in MDE's M0, M1, M2, and M3 form, with increasingly robot- and software-independent representations that remain instantaneously executable, except obviously for the M3 meta-metamodel. The Platform Specific Model information can be added in three steps: (i) the type of the hybrid force-velocity controlled task, (ii) the hardware properties of the robot, tool, and sensor, and (iii) the software properties of the applied execution framework. We demonstrate the presented approach by means of an alignment task executed on a Willow Garage PR2 and a KUKA Light Weight Robot (LWR) arm.
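
A minimal sketch of the model-transformation idea, with a plain dict standing in for a DSL model; the field names, gains, and robot identifiers are illustrative assumptions, not the paper's metamodels:

```python
# Hypothetical sketch of the model-driven idea: a platform-independent task
# model (here a dict standing in for a DSL model) is lowered to a
# platform-specific one by a model transformation that injects robot- and
# framework-specific details. Names and values are illustrative only.

tff_task = {  # platform-independent Task Frame specification
    "x": {"mode": "velocity", "setpoint": 0.02},   # m/s along x
    "z": {"mode": "force",    "setpoint": -10.0},  # N pushing down along z
}

def to_platform(task, robot):
    """Model transformation: add platform-specific controller parameters."""
    gains = {"pr2": 0.5, "kuka_lwr": 1.2}          # illustrative values
    return {axis: {**spec, "gain": gains[robot]} for axis, spec in task.items()}

print(to_platform(tff_task, "kuka_lwr"))  # executable, robot-specific model
```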


Archive | 2008

Image-Based Visual Servoing with Extra Task Related Constraints in a General Framework for Sensor-Based Robot Systems

Ruben Smits; Duccio Fioravanti; Tinne De Laet; Benedetto Allotta; Herman Bruyninckx; Joris De Schutter

Robotic tasks of limited complexity, such as simple positioning tasks, trajectory following or pick-and-place applications in well-structured environments, are straightforward to program. For these kinds of tasks extensive programming support is available, as the specification primitives for these tasks are present in current commercial robot control software. While these robot capabilities already fulfill some industrial needs, research focuses on the specification and execution of much more complex tasks. The goal of our recent research is to open up new robot applications in industrial as well as domestic and service environments. Examples of complex tasks include sensor-based navigation, like visual servoing, and 3D manipulation in partially or completely unknown environments, using redundant robotic systems such as mobile manipulator arms, cooperating robots, robotic hands or humanoid robots, and using multiple sensors such as vision, force, torque, tactile and distance sensors. Little programming support is available for these kinds of tasks. As a result, the task programmer has to rely on extensive knowledge in multiple fields such as image processing, spatial kinematics, 3D modeling of objects, geometric uncertainty and sensor systems, dynamics and control, estimation, as well as resolution of redundancy and of conflicting constraints. Our recent research aims to fill this gap: we want to develop programming support for the implementation of complex, sensor-based robotic tasks in the presence of geometric uncertainty. The foundation for this programming support is a general constraint-based framework for sensor-based robot systems.
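
For context, the classical image-based visual servoing law that such frameworks generalize can be sketched as v = -lambda * L^+ (s - s*); the interaction matrix below is illustrative, not computed from a real camera model:

```python
import numpy as np

# A minimal sketch of the classical image-based visual servoing law
# v = -lambda * L^+ (s - s*), where L is the interaction matrix relating
# the camera twist to image-feature velocities.

def ibvs_velocity(L, s, s_star, lam=0.5):
    """Camera twist that drives image features s toward the goal s*."""
    return -lam * np.linalg.pinv(L) @ (s - s_star)

L = np.array([[-1.0, 0.0, 0.1, 0.0, -1.1, 0.2],    # illustrative interaction
              [0.0, -1.0, 0.2, 1.1, 0.0, -0.1]])   # matrix for one 2-D point
s, s_star = np.array([0.3, -0.2]), np.array([0.0, 0.0])
print(ibvs_velocity(L, s, s_star))  # 6-D camera twist (v, omega)
```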


Intelligent Robots and Systems | 2011

Haptic coupling with augmented feedback between two KUKA Light-Weight Robots and the PR2 robot arms

Koen Buys; Steven Bellens; Wilm Decré; Ruben Smits; Enea Scioni; Tinne De Laet; Joris De Schutter; Herman Bruyninckx

This paper discusses the theoretical background and practical implementation of a large-scale, low-performance haptic remote control setup. The experimental system consists of a pair of KUKA Light Weight Robots (LWR) coupled to a Willow Garage Personal Robot (PR2) via two different robotic frameworks. The haptic "performance" is, of course, not comparable to dedicated haptic applications, but has its use as a test-bed for interaction between "legacy" service robot systems that have not been especially designed for mutual haptic interaction. We discuss some major application problems, and the future work needed for nonuniform robot coupling. Besides haptic coupling, we provide the human operator with visual feedback. To this end, the head movements of the human operator are coupled to the head movement of the PR2, and the images of the eye cameras are displayed to the human operator using a wearable display. The presented teleoperation application is furthermore an example of the integration of two component-based robotic frameworks, namely Orocos (Open Robot Control Software) and ROS (Robot Operating System). Experimental results regarding the haptic coupling are presented using an "artistic" painting task for qualitative results, and a hard contact at the slave side for quantitative results.
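
A minimal sketch of one common way to realize such a coupling, a virtual spring-damper between the two arms' poses; the gains and the 1-D example are illustrative assumptions, not the paper's controller:

```python
import numpy as np

# Hypothetical sketch of a simple position-position haptic coupling: each arm
# is pulled toward the other's pose by a virtual spring-damper, one common
# way to couple two position-controlled robots (gains illustrative).

def coupling_forces(x_master, x_slave, v_master, v_slave, k=100.0, d=5.0):
    """Equal and opposite virtual spring-damper forces on master and slave."""
    f = k * (x_slave - x_master) + d * (v_slave - v_master)
    return f, -f  # force on master, force on slave

f_m, f_s = coupling_forces(np.array([0.0]), np.array([0.05]),
                           np.array([0.0]), np.array([0.0]))
print(f_m, f_s)  # master is pulled toward the slave, and vice versa
```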


Automatisierungstechnik | 2012

Constraint-Based Task Specification and Control for Visual Servoing Application Scenarios

Tinne De Laet; Ruben Smits; Herman Bruyninckx; Joris De Schutter

This paper reformulates image-based visual servoing as a constraint-based robot task, in order to integrate it seamlessly with other task constraints in image space, in Cartesian space, in the joint space of the robot, or in the "image space" of any other sensor (e.g. force, distance). This approach allows us to fuse various kinds of sensor data. The integration takes place via the specification of generic "feature coordinates", defined in the different task spaces. Independent control loops are defined to control the individual feature coordinate setpoints in each of these task spaces. The outputs of the control loops are instantaneously combined into joint velocity setpoints for a velocity-controlled robot that executes the task. The paper includes experimental results for different application scenarios.
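
A small sketch (assumed, not the paper's implementation) of the fusion idea: each task space runs an independent proportional loop, and the stacked outputs are mapped to a single joint-velocity setpoint by least squares; all matrices are toy values.

```python
import numpy as np

# Hypothetical sketch: independent proportional loops run in each task space
# (image space, Cartesian space, ...), and their outputs are stacked and
# mapped to one joint-velocity setpoint via least squares.

def fused_joint_velocities(spaces):
    """spaces: list of (J, s, s_star, gain) tuples, one per task space."""
    J_all = np.vstack([J for J, *_ in spaces])
    yd_all = np.hstack([-gain * (s - s_star) for J, s, s_star, gain in spaces])
    qd, *_ = np.linalg.lstsq(J_all, yd_all, rcond=None)
    return qd

image_space = (np.array([[1.0, 0.0], [0.0, 1.0]]), np.array([0.2, -0.1]),
               np.zeros(2), 1.0)
cartesian = (np.array([[0.5, 0.5]]), np.array([0.3]), np.array([0.0]), 0.5)
print(fused_joint_velocities([image_space, cartesian]))  # one qd for all loops
```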


International Symposium on Experimental Robotics | 2009

Adaptive Full Scan Model for Range Finders in Dynamic Environments

Tinne De Laet; Ruben Smits; Joris De Schutter; Herman Bruyninckx

Sensor models directly influence the efficiency and robustness of the estimation processes used in robot and object localization. This paper focuses on a probabilistic range finder sensor model for dynamic environments. The dynamic nature results from the presence of unmodeled and possibly moving objects and people. The goal of this paper is twofold. First, we present experiments to validate the Rigorously Bayesian Beam Model (RBBM), a new model we proposed in a previous paper. Second, we propose a sample-based full scan model that improves on the state of the art. In contrast to existing Gaussian-based full scan models, the proposed model is able to handle the multi-modality of range finder data, which is shown here to occur even in simple static environments.
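
For contrast with the RBBM, the classical mixture beam model (hit, short, max, and random components) can be sketched as follows; this is the textbook model, not the RBBM, and the mixture weights are illustrative:

```python
import numpy as np

# A minimal sketch of a classical mixture "beam model" to illustrate the
# multi-modality of range measurements: a Gaussian around the expected range,
# an exponential for unexpected nearby objects, a spike at max range, and a
# uniform clutter term. Weights w are illustrative, not fitted values.

def beam_likelihood(z, z_expected, z_max=10.0, sigma=0.1, lam=0.5,
                    w=(0.7, 0.1, 0.1, 0.1)):
    p_hit = np.exp(-0.5 * ((z - z_expected) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    p_short = lam * np.exp(-lam * z) if z < z_expected else 0.0   # unexpected object
    p_max = 1.0 if np.isclose(z, z_max) else 0.0                  # max-range return
    p_rand = 1.0 / z_max if 0.0 <= z < z_max else 0.0             # random clutter
    return w[0] * p_hit + w[1] * p_short + w[2] * p_max + w[3] * p_rand

for z in (2.0, 4.0, 10.0):  # expected obstacle at 4 m
    print(z, beam_likelihood(z, z_expected=4.0))
```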

Collaboration


Dive into Ruben Smits's collaborations.

Top Co-Authors

Herman Bruyninckx (Katholieke Universiteit Leuven)
Joris De Schutter (Katholieke Universiteit Leuven)
Tinne De Laet (Katholieke Universiteit Leuven)
Koen Buys (Katholieke Universiteit Leuven)
Wilm Decré (Katholieke Universiteit Leuven)
Steven Bellens (Katholieke Universiteit Leuven)
Markus Klotzbücher (Katholieke Universiteit Leuven)
Dominick Vanthienen (Katholieke Universiteit Leuven)
Erwin Aertbeliën (Katholieke Universiteit Leuven)
Kasper Claes (Katholieke Universiteit Leuven)