
Publication


Featured research published by Juan A. Corrales.


Human-Robot Interaction | 2008

Hybrid tracking of human operators using IMU/UWB data fusion by a Kalman filter

Juan A. Corrales; Francisco A. Candelas; Fernando Torres

The precise localization of human operators in robotic workplaces is an important requirement for developing human-robot interaction tasks. Human tracking provides not only safety for human operators, but also context information for intelligent human-robot collaboration. This paper evaluates an inertial motion capture system which registers the full-body movements of a user in a robotic manipulator workplace. However, errors in the global translational measurements returned by this system have led to the need for an additional localization system, based on Ultra-WideBand (UWB) technology. A Kalman filter fusion algorithm which combines the measurements of these systems is developed. This algorithm unifies the advantages of both technologies: high data rates from the motion capture system and global translational precision from the UWB localization system. The developed hybrid system not only tracks the movements of all the limbs of the user, as previous motion capture systems do, but is also able to precisely position the user in the environment.
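The fusion idea described above can be sketched as a minimal one-dimensional Kalman filter that integrates high-rate relative displacements from the motion capture system and corrects the accumulated drift whenever a low-rate absolute UWB fix arrives. The noise values, data shapes, and function name below are illustrative assumptions, not the paper's parameters.

```python
def kalman_fuse(mocap_deltas, uwb_fixes, q=0.01, r=0.25):
    """Fuse high-rate relative translation with low-rate absolute fixes.

    mocap_deltas: per-step displacement from the inertial system.
    uwb_fixes: dict {step index: absolute UWB position at that step}.
    q, r: assumed process and measurement noise variances.
    """
    x, p = 0.0, 1.0           # state estimate and its variance
    track = []
    for k, dx in enumerate(mocap_deltas):
        # Predict: integrate the inertial displacement (drift grows).
        x += dx
        p += q
        # Correct: whenever a UWB fix arrives, pull the estimate back.
        if k in uwb_fixes:
            gain = p / (p + r)
            x += gain * (uwb_fixes[k] - x)
            p *= (1.0 - gain)
        track.append(x)
    return track
```

With a biased inertial stream (for example, steps of 0.12 when the true motion is 0.1 per step), the periodic UWB corrections keep the estimate bounded near the true position instead of drifting without limit, which is the complementarity the abstract describes.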


Sensors | 2009

Survey of visual and force/tactile control of robots for physical interaction in Spain

Gabriel J. Garcia; Juan A. Corrales; Jorge Pomares; Fernando Torres

Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and to modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques and applications which have been developed by Spanish researchers in order to implement these mono-sensor and multi-sensor controllers.


Robotics and Autonomous Systems | 2010

Sensor data integration for indoor human tracking

Juan A. Corrales; Francisco A. Candelas; Fernando Torres

A human tracking system based on the integration of measurements from an inertial motion capture system and a UWB (Ultra-Wide Band) location system has been developed. On the one hand, the rotational measurements from the inertial system are used to precisely track all the limbs of the human body. On the other hand, the translational measurements from both systems are combined by three different fusion algorithms (a Kalman filter, a particle filter and a combination of both) in order to obtain a precise global localization of the human in the environment. Several experiments have been performed to compare their accuracy and computational efficiency.


International Journal of Advanced Robotic Systems | 2011

Direct Visual Servoing to Track Trajectories in Human-Robot Cooperation

Jorge Pomares; Juan A. Corrales; Gabriel J. Garcia; Fernando Torres

This paper describes a dynamic image-based control system to guide two coupled robots. The first robot is a Mitsubishi PA-10 robotic manipulator with a second mini-robot with 3 degrees of freedom (DOF) attached at its end-effector. The vision system used for the guidance of both robots is composed of a camera at the end-effector of the mini-robot. The paper presents a new method to guide the mini-robot using dynamic control to track a previously generated image trajectory. The mini-robot performs the tracking in a workspace in cooperation with a human operator. Therefore, the proposed visual control is combined with virtual visual servoing to guarantee safe behaviour.
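Schemes like this build on the classical image-based visual servoing law v = -λ L⁺(s - s*), which maps image-feature error to a camera velocity through the pseudo-inverse of the interaction matrix. The kinematic sketch below illustrates that underlying law only, not the paper's dynamic (torque-level) controller; the depth Z, gain, and feature points are illustrative assumptions.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Standard interaction matrix of a point feature (x, y) in
    normalized image coordinates, at assumed depth Z."""
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x**2), y],
        [0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

def ibvs_velocity(points, desired, Z=1.0, lam=0.5):
    """Compute the 6-DOF camera velocity twist v = -lam * L^+ (s - s*)."""
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in points])
    e = (np.asarray(points, dtype=float)
         - np.asarray(desired, dtype=float)).ravel()
    return -lam * np.linalg.pinv(L) @ e
```

When the current features coincide with the desired ones, the error is zero and the commanded velocity vanishes, which is the convergence condition of the trajectory-tracking loop.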


International Conference on Control, Automation, Robotics and Vision | 2010

Modelling and simulation of a multi-fingered robotic hand for grasping tasks

Juan A. Corrales; Carlos A. Jara; Fernando Torres

This paper develops the kinematic, dynamic and contact models of a three-fingered robotic hand (BarrettHand) in order to obtain a complete description of the system, which is required for manipulation tasks. These models not only take into account the mechanical coupling and the breakaway mechanism of the under-actuated robotic hand, but also obtain the force transmission from the hand to objects, which are represented as triangle meshes. The developed models have been implemented in a software simulator based on the Easy Java Simulations platform. Several experiments have been performed in order to verify the accuracy of the proposed models with respect to the real physical system.


The International Journal of Advanced Manufacturing Technology | 2009

A cooperative robotic system based on multiple sensors to construct metallic structures

Pablo Gil; Jorge Pomares; Santiago T. Puente; Francisco A. Candelas; Gabriel J. Garcia; Juan A. Corrales; Fernando Torres

This paper describes a multisensorial robotic system to automatically construct metallic structures. Two robots must work cooperatively in the same workspace to perform the task. The robots are automatically guided using visual and force sensor information. A new time-independent visual-force control system which guarantees adequate robot behaviour during the construction of the structure is described. During the construction, a human operator works cooperatively with the robots in order to perform some tasks which cannot be carried out automatically by the robots. To do so, a new human-robot cooperation approach is described in order to guarantee human safety. The correct behaviour of the different subsystems proposed in the paper is demonstrated in Section 6 by the construction of a real structure composed of several metallic tubes and different types of pieces to join them.


Sensors | 2011

A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

Jorge Pomares; Iván Perea; Gabriel J. Garcia; Carlos A. Jara; Juan A. Corrales; Fernando Torres

Autonomous manipulation in semi-structured environments where human operators can interact is an increasingly common task in robotic applications. This paper describes an intelligent multi-sensorial approach that solves this issue by providing a multi-robotic platform with a high degree of autonomy and the capability to perform complex tasks. The proposed sensorial system is composed of a hybrid visual servo control to efficiently guide the robot towards the object to be manipulated, an inertial motion capture system and an indoor localization system to avoid possible collisions between human operators and robots working in the same workspace, and a tactile sensor algorithm to correctly manipulate the object. The proposed controller employs the whole multi-sensorial system and combines the measurements of each of the sensors during the two phases of the robot task: a first phase in which the robot approaches the object to be grasped, and a second phase of manipulation of the object. In both phases, the unexpected presence of humans is taken into account. This paper also presents the successful results obtained in several experimental setups which verify the validity of the proposed approach.


Robotics and Autonomous Systems | 2017

Model-based strategy for grasping 3D deformable objects using a multi-fingered robotic hand

Lazher Zaidi; Juan A. Corrales; Belhassen Chedli Bouzgarrou; Youcef Mezouar; Laurent Sabourin

This paper presents a model-based strategy for 3D deformable object grasping using a multi-fingered robotic hand. The developed contact model is based on two force components (normal force and tangential friction force, including slipping and sticking effects) and uses a non-linear mass-spring system to describe the object deformations due to the mechanical load applied by the fingers of the robotic hand. The object-finger interaction is simulated in order to compute the contact forces and deformations required to robustly grasp objects with large deformations. Our approach achieves this by using a non-linear model that outperforms current techniques, which are limited to linear models. Once the contact forces computed by the simulation of the contact model guarantee the equilibrium of the grasp, they are used as set-points for force-controlling the closing of the real fingers, thus implementing the proposed grasping strategy. Two different objects (a cube and a sphere) made from two soft materials (foam and rubber) are tested in order to verify that the proposed model can represent their non-linear deformations and that the proposed grasp strategy can implement a robust grasp of them with a multi-fingered robotic hand equipped with tactile sensors. Thereby, both the grasping strategy and the proposed contact model are validated experimentally.
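The non-linear mass-spring idea can be illustrated with a toy one-dimensional example: a mesh node pushed by a fingertip, restrained by a spring whose restoring force grows faster than linearly with deformation (here a linear plus cubic stiffening term). The coefficients and function names are illustrative assumptions, not the paper's identified material parameters.

```python
def spring_force(d, k1=50.0, k3=4000.0):
    """Non-linear restoring force: linear term plus cubic stiffening."""
    return k1 * d + k3 * d**3

def equilibrium_deformation(f_applied, tol=1e-9):
    """Bisection for the deformation at which the spring force balances
    the applied fingertip force (spring_force is monotonic in d)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if spring_force(mid) < f_applied:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Because of the cubic term, the deformation predicted for a given fingertip force is smaller than the purely linear prediction f/k1, which is exactly the regime (large deformations) where a linear model over-predicts and a non-linear one is needed.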


Archive | 2010

Kalman Filtering for Sensor Fusion in a Human Tracking System

Juan A. Corrales; Francisco A. Candelas; Fernando Torres

Robotic systems need to be context-aware in order to adapt their tasks to the different states of their environment. This context-awareness not only implies the detection of the objects which are near the robot but also includes the tracking of the people who collaborate with it. Thus, human-robot interaction tasks become more natural and unobtrusive because robots are able to change their behaviour depending on this context information. In industrial environments, these context-aware systems should also guarantee the safety of human operators who interact with industrial robots. Therefore, a precise localization of all the limbs of the operator's body has to be determined. In this chapter, the use of an inertial motion capture system for tracking the full-body movements of the operator is described. It is composed of 18 IMUs (Inertial Measurement Units) attached to the body of the operator which determine the rotation angle of each joint. It has several advantages over other motion capture technologies: easy installation, self-containment, occlusion-free operation and precise rotational measurements. However, it accumulates a small error (drift) in the estimation of the global translation of the human operator in the environment which becomes considerable after several movements of the operator. Therefore, an additional location system based on UWB (Ultra-Wide Band) signals has been added to correct this drift accumulation. The features of both tracking systems are complementary. The inertial motion capture system registers accurate joint rotation angles at a high rate, while the UWB location system estimates the global translation in the environment at a low rate. The combination of these systems reduces the drawbacks of each one with the advantages of the other. On the one hand, the global translation measurements of the UWB system correct the accumulated drift of the motion capture system.
On the other hand, the high-rate measurements of the motion capture system fill in the periods of time when there are no measurements from the UWB system. Firstly, a simple fusion algorithm of both tracking systems is presented. This first fusion algorithm transforms measurements from the two systems into the same coordinate system by recalculating the transformation matrix each time a new measurement from the UWB system is received. This approach relies heavily on the accuracy of the measurements from the UWB system because the transformation-matrix recalculation assumes that the last UWB measurement is completely correct. Thus, errors in UWB measurements are not considered and only the translational errors of the motion capture system are corrected. Furthermore,
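The simple fusion described above, reduced to one dimension, amounts to re-anchoring the motion-capture track so that it passes exactly through each incoming UWB fix, then carrying that offset forward until the next fix. The function and variable names are illustrative; this is a sketch of the idea, not the chapter's implementation.

```python
def reanchor_track(mocap_positions, uwb_fixes):
    """Re-anchor a high-rate track on low-rate absolute fixes.

    mocap_positions: high-rate positions from the motion capture system.
    uwb_fixes: dict {sample index: absolute UWB position}.
    Each UWB fix is trusted completely: the offset is recomputed so the
    corrected track passes through it (no account of UWB error).
    """
    offset, out = 0.0, []
    for k, p in enumerate(mocap_positions):
        if k in uwb_fixes:
            offset = uwb_fixes[k] - p   # assume the last fix is exact
        out.append(p + offset)
    return out
```

This makes the limitation stated above concrete: because every fix overwrites the offset, any noise in a UWB measurement propagates unchanged into all subsequent samples, which is what motivates the Kalman-filter fusion that weighs both error sources.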


International Conference on Computer Vision | 2011

Visual control of a multi-robot coupled system: Application to collision avoidance in human-robot interaction

Jorge Pomares; Gabriel J. Garcia; Iván Perea; Juan A. Corrales; Carlos A. Jara; Fernando Torres

The use of visual systems to guide robots in manipulation tasks fails when the grasping tool is close to the target, mainly due to the occlusions produced by the robot tool or by the object geometry. Moreover, the robot controller must take into account the presence of human operators within the workspace who can interact in the robot manipulation task. The scheme proposed in this paper solves this problem by using a mini-robot to locate the camera so that it observes the grasping scene without occlusions. This mini-robot is coupled at the end of a Mitsubishi PA10 manipulator (the main robot). The new technical solution proposed for this scheme is composed of two parts. On the one hand, a direct hybrid visual servoing approach to control the position of the mini-robot is proposed. The hybrid behaviour of the controller permits the mini-robot to share its workspace with human operators. On the other hand, a control scheme based on a virtual camera is implemented to guide the main robot with the images acquired by the camera located at the mini-robot. Both approaches, together with the construction of the mini-robot, are described throughout this paper.

Collaboration


Dive into Juan A. Corrales's collaborations.

Top Co-Authors

Pablo Gil

University of Alicante

Youcef Mezouar

Centre national de la recherche scientifique

Iván Perea

University of Alicante

A. Delgado

University of Alicante
