Publications


Featured research published by Valerio Ortenzi.


IEEE-RAS International Conference on Humanoid Robots | 2014

An experimental study of robot control during environmental contacts based on projected operational space dynamics

Valerio Ortenzi; Maxime Adjigble; Jeffrey A. Kuo; Rustam Stolkin; Michael Mistry

This paper addresses the problem of constrained motion for a manipulator performing a task while in contact with the environment, and proposes a solution based on projected operational space dynamics. The main advantages of this control technique are: 1) it exploits the environment contact constraint itself, so as to minimise the joint torques needed to perform the task; 2) it enables full decoupling of motion and force control; 3) force feedback from a force sensor mounted at the end effector or other contact points is not needed. This work is a step towards a robot control strategy which mimics the human behaviour of exploiting contacts with the environment to help perform tasks. We present an experimental implementation of the control method in which a KUKA LWR IV manipulator uses an eraser to wipe a whiteboard, and we show that this controller can effectively exploit contact with the whiteboard in order to reduce joint torques while still performing the desired wiping motion.
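
As a rough illustration of the projection idea behind this controller, the sketch below (placeholder values, not the authors' implementation) filters the task torques through the null-space projector of an assumed contact Jacobian; because the projector is orthogonal, the commanded torques can only shrink in magnitude.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): project joint torques into the
# null space of an assumed contact Jacobian J_c, so the constraint itself carries
# part of the load and the commanded torque magnitude is never increased.

def nullspace_projector(J_c):
    """P = I - pinv(J_c) @ J_c projects onto motions consistent with the contact."""
    n = J_c.shape[1]
    return np.eye(n) - np.linalg.pinv(J_c) @ J_c

# Hypothetical 6-DOF arm with a single rigid contact constraining one direction.
J_c = np.random.randn(1, 6)          # contact Jacobian (placeholder values)
tau_task = np.random.randn(6)        # torques computed for the wiping motion
P = nullspace_projector(J_c)
tau_cmd = P @ tau_task               # constraint-consistent torques actually sent
print(np.linalg.norm(tau_cmd) <= np.linalg.norm(tau_task) + 1e-9)  # always True
```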


IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) | 2016

Vision-guided state estimation and control of robotic manipulators which lack proprioceptive sensors

Valerio Ortenzi; Naresh Marturi; Rustam Stolkin; Jeffrey A. Kuo; Michael Mistry

This paper presents a vision-based approach for estimating the configuration of, and providing control signals for, an under-sensored robot manipulator using a single monocular camera. Some remote manipulators, used for decommissioning tasks in the nuclear industry, lack proprioceptive sensors because electronics are vulnerable to radiation. Additionally, even if proprioceptive joint sensors could be retrofitted, such heavy-duty manipulators are often deployed on mobile vehicle platforms, which are significantly and erratically perturbed when powerful hydraulic drilling or cutting tools are deployed at the end-effector. In these scenarios, it would be beneficial to use external sensory information, e.g. vision, for estimating the robot configuration with respect to the scene or task. Conventional visual servoing methods typically rely on joint encoder values for controlling the robot. In contrast, our framework assumes that no joint encoders are available, and estimates the robot configuration by visually tracking several parts of the robot, and then enforcing equality between a set of transformation matrices which relate the frames of the camera, world and tracked robot parts. To accomplish this, we propose two alternative methods based on optimisation. We evaluate the performance of our developed framework by visually tracking the pose of a conventional robot arm, where the joint encoders are used to provide ground-truth for evaluating the precision of the vision system. Additionally, we evaluate the precision with which visual feedback can be used to control the robot's end-effector to follow a desired trajectory.
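
To make the encoder-free estimation idea concrete, the toy sketch below assumes a planar two-link arm with hypothetical link lengths (the paper works with full rigid-body transforms and real visual tracking): joint angles are recovered by minimising the mismatch between forward kinematics and "tracked" positions of parts of the robot.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy sketch under simplifying assumptions: recover joint angles without encoders
# by making forward kinematics agree with visually tracked points on the robot.

L1, L2 = 0.5, 0.4  # hypothetical link lengths [m]

def fk_points(q):
    """Positions of the elbow and end-effector for joint angles q = (q1, q2)."""
    q1, q2 = q
    elbow = np.array([L1 * np.cos(q1), L1 * np.sin(q1)])
    ee = elbow + np.array([L2 * np.cos(q1 + q2), L2 * np.sin(q1 + q2)])
    return np.hstack([elbow, ee])

q_true = np.array([0.7, -0.3])                             # ground-truth configuration
observed = fk_points(q_true) + 0.002 * np.random.randn(4)  # "tracked" points + noise

# Estimate the configuration by minimising the prediction/observation mismatch.
sol = least_squares(lambda q: fk_points(q) - observed, x0=np.zeros(2))
print(sol.x, q_true)                                       # estimate vs. ground truth
```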


IEEE-RAS International Conference on Humanoid Robots | 2016

Kinematics-based estimation of contact constraints using only proprioception

Valerio Ortenzi; Hsiu-Chin Lin; Morteza Azad; Rustam Stolkin; Jeffrey A. Kuo; Michael Mistry

Robots are increasingly being required to perform tasks which involve contacts with the environment. This paper addresses the problem of estimating environmental constraints on the robot's motion. We present a method which estimates such constraints by computing the null space of a set of velocity vectors which differ from the commanded velocities during contact. We further extend this method to handle unilateral constraints, for example when the robot touches a rigid surface. Unlike previous work, our method is based on kinematic analysis using only proprioceptive joint encoders, so there is no need for expensive force-torque sensors, tactile sensors at the contact points, or vision. We first show results of experiments with a simulated robot in a variety of situations, and we analyse the effect of various levels of observation noise on the resulting contact estimates. Finally, we evaluate the performance of our method on two sets of experiments using a KUKA LWR IV manipulator, tasked with exploring and estimating the constraints caused by a horizontal surface and an inclined surface.
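
A rough sketch of the idea, under the simplifying assumption of a single rigid planar surface and synthetic velocity data: while the end-effector slides on the surface, the stacked measured velocities have a near-zero singular value along the constrained direction, so an SVD recovers the surface normal from kinematic data alone.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's exact algorithm): velocities
# measured while sliding on a rigid plane lie in that plane, so the direction with
# (near-)zero singular value estimates the constraint normal -- no force sensing.

true_normal = np.array([0.0, 0.0, 1.0])            # horizontal surface
P = np.eye(3) - np.outer(true_normal, true_normal) # projector onto the surface

# Simulated proprioceptive data: commanded velocities flattened onto the surface,
# plus a little encoder noise.
V = np.random.randn(50, 3) @ P.T + 1e-3 * np.random.randn(50, 3)

_, s, Vt = np.linalg.svd(V, full_matrices=False)
normal_est = Vt[-1]                                # direction of smallest singular value
print(s, abs(normal_est @ true_normal))            # |dot product| close to 1
```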


IEEE International Conference on Rehabilitation Robotics | 2015

Ultrasound imaging for hand prosthesis control: a comparative study of features and classification methods

Valerio Ortenzi; Sergio Tarantino; Claudio Castellini; Christian Cipriani

Controlling a robotic rehabilitation artefact such as a hand prosthesis remains a rather open problem. In particular, the choice of a human-machine interface (HMI) to enable natural control is still debated. The traditional choice, i.e. surface electromyography (sEMG), suffers from a number of problems (electrode displacement, sweat, fatigue) which cannot be easily solved. One of its main drawbacks is the inherently low spatial resolution, at least in the standard settings. To overcome this hindrance, several novel HMIs have been proposed to substitute or augment sEMG; among them, pressure and tactile sensing, and ultrasound imaging (US). In this paper we propose an advancement towards the use of US as an HMI for hand prosthetics; namely, we compare traditional US image features with Histograms of Oriented Gradients used as input for three classifiers, and show that a high number of hand configurations and grasping force levels can be classified well above chance level by choosing the right combination of features and classifier. In an experiment involving three intact human subjects, a classification accuracy of 80% was obtained; when classifying three different levels of grip force for four grasps, the performance dropped to 60%. These results confirm the usability of US imaging as an HMI for hand prosthetics, and pave the way for its practical use as a means of natural prosthetic control.
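
The feature-plus-classifier pipeline compared in the paper can be sketched as follows, with synthetic images standing in for ultrasound frames and a linear SVM as one plausible classifier choice (both are assumptions made purely for illustration):

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

# Sketch of a HOG-features-plus-classifier pipeline; the images below are synthetic
# stand-ins for ultrasound frames, not real data.

rng = np.random.default_rng(0)

def fake_frame(label, size=64):
    """Synthetic stand-in for an ultrasound frame of one hand configuration."""
    img = rng.normal(0.5, 0.1, (size, size))
    img[:, label * 10:label * 10 + 8] += 0.4      # class-dependent bright band
    return np.clip(img, 0.0, 1.0)

X, y = [], []
for label in range(4):                            # four hypothetical hand configurations
    for _ in range(30):
        feat = hog(fake_frame(label), orientations=9,
                   pixels_per_cell=(16, 16), cells_per_block=(2, 2))
        X.append(feat)
        y.append(label)

clf = SVC(kernel="linear").fit(X[::2], y[::2])    # train on half the frames
print(clf.score(X[1::2], y[1::2]))                # accuracy on the held-out half
```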


IEEE-RAS International Conference on Humanoid Robots | 2016

Model estimation and control of compliant contact normal force

Morteza Azad; Valerio Ortenzi; Hsiu-Chin Lin; Elmar Rueckert; Michael Mistry

This paper proposes a method to realize desired contact normal forces between humanoids and their compliant environment. Using contact models, desired contact forces are converted into desired deformations of compliant surfaces. To achieve the desired forces, these deformations are regulated by controlling the contact point positions. The parameters of the contact models are assumed to be known or are estimated using the approach described in this paper. The proposed methods for estimating the contact parameters and controlling the contact normal force are implemented on a KUKA LWR IV arm. To verify both methods, experiments are performed with the KUKA arm while its end-effector is in contact with two different soft objects.
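
The core idea can be sketched with an assumed power-law contact model (the models identified in the paper may differ): the desired normal force is inverted into a desired surface deformation, which then becomes a position setpoint for the contact point.

```python
# Sketch with an assumed power-law contact model f = k * d**p and placeholder
# parameters; invert it to turn a desired force into a desired deformation.

k, p = 800.0, 1.5            # hypothetical stiffness and exponent from model estimation

def force_from_deformation(d):
    """Normal force produced by deformation d of the compliant surface."""
    return k * max(d, 0.0) ** p

def deformation_for_force(f_des):
    """Invert the contact model: the depth that yields the desired normal force."""
    return (f_des / k) ** (1.0 / p)

surface_height = 0.30        # undeformed surface position [m], assumed known
f_des = 5.0                  # desired contact normal force [N]
d_des = deformation_for_force(f_des)
z_cmd = surface_height - d_des               # position setpoint for the contact point
print(z_cmd, force_from_deformation(d_des))  # the model recovers ~5.0 N at this depth
```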


Advanced Robotics | 2017

Hybrid motion/force control: a review

Valerio Ortenzi; Rustam Stolkin; Jeffrey A. Kuo; Michael Mistry

This paper reviews hybrid motion/force control, a control scheme which enables robots to perform tasks involving both motion, in the free space, and interactive force, at the contacts. Motivated by the large amount of literature on this topic, we facilitate comparison and elucidate the key differences among different approaches. An emphasis is placed on the decoupling of motion control and force control, and we conclude that a complete decoupling is indeed possible; however, this feature can be relaxed or sacrificed to reduce the robot's joint torques while still completing the task.
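
A classical instance of the scheme reviewed here can be sketched with placeholder gains and signals: a diagonal selection matrix assigns each task-space direction to either the motion loop or the force loop, so the two control laws act in complementary subspaces and are decoupled by construction.

```python
import numpy as np

# Sketch of a textbook hybrid motion/force law with placeholder gains and signals:
# the selection matrix S picks motion-controlled directions, I - S picks
# force-controlled ones, so the two loops never act along the same axis.

S = np.diag([1.0, 1.0, 0.0])           # x, y: motion control;  z: force control
I = np.eye(3)

Kp_x, Kp_f = 50.0, 0.2                 # hypothetical proportional gains

x, x_des = np.array([0.10, 0.00, 0.30]), np.array([0.12, 0.05, 0.30])  # positions [m]
f, f_des = np.array([0.0, 0.0, 3.0]), np.array([0.0, 0.0, 5.0])        # forces [N]

# Motion law acts only in S, force law only in its complement.
v_cmd = S @ (Kp_x * (x_des - x)) + (I - S) @ (Kp_f * (f_des - f))
print(v_cmd)                           # velocity command sent to the robot
```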


IEEE/ASME Transactions on Mechatronics | 2018

Vision-Based Framework to Estimate Robot Configuration and Kinematic Constraints

Valerio Ortenzi; Naresh Marturi; Michael Mistry; Jeffrey A. Kuo; Rustam Stolkin


Science & Engineering Faculty | 2015

Projected inverse dynamics control and optimal control for robots in contact with the environment: A comparison

Valerio Ortenzi; Rustam Stolkin; Jeffrey A. Kuo; Michael Mistry


ARC Centre of Excellence for Robotic Vision; School of Electrical Engineering & Computer Science; Science & Engineering Faculty | 2015

A real-time tracking and optimised gaze control for a redundant humanoid robot head

Naresh Marturi; Valerio Ortenzi; Jingjing Xiao; Maxime Adjigble; Rustam Stolkin; Aleš Leonardis

Collaboration


Dive into Valerio Ortenzi's collaborations.

Top Co-Authors

Michael Mistry, University of Birmingham
Rustam Stolkin, University of Birmingham
Jeffrey A. Kuo, National Nuclear Laboratory
Hsiu-Chin Lin, University of Birmingham
Morteza Azad, University of Birmingham
Christian Cipriani, Sant'Anna School of Advanced Studies
Sergio Tarantino, Sant'Anna School of Advanced Studies