Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Uriel Martinez-Hernandez is active.

Publication


Featured research published by Uriel Martinez-Hernandez.


Intelligent Robots and Systems | 2013

Active Bayesian perception for angle and position discrimination with a biomimetic fingertip

Uriel Martinez-Hernandez; Tony J. Dodd; Tony J. Prescott; Nathan F. Lepora

In this work, we apply active Bayesian perception to angle and position discrimination and extend the method to perform actions in a sensorimotor task using a biomimetic fingertip. The first part of this study tests active perception off-line with a large dataset of edge orientations and positions, using a Monte Carlo validation to ascertain the classification accuracy. We observe a significant improvement over passive methods that lack a sensorimotor loop for actively repositioning the sensor. The second part of this study then applies these findings about active perception to an example sensorimotor task in real-time. Using an appropriate online sensorimotor control architecture, the robot made decisions about what to do next and where to move next, which was applied to a contour-following task around several objects. The successful outcome of this simple but illustrative task demonstrates that active perception can be of practical benefit for tactile robotics.
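The recursive evidence accumulation at the heart of active Bayesian perception can be sketched as follows. The Gaussian observation model, angle classes, and noise level below are illustrative assumptions, not the paper's trained tactile likelihoods:

```python
import numpy as np

# Hypothetical sketch of recursive Bayesian perception over discrete
# percept classes (e.g. edge angles); the observation model is assumed.
rng = np.random.default_rng(0)

angles = np.array([0.0, 30.0, 60.0, 90.0])       # candidate edge angles (deg)
true_angle = 60.0
sigma = 15.0                                      # assumed measurement noise (deg)

def likelihood(z, angle):
    """Gaussian observation model p(z | angle)."""
    return np.exp(-0.5 * ((z - angle) / sigma) ** 2)

posterior = np.full(len(angles), 1.0 / len(angles))   # uniform prior
for _ in range(10):                                   # one tactile tap per iteration
    z = true_angle + rng.normal(0.0, sigma)           # noisy tap measurement
    posterior *= likelihood(z, angles)                # Bayes' rule (unnormalised)
    posterior /= posterior.sum()                      # normalise

best = angles[np.argmax(posterior)]
print(best)   # the posterior should concentrate on the 60-degree class
```

Each tap multiplies the running posterior by the tap's likelihood, so evidence accumulates across contacts; the active element in the paper additionally repositions the sensor between taps.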


International Conference on Robotics and Automation | 2013

Active touch for robust perception under position uncertainty

Nathan F. Lepora; Uriel Martinez-Hernandez; Tony J. Prescott

In this paper, we propose that active perception will help attain autonomous robotics in unstructured environments by giving robust perception. We test this claim with a biomimetic fingertip that senses surface texture under a range of contact depths. We compare the performance of passive Bayesian perception with a novel approach for active perception that includes a sensorimotor loop for controlling sensor position. Passive perception at a single depth gave poor results, with just 0.2 mm of uncertainty impairing performance. Extending passive perception over a range of depths gave non-robust performance. Only active perception could give robust, accurate performance, with the sensorimotor feedback compensating the position uncertainty. We expect that these results will extend to other stimuli, so that active perception will offer a general approach to robust perception in unstructured environments.


IEEE Transactions on Robotics | 2015

Tactile Superresolution and Biomimetic Hyperacuity

Nathan F. Lepora; Uriel Martinez-Hernandez; Mathew H. Evans; Lorenzo Natale; Giorgio Metta; Tony J. Prescott

Motivated by the impact of superresolution methods for imaging, we undertake a detailed and systematic analysis of localization acuity for a biomimetic fingertip and a flat region of tactile skin. We identify three key factors underlying superresolution that enable the perceptual acuity to surpass the sensor resolution: 1) the sensor is constructed with multiple overlapping, broad but sensitive receptive fields; 2) the tactile perception method interpolates between receptors (taxels) to attain subtaxel acuity; and 3) active perception ensures robustness to unknown initial contact location. All factors follow from active Bayesian perception applied to biomimetic tactile sensors with an elastomeric covering that spreads the contact over multiple taxels. In consequence, we attain extreme superresolution with a 35-fold improvement of localization acuity (0.12 mm) over sensor resolution (4 mm). We envisage that these principles will enable cheap high-acuity tactile sensors that are highly customizable to suit their robotic use. Practical applications encompass any scenario where an end-effector must be placed accurately via the sense of touch.
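Factor 2 above, interpolating between taxels to beat the sensor resolution, can be illustrated with a toy weighted-centroid localizer. The Gaussian receptive-field model and all dimensions are assumptions for illustration, not the sensor's calibrated responses:

```python
import numpy as np

# Illustrative sketch of superresolution via overlapping receptive fields;
# field shape, widths, and spacing are assumed, not measured.
taxels = np.arange(0.0, 24.0, 4.0)   # taxel centres at 4 mm spacing
width = 4.0                          # broad, overlapping fields (factor 1)

def responses(contact_mm):
    """Each taxel responds through a Gaussian receptive field."""
    return np.exp(-0.5 * ((contact_mm - taxels) / width) ** 2)

def localize(r):
    """Weighted-centroid interpolation between taxels (factor 2)."""
    return float(np.sum(r * taxels) / np.sum(r))

contact = 9.3                        # lies between the 8 mm and 12 mm taxels
estimate = localize(responses(contact))
print(estimate)                      # error is well below the 4 mm taxel pitch
```

Because the fields overlap, several taxels respond to one contact, and the population response pins down the contact position far more finely than the taxel spacing.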


IEEE Transactions on Autonomous Mental Development | 2013

The Coordinating Role of Language in Real-Time Multimodal Learning of Cooperative Tasks

Maxime Petit; Stéphane Lallée; Jean-David Boucher; Grégoire Pointeau; Pierrick Cheminade; Dimitri Ognibene; Eris Chinellato; Ugo Pattacini; Ilaria Gori; Uriel Martinez-Hernandez; Hector Barron-Gonzalez; Martin Inderbitzin; Andre L. Luvizotto; Vicky Vouloutsi; Yiannis Demiris; Giorgio Metta; Peter Ford Dominey

One of the defining characteristics of human cognition is our outstanding capacity to cooperate. A central requirement for cooperation is the ability to establish a “shared plan”—which defines the interlaced actions of the two cooperating agents—in real time, and even to negotiate this shared plan during its execution. In the current research we identify the requirements for cooperation, extending our earlier work in this area. These requirements include the ability to negotiate a shared plan using spoken language, to learn new component actions within that plan, based on visual observation and kinesthetic demonstration, and finally to coordinate all of these functions in real time. We present a cognitive system that implements these requirements, and demonstrate the system's ability to allow a Nao humanoid robot to learn a nontrivial cooperative task in real time. We further provide a concrete demonstration of how the real-time learning capability can be easily deployed on a different platform, in this case the iCub humanoid. The results are considered in the context of how the development of language in the human infant provides a powerful lever in the development of cooperative plans from lower-level sensorimotor capabilities.


Intelligent Robots and Systems | 2012

Embodied hyperacuity from Bayesian perception: Shape and position discrimination with an iCub fingertip sensor

Nathan F. Lepora; Uriel Martinez-Hernandez; Hector Barron-Gonzalez; Mathew H. Evans; Giorgio Metta; Tony J. Prescott

Recent advances in modeling animal perception have motivated an approach of Bayesian perception applied to biomimetic robots. This study presents an initial application of Bayesian perception on an iCub fingertip sensor mounted on a dedicated positioning robot. We systematically probed the test system with five cylindrical stimuli offset by a range of positions relative to the fingertip. Testing the real-time speed and accuracy of shape and position discrimination, we achieved sub-millimeter accuracy with just a few taps. This result is apparently the first explicit demonstration of perceptual hyperacuity in robot touch, in that object positions are perceived more accurately than the taxel spacing. We also found substantial performance gains when the fingertip can reposition itself to avoid poor perceptual locations, which indicates that improved robot perception could mimic active perception in animals.


Robotics: Science and Systems | 2013

Active Bayesian Perception for Simultaneous Object Localization and Identification

Nathan F. Lepora; Uriel Martinez-Hernandez; Tony J. Prescott

In this paper, we propose that active Bayesian perception has a general role for Simultaneous Object Localization and IDentification (SOLID), or deciding where and what. We test this claim using a biomimetic fingertip to perceive object identity via surface shape at uncertain contact locations. Our method for active Bayesian perception combines decision making by threshold crossing of the posterior belief with a sensorimotor loop that actively controls sensor location based on those beliefs. Our findings include: (i) active perception with a fixation control strategy gives an order-of-magnitude improvement in acuity over passive perception without sensorimotor feedback; (ii) perceptual acuity improves as the active control requires less belief to make a relocation decision; and (iii) relocation noise further improves acuity. The best method has aspects that resemble animal perception, supporting wide applicability of these findings.
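The decision rule of threshold crossing of the posterior belief can be sketched as a short sequential loop: sense until one hypothesis is believed strongly enough, then commit. The classes, noise model, and threshold below are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of sequential decision making by posterior threshold
# crossing; the Gaussian observation model is an assumption.
rng = np.random.default_rng(1)

classes = np.array([0.0, 1.0, 2.0])      # candidate object identities
true_class, sigma, threshold = 1.0, 0.5, 0.99

posterior = np.full(len(classes), 1.0 / len(classes))
taps = 0
while posterior.max() < threshold and taps < 100:   # sense until confident
    z = true_class + rng.normal(0.0, sigma)         # noisy tactile tap
    posterior *= np.exp(-0.5 * ((z - classes) / sigma) ** 2)   # Bayes update
    posterior /= posterior.sum()
    taps += 1

decision = classes[np.argmax(posterior)]
print(decision, taps)
```

Lowering the threshold trades accuracy for speed, which connects to finding (ii): a lower belief requirement triggers relocation (or decision) after fewer taps.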


Robotics and Biomimetics | 2016

An integrated probabilistic framework for robot perception, learning and memory

Uriel Martinez-Hernandez; Andreas C. Damianou; Daniel Camilleri; Luke Boorman; Neil D. Lawrence; Tony J. Prescott

Learning and perception from multiple sensory modalities are crucial processes for the development of intelligent systems capable of interacting with humans. We present an integrated probabilistic framework for perception, learning and memory in robotics. The core component of our framework is a computational Synthetic Autobiographical Memory model which uses Gaussian Processes as a foundation and mimics the functionalities of human memory. Our memory model, which operates via a principled Bayesian probabilistic framework, is capable of receiving and integrating data flows from multiple sensory modalities, which are combined to improve perception and understanding of the surrounding environment. To validate the model, we implemented our framework on the iCub humanoid robot, which was able to learn and recognise human faces, arm movements and touch gestures through interaction with people. The results demonstrate the flexibility of our method in successfully integrating multiple sensory inputs for accurate learning and recognition. Thus, our integrated probabilistic framework offers a promising core technology for robust intelligent systems, which are able to perceive, learn and interact with people and their environments.
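A minimal sketch of the Gaussian Process machinery underlying such a memory model is shown below: store observations, then "recall" by computing the GP posterior mean at a query point. The kernel choice, toy signal, and hyperparameters are illustrative assumptions, not the Synthetic Autobiographical Memory model itself:

```python
import numpy as np

# Minimal GP-regression sketch in the spirit of a GP-based memory:
# kernel, lengthscale, and data are assumed for illustration.
def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D input sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

X = np.linspace(0.0, 3.0, 7)      # "remembered" input locations
y = np.sin(X)                     # stored sensory trace (toy signal)
jitter = 1e-4                     # small noise term for stable inversion

# GP posterior mean at a query: k(x*, X) [K + jitter I]^-1 y
K = rbf(X, X) + jitter * np.eye(len(X))
Xq = np.array([1.75])             # recall query between stored points
mean = rbf(Xq, X) @ np.linalg.solve(K, y)
print(float(mean[0]))             # should lie close to sin(1.75)
```

The same posterior machinery also yields a predictive variance, which is what makes a GP memory naturally probabilistic: recall comes with a measure of confidence.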


world haptics conference | 2015

Active haptic shape recognition by intrinsic motivation with a robot hand

Uriel Martinez-Hernandez; Nathan F. Lepora; Tony J. Prescott

In this paper, we present an intrinsic motivation approach applied to haptics in robotics for tactile object exploration and recognition. Here, touch is used as the sensation process for contact detection, whilst proprioceptive information is used for the perception process. First, a probabilistic method is employed to reduce the uncertainty present in tactile measurements. Second, the object exploration process is actively controlled by intelligently moving the robot hand towards interesting locations. The active behaviour of the robotic hand is achieved by an intrinsic motivation approach, which improved object recognition accuracy over that obtained with a fixed sequence of exploration movements. The proposed method was validated in a simulated environment with a Monte Carlo method, whilst for the real environment a three-fingered robotic hand and various object shapes were employed. The results demonstrate that our method is robust and suitable for haptic perception in autonomous robotics.
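One common way to operationalize "interesting locations" in intrinsically motivated exploration is to move toward the location whose current belief is most uncertain. The sketch below uses entropy as the interestingness measure; the locations, hypotheses, and belief values are made-up numbers for illustration:

```python
import numpy as np

# Hedged sketch of intrinsically motivated exploration: visit the
# location with the most uncertain (highest-entropy) belief.
def entropy(p):
    """Shannon entropy of a discrete belief distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return float(-np.sum(p * np.log(p + 1e-12)))

# Per-location beliefs over three shape hypotheses (illustrative)
beliefs = {
    "edge":   [0.90, 0.05, 0.05],   # already confident -> uninteresting
    "corner": [0.40, 0.35, 0.25],   # uncertain -> interesting
    "face":   [0.70, 0.20, 0.10],
}

next_location = max(beliefs, key=lambda loc: entropy(beliefs[loc]))
print(next_location)   # the most uncertain location is chosen next
```

Sensing where uncertainty is highest yields the largest expected information gain per movement, which is why such a policy can beat a fixed exploration sequence.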


International Joint Conference on Neural Networks | 2016

Bayesian perception of touch for control of robot emotion

Uriel Martinez-Hernandez; Adrian Rubio-Solis; Tony J. Prescott

In this paper, we present a Bayesian approach for perception of touch and control of robot emotion. Touch is an important sensing modality for the development of social robots, and it is used in this work as the stimulus during human-robot interaction. A Bayesian framework is proposed for the perception of various types of touch. This method, together with a sequential analysis approach, allows the robot to accumulate evidence from the interaction with humans to achieve accurate touch perception for adaptable control of robot emotions. Facial expressions are used to represent the emotions of the iCub humanoid. Emotions in the robotic platform, based on facial expressions, are handled by a control architecture that works with the output from the touch perception process. We validate the accuracy of our system with simulated and real robot touch experiments. Results from this work show that our method is suitable and accurate for perception of touch to control robot emotions, which is essential for the development of sociable robots.
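The control side of such an architecture can be sketched as a simple policy that maps a confidently perceived touch class to a facial expression. The touch classes, expression names, mapping, and threshold below are illustrative assumptions, not the paper's architecture:

```python
# Hypothetical sketch of the touch-to-emotion control layer: the
# classes, expressions, and mapping are assumed for illustration.
EXPRESSION = {"stroke": "happy", "poke": "surprised", "slap": "sad"}

def emotion_controller(touch_class, confidence, threshold=0.9):
    """Update the robot's expression only when perception is confident."""
    if confidence >= threshold:
        return EXPRESSION.get(touch_class, "neutral")
    return "neutral"   # withhold an emotional response under uncertainty

print(emotion_controller("stroke", 0.95))   # -> "happy"
print(emotion_controller("poke", 0.60))     # -> "neutral"
```

Gating the expression on the accumulated belief is what couples the sequential-analysis perception stage to emotion control: the face only changes once enough evidence has been gathered.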


IEEE Transactions on Systems, Man, and Cybernetics | 2017

Feeling the Shape: Active Exploration Behaviors for Object Recognition With a Robotic Hand

Uriel Martinez-Hernandez; Tony J. Dodd; Tony J. Prescott

Autonomous exploration is a crucial feature for achieving robust and safe robotic systems capable of interacting with and recognizing their surrounding environment. In this paper, we present a method for object recognition using a three-fingered robotic hand that actively explores interesting object locations to reduce uncertainty. We present a novel probabilistic perception approach with a Bayesian formulation to iteratively accumulate evidence from robot touch. Exploration of better locations for perception is driven by familiarity and novelty exploration behaviors, which intelligently move the robot hand toward locations with low and high levels of interestingness, respectively. These active behaviors, similar to the exploratory procedures observed in humans, allow robots to autonomously explore locations they believe contain interesting information for recognition. The active behaviors are validated with object recognition experiments in both offline and real-time modes. Furthermore, the effects of inhibiting the active behaviors are analyzed with a passive exploration strategy. The results demonstrate not only the accuracy of our proposed methods but also their benefits for active robot control to intelligently explore and interact with the environment.

Collaboration


Dive into Uriel Martinez-Hernandez's collaborations.

Top Co-Authors

Luke Boorman (University of Sheffield)
James Law (University of Sheffield)
Owen McAree (University of Sheffield)
Tony J. Dodd (University of Sheffield)
Adriel Chua (University of Sheffield)