Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Joao Bimbo is active.

Publication


Featured research published by Joao Bimbo.


Intelligent Robots and Systems | 2012

Surface material recognition through haptic exploration using an intelligent contact sensing finger

Hongbin Liu; Xiaojing Song; Joao Bimbo; Lakmal D. Seneviratne; Kaspar Althoefer

Object surface properties are among the most important pieces of information a robot requires in order to interact effectively with an unknown environment. This paper presents a novel haptic exploration strategy for recognizing the physical properties of unknown object surfaces using an intelligent finger. The developed intelligent finger is capable of identifying the contact location, the normal and tangential forces, and the vibrations generated by the contact in real time. In the proposed strategy, the finger gently slides along the surface with a short stroke while increasing and decreasing the sliding velocity. By applying a dynamic friction model to describe this contact, rich and accurate surface physical properties can be identified within this stroke. This allows different surface materials to be easily distinguished even when they have very similar textures. Several supervised learning algorithms have been applied and compared for surface recognition based on the obtained surface properties. The naïve Bayes classifier was found to be superior to the radial basis function network and the k-NN method, achieving an overall classification accuracy of 88.5% for distinguishing twelve different surface materials.
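
As a rough illustration of the final classification step described above, the sketch below trains a Gaussian naïve Bayes model on synthetic per-stroke features. The three feature columns (mean friction coefficient, Stribeck velocity, vibration RMS) and all numeric values are assumptions chosen for illustration, not the paper's feature set or data.

```python
# Hypothetical stand-in for the surface-classification step: a Gaussian naive
# Bayes model trained on per-stroke features. Feature definitions and values
# are illustrative assumptions, not the paper's data.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_materials, strokes_per_material = 12, 30

# One row per exploratory stroke:
# [mean friction coefficient, Stribeck velocity, vibration RMS]
X = np.vstack([
    rng.normal(loc=[0.30 + 0.05 * m, 0.01 * (m + 1), 0.10 * (m % 4)],
               scale=[0.02, 0.002, 0.01],
               size=(strokes_per_material, 3))
    for m in range(n_materials)
])
y = np.repeat(np.arange(n_materials), strokes_per_material)

clf = GaussianNB()
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```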


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2012

Tactile image based contact shape recognition using neural network

Hongbin Liu; Juan Greco; Xiaojing Song; Joao Bimbo; Lakmal D. Seneviratne; Kaspar Althoefer

This paper proposes a novel algorithm for recognizing the shape of an object in contact with a robotic finger through tactile pressure sensing. The developed algorithm is capable of distinguishing contact shapes from a set of low-resolution pressure maps. Within this algorithm, a novel feature extraction technique transforms a pressure map into a 512-element feature vector. The extracted features are invariant to scale, position and partial occlusion, and are independent of the sensor's resolution and image size. To recognize different contact shapes from a pressure map, a neural network classifier is developed that uses the feature vector as input. Tests using four different contact shapes show that the trained neural network achieves a success rate of over 90%. Contact sensory information plays a crucial role in robotic hand grasping and manipulation, and the algorithm introduced in this paper has the potential to provide valuable feedback to automate and improve robotic grasping and manipulation.
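
The 512-element feature transform itself is not spelled out above, so the sketch below only illustrates the overall pipeline: a low-resolution pressure map is reduced to a translation- and scale-tolerant feature vector (normalised central moments are used here as a stand-in) and a small neural network classifies the contact shape. The toy 8×8 "shapes" and every parameter are assumptions for illustration.

```python
# Pipeline sketch only: tactile pressure map -> invariant-ish features ->
# neural-network classifier. The moment features and toy shapes below are
# stand-ins, not the paper's 512-element feature transform.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

def pressure_map_features(p):
    """Translation/scale-normalised central moments of a 2-D pressure map."""
    p = np.asarray(p, dtype=float)
    total = p.sum() + 1e-12
    ys, xs = np.mgrid[0:p.shape[0], 0:p.shape[1]]
    cx, cy = (xs * p).sum() / total, (ys * p).sum() / total
    feats = []
    for i in range(4):
        for j in range(4):
            if i + j < 2:
                continue  # order-0/1 moments are fixed by the normalisation
            mu = ((xs - cx) ** i * (ys - cy) ** j * p).sum()
            feats.append(mu / total ** (1 + (i + j) / 2))
    return np.array(feats)

def synthetic_map(shape_id, rng):
    """Toy 8x8 pressure maps: 0 = round contact, 1 = edge contact."""
    p = np.zeros((8, 8))
    if shape_id == 0:
        r, c = rng.integers(1, 7, size=2)
        p[r - 1:r + 2, c - 1:c + 2] = 1.0   # 3x3 blob at a random position
    else:
        p[rng.integers(0, 8), 1:7] = 1.0    # horizontal edge at a random row
    return p + 0.05 * rng.random((8, 8))    # sensor noise

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=300)
X = np.array([pressure_map_features(synthetic_map(s, rng)) for s in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=3000,
                                  random_state=0))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```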


Autonomous Robots | 2015

Finger contact sensing and the application in dexterous hand manipulation

Hongbin Liu; Kien Cuong Nguyen; Véronique Perdereau; Joao Bimbo; Junghwan Back; Matthew Godden; Lakmal D. Seneviratne; Kaspar Althoefer

In this paper we introduce a novel contact-sensing algorithm for a robotic fingertip which is equipped with a 6-axis force/torque sensor and covered with a deformable rubber skin. The design and the sensing algorithm of the fingertip for effective contact information identification are introduced. Validation tests show that the contact sensing fingertip can estimate contact information, including the contact location on the fingertip, the direction and magnitude of the friction and normal forces, and the local torque generated at the surface, at high speed (158–242 Hz) and with high precision. Experiments show that the proposed algorithm is robust and accurate when the friction coefficient is less than or equal to 1. Obtaining such contact information in real time is essential for fine object manipulation. Using the contact sensing fingertip for surface exploration has been demonstrated, indicating the advantage gained by using the identified contact information from the proposed contact-sensing method.
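
The fingertip described above recovers contact information from a single 6-axis force/torque measurement. Below is a minimal sketch of the classic intrinsic contact-sensing computation behind this idea, assuming a rigid spherical tip and a point contact with negligible spin torque; the paper's own algorithm, which also accounts for the deformable skin and friction, is not reproduced here.

```python
# Minimal intrinsic contact-sensing sketch (assumptions: rigid spherical tip,
# point contact, no local spin torque). Not the paper's full algorithm.
import numpy as np

def contact_point_on_sphere(force, torque, centre, radius):
    """Solve r x f = tau for a contact point r lying on the fingertip sphere."""
    f = np.asarray(force, dtype=float)
    tau = np.asarray(torque, dtype=float)
    c = np.asarray(centre, dtype=float)
    f2 = f @ f
    if f2 < 1e-12:
        raise ValueError("no measurable contact force")
    # All solutions of r x f = tau (with f . tau = 0) form the line r0 + lam * f.
    r0 = np.cross(f, tau) / f2
    # Intersect that line with the fingertip sphere |r - c| = radius.
    d = r0 - c
    a, b, cc = f2, 2.0 * (d @ f), d @ d - radius ** 2
    disc = b * b - 4.0 * a * cc
    if disc < 0.0:
        raise ValueError("contact line does not touch the fingertip surface")
    roots = ((-b - np.sqrt(disc)) / (2 * a), (-b + np.sqrt(disc)) / (2 * a))
    candidates = [r0 + lam * f for lam in roots]
    # Keep the intersection where the force pushes *into* the surface,
    # i.e. where the outward normal (r - c)/radius opposes f.
    return min(candidates, key=lambda r: ((r - c) / radius) @ f)

# Example: hemispherical tip of radius 10 mm centred at the sensor origin.
r_true = np.array([0.0, 0.006, 0.008])       # true contact point, |r| = 0.01 m
f_meas = np.array([0.5, -0.2, -1.0])         # force pushing into the tip [N]
tau_meas = np.cross(r_true, f_meas)          # ideal point contact, no spin torque
print(contact_point_on_sphere(f_meas, tau_meas, centre=[0, 0, 0], radius=0.01))
```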


Intelligent Robots and Systems | 2013

Combining touch and vision for the estimation of an object's pose during manipulation

Joao Bimbo; Lakmal D. Seneviratne; Kaspar Althoefer; Hongbin Liu

Robot grasping and manipulation rely mainly on two types of sensory data: vision and tactile sensing. Localisation and recognition of the object are typically done through vision alone, while tactile sensors are commonly used for grasp control. Vision performs reliably in uncluttered environments, but its performance may deteriorate when the object is occluded, which is often the case during a manipulation task, when the object is in-hand and the robot fingers stand between the camera and the object. This paper presents a method that uses the robot's sense of touch to refine the knowledge of a manipulated object's pose from an initial estimate provided by vision. The objective is to find a transformation of the object's location that is coherent with the current proprioceptive and tactile sensory data. The method was tested with different object geometries, and applications are proposed where it can be used to improve the overall performance of a robotic system. Experimental results show an improvement of around 70% in the estimate of the object's location when compared to using vision alone.
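
The abstract above describes searching for an object-pose correction that makes the vision-based estimate consistent with where the fingers actually touch the object. The sketch below illustrates that idea in its simplest form, refining a planar (x, y, θ) pose of a known square object so that measured contact points fall back onto its surface; the object model, the planar restriction and the least-squares formulation are illustrative assumptions, not the paper's method.

```python
# Hedged sketch of the idea only: given fingertip contact points in the world
# frame and an initial object pose from vision, find a small pose correction
# that puts the contact points back on the object's surface.
import numpy as np
from scipy.optimize import least_squares

HALF_SIDE = 0.05  # assumed object model: a 10 cm square (object frame)

def square_sdf(points):
    """Signed distance from 2-D points to the square boundary (object frame)."""
    q = np.abs(points) - HALF_SIDE
    outside = np.linalg.norm(np.maximum(q, 0.0), axis=1)
    inside = np.minimum(np.max(q, axis=1), 0.0)
    return outside + inside

def residuals(pose, contacts_world):
    """Surface distances of the contact points under a candidate object pose."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    p_obj = (contacts_world - np.array([x, y])) @ R   # world -> object frame
    return square_sdf(p_obj)

# Simulated data: true pose vs. a slightly wrong visual estimate.
true_pose = np.array([0.40, 0.10, 0.30])
vision_pose = true_pose + np.array([0.02, -0.015, 0.08])
# Contact points on two faces of the square, expressed in the world frame.
pts_obj = np.array([[HALF_SIDE, -0.02], [HALF_SIDE, 0.03], [0.01, HALF_SIDE]])
c, s = np.cos(true_pose[2]), np.sin(true_pose[2])
contacts_world = pts_obj @ np.array([[c, -s], [s, c]]).T + true_pose[:2]

refined = least_squares(residuals, vision_pose, args=(contacts_world,)).x
print("vision estimate:", vision_pose)
print("refined pose:   ", refined)
```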


International Conference on Robotics and Automation | 2014

Novel uniaxial force sensor based on visual information for minimally invasive surgery

Angela Faragasso; Joao Bimbo; Yohan Noh; Allen Jiang; Sina Sareh; Hongbin Liu; Thrishantha Nanayakkara; Helge A. Wurdemann; Kaspar Althoefer

This paper presents an innovative approach that utilises visual feedback to determine physical interaction forces with soft tissue during Minimally Invasive Surgery (MIS). The novel force sensing device is composed of a linear retractable mechanism and a spherical visual feature. The sensor mechanism can be adapted to endoscopic cameras used in MIS. As the distance between the camera and the feature varies due to the sliding joint, interaction forces with anatomical surfaces can be computed from the visual appearance of the feature in the image. Hence, this device allows the measurement of forces without introducing new stand-alone sensors. A mathematical model was derived from validation data, and preliminary experiments were conducted to verify the model's accuracy. Experimental results confirm the effectiveness of our vision-based approach.
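
The sensing principle above can be summarised in a few lines: the apparent diameter of the spherical feature gives its distance from the camera through the pinhole model, the change in distance gives the spring compression, and Hooke's law converts that compression into the applied force. The sketch below shows this chain with made-up calibration values (focal length, sphere diameter, spring constant), which are assumptions rather than the paper's calibrated model.

```python
# Back-of-the-envelope sketch of the vision-based force sensing principle.
# All constants are illustrative assumptions, not calibrated values.
FOCAL_PX = 800.0         # focal length in pixels
SPHERE_DIAMETER = 0.004  # real diameter of the visual feature [m]
SPRING_K = 250.0         # spring constant of the retractable mechanism [N/m]
REST_DISTANCE = 0.030    # camera-to-sphere distance with no load [m]

def force_from_diameter(diameter_px):
    """Estimate axial force [N] from the sphere's apparent diameter [pixels]."""
    distance = FOCAL_PX * SPHERE_DIAMETER / diameter_px  # pinhole projection
    compression = REST_DISTANCE - distance               # spring shortening
    return SPRING_K * max(compression, 0.0)              # Hooke's law

for d_px in (106.7, 115.0, 125.0, 140.0):
    print(f"apparent diameter {d_px:6.1f} px -> force "
          f"{force_from_diameter(d_px):.3f} N")
```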


International Conference of the IEEE Engineering in Medicine and Biology Society | 2014

Endoscopic add-on stiffness probe for real-time soft surface characterisation in MIS

Angela Faragasso; Agostino Stilli; Joao Bimbo; Yohan Noh; Hongbin Liu; Thrishantha Nanayakkara; Prokar Dasgupta; Helge A. Wurdemann; Kaspar Althoefer

This paper explores a novel stiffness sensor mounted on the tip of a laparoscopic camera. The proposed device is able to compute stiffness when interacting with soft surfaces. The sensor can be used in Minimally Invasive Surgery, for instance, to localise tumour tissue, which commonly has a higher stiffness than healthy tissue. The purely mechanical sensor structure makes full use of the endoscopic camera by visually analysing the behaviour of trackers within its field of view. Two pairs of spheres (used as easily identifiable features in the camera images) are connected to two springs with known but different spring constants. Four individual indenters attached to the spheres are used to palpate the surface. During palpation, the spheres move linearly towards the objective lens (i.e. the distance between lens and spheres changes), resulting in variations of their diameters in the camera images. Relating the measured diameters to the different spring constants, a mathematical model determines the surface stiffness in real time. Tests were performed using a surgical endoscope to palpate silicone phantoms of different stiffness. Results show that the accuracy of the sensing system increases with the softness of the examined tissue.
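
One simple way to read the two-spring arrangement above is as each device spring acting in series with the tissue: under a common (unknown) probe displacement, the two springs compress by different, camera-measurable amounts, and comparing those compressions cancels the unknown displacement. The closed-form expression sketched below follows from that series-spring assumption, which is an illustrative reading rather than the paper's calibrated model.

```python
# Series-spring sketch (illustrative assumption, see above): each indenter's
# device spring (stiffness k_i) is in series with the tissue (stiffness k_t),
# and the camera measures the device-spring compressions x_i while the probe
# is pushed by a common, unknown displacement d.
#
#   x_i = d * k_t / (k_i + k_t)   =>   k_t = (x2*k2 - x1*k1) / (x1 - x2)

K1, K2 = 400.0, 150.0   # known, deliberately different spring constants [N/m]

def tissue_stiffness(x1, x2):
    """Tissue stiffness [N/m] from the two measured spring compressions [m]."""
    return (x2 * K2 - x1 * K1) / (x1 - x2)

# Quick self-check with a simulated palpation of 500 N/m tissue:
k_t_true, d = 500.0, 0.003
x1 = d * k_t_true / (K1 + k_t_true)
x2 = d * k_t_true / (K2 + k_t_true)
print(f"recovered tissue stiffness: {tissue_stiffness(x1, x2):.1f} N/m")
```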


Advanced Robotics | 2015

Global estimation of an object’s pose using tactile sensing

Joao Bimbo; Petar Kormushev; Kaspar Althoefer; Hongbin Liu


Intelligent Robots and Systems | 2012

A novel dynamic slip prediction and compensation approach based on haptic surface exploration

Xiaojing Song; Hongbin Liu; Joao Bimbo; Kaspar Althoefer; Lakmal D. Seneviratne


International Conference on Multisensor Fusion and Integration for Intelligent Systems | 2012

Object pose estimation and tracking by fusing visual and tactile information

Joao Bimbo; Silvia Rodríguez-Jiménez; Hongbin Liu; Xiaojing Song; Nicolas Burrus; Lakmal D. Seneviratne; Mohamed Abderrahim; Kaspar Althoefer


Sensors | 2016

Multi-Axis Force/Torque Sensor Based on Simply-Supported Beam and Optoelectronics

Yohan Noh; Joao Bimbo; Sina Sareh; Helge A. Wurdemann; Jan Fraś; Damith Suresh Chathuranga; Hongbin Liu; James Housden; Kaspar Althoefer; Kawal S. Rhode


Collaboration


Dive into Joao Bimbo's collaboration.

Top Co-Authors

Kaspar Althoefer

Queen Mary University of London

Lakmal D. Seneviratne

University of Science and Technology

Yohan Noh

King's College London
