Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Wenzhen Yuan is active.

Publication


Featured research published by Wenzhen Yuan.


Intelligent Robots and Systems | 2014

Localization and Manipulation of Small Parts Using GelSight Tactile Sensing

Rui Li; Robert Platt; Wenzhen Yuan; Andreas ten Pas; Nathan Roscup; Mandayam A. Srinivasan; Edward H. Adelson

Robust manipulation and insertion of small parts can be challenging because of the small tolerances typically involved. The key to robust control of these kinds of manipulation interactions is accurate tracking and control of the parts involved. Typically, this is accomplished using visual servoing or force-based control. However, these approaches have drawbacks. Instead, we propose a new approach that uses tactile sensing to accurately localize the pose of a part grasped in the robot hand. Using a feature-based matching technique in conjunction with a newly developed tactile sensing technology known as GelSight that has much higher resolution than competing methods, we synthesize high-resolution height maps of object surfaces. As a result of these high-resolution tactile maps, we are able to localize small parts held in a robot hand very accurately. We quantify localization accuracy in benchtop experiments and experimentally demonstrate the practicality of the approach in the context of a small parts insertion problem.
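
As a rough illustration of the feature-based matching step, here is a minimal Python/OpenCV sketch that matches an observed tactile height map against a template and recovers an in-plane pose. The rendering of height maps as 8-bit grayscale images, the use of ORB features, and the `localize_part` helper are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch: localizing a part by matching a tactile height map against a
# template. Illustrative only; assumes height maps are already rendered
# as 8-bit grayscale images.
import cv2
import numpy as np

def localize_part(template_map, observed_map):
    """Estimate the in-plane pose (rotation, translation) of a part from
    tactile height maps rendered as 8-bit grayscale images."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(template_map, None)
    kp2, des2 = orb.detectAndCompute(observed_map, None)

    # ORB uses binary descriptors, so match with Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    src = np.float32([kp1[m.queryIdx].pt for m in matches[:50]])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches[:50]])

    # RANSAC fit of a similarity transform; its rotation and translation
    # give the in-plane pose of the grasped part.
    M, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    theta = np.arctan2(M[1, 0], M[0, 0])   # in-plane rotation (radians)
    return theta, (M[0, 2], M[1, 2])       # translation (pixels)
```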


International Conference on Robotics and Automation | 2015

Measurement of shear and slip with a GelSight tactile sensor

Wenzhen Yuan; Rui Li; Mandayam A. Srinivasan; Edward H. Adelson

Artificial tactile sensing is still underdeveloped, especially for sensing shear and slip on a contact surface. For a robot hand to manually explore the environment or perform a manipulation task such as grasping, sensing shear forces and detecting incipient slip is important. In this paper, we introduce a method of sensing the normal, shear, and torsional load on the contact surface with a GelSight tactile sensor [1]. In addition, we demonstrate the detection of incipient slip. The method infers the state of the contact interface by analyzing the sequence of images of GelSight's elastomer medium, whose deformation under external load indicates the conditions of contact. Results from a robot-gripper-like experimental setup show that the method is effective in detecting interactions with an object during stable grasp as well as at incipient slip. The method is also applicable to other optics-based tactile sensors.
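
A minimal sketch of the marker-motion idea, assuming markers printed on the elastomer are tracked between frames. The `marker_displacements` and `contact_state` helpers and the slip threshold are assumptions for illustration; the paper's analysis of the deformation field is more detailed.

```python
# Sketch: inferring shear load and incipient slip from the motion of
# markers printed on the GelSight elastomer. Thresholds are illustrative,
# not values from the paper.
import cv2
import numpy as np

def marker_displacements(frame0, frame1, markers0):
    """Track markers between two grayscale gel images.

    markers0: float32 array of shape (N, 1, 2), initial marker centers.
    """
    markers1, status, _err = cv2.calcOpticalFlowPyrLK(
        frame0, frame1, markers0, None, winSize=(21, 21))
    ok = status.ravel() == 1
    return (markers1[ok] - markers0[ok]).reshape(-1, 2)

def contact_state(disp, slip_spread_px=2.0):
    """Mean displacement approximates shear direction/magnitude; a large
    spread in displacements suggests part of the contact is slipping
    (incipient slip) while the rest still sticks."""
    mean_shear = disp.mean(axis=0)                        # (dx, dy) px
    spread = np.linalg.norm(disp - mean_shear, axis=1).std()
    return mean_shear, spread > slip_spread_px
```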


International Conference on Robotics and Automation | 2017

Shape-independent hardness estimation using deep learning and a GelSight tactile sensor

Wenzhen Yuan; Chenzhuo Zhu; Andrew Owens; Mandayam A. Srinivasan; Edward H. Adelson

Hardness is among the most important attributes of an object that humans learn about through touch. However, approaches for robots to estimate hardness are limited, due to the lack of information provided by current tactile sensors. In this work, we address these limitations by introducing a novel method for hardness estimation based on the GelSight tactile sensor; the method does not require accurate control of contact conditions or of object shape. A GelSight sensor has a soft contact interface and provides high-resolution tactile images of contact geometry, as well as contact force and slip conditions. In this paper, we use the sensor to measure the hardness of objects of multiple shapes under loosely controlled contact conditions. The contact is made manually or by a robot hand, while the force and trajectory are unknown and uneven. We analyze the data using a deep convolutional (and recurrent) neural network. Experiments show that the model can estimate the hardness of objects with different shapes and hardness values ranging from 8 to 87 on the Shore 00 scale.
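
A minimal PyTorch sketch of the convolutional-plus-recurrent structure described above: per-frame CNN features are fed to an LSTM, which regresses one Shore 00 value per press sequence. The `HardnessNet` module and all layer sizes are illustrative assumptions, not the paper's architecture.

```python
# Sketch of a convolutional + recurrent hardness regressor. Layer sizes
# are illustrative; the paper's network differs.
import torch
import torch.nn as nn

class HardnessNet(nn.Module):
    def __init__(self, feat_dim=128, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(                 # per-frame encoder
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(32 * 16, feat_dim), nn.ReLU())
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)          # scalar hardness

    def forward(self, frames):                    # (B, T, 3, H, W)
        B, T = frames.shape[:2]
        feats = self.cnn(frames.flatten(0, 1)).view(B, T, -1)
        out, _ = self.rnn(feats)
        return self.head(out[:, -1])              # predict at last step
```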


Sensors | 2017

GelSight: High-Resolution Robot Tactile Sensors for Estimating Geometry and Force

Wenzhen Yuan; Siyuan Dong; Edward H. Adelson

Tactile sensing is an important perception mode for robots, but existing tactile technologies have multiple limitations. What kind of tactile information robots need, and how to use that information, remain open questions. We believe that a soft sensor surface and high-resolution sensing of geometry are important components of a competent tactile sensor. In this paper, we discuss the development of a vision-based optical tactile sensor, GelSight. Unlike traditional tactile sensors, which measure contact force, GelSight primarily measures geometry, with very high spatial resolution. The sensor has a contact surface of soft elastomer and directly measures its deformation, both vertical and lateral, which corresponds to the exact object shape and the tension on the contact surface. Contact force and slip can also be inferred from the sensor's deformation. In particular, we focus on the hardware and software that support GelSight's application on robot hands. This paper reviews the development of GelSight, with an emphasis on the sensing principle and sensor design. We introduce the design of the sensor's optical system; the algorithms for shape, force, and slip measurement; and the hardware design and fabrication of different sensor versions. We also present an experimental evaluation of GelSight's performance on geometry and force measurement. With its high-resolution measurement of shape and contact force, the sensor has successfully assisted multiple robotic tasks, including material perception and recognition and in-hand localization for robot manipulation.
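
A minimal sketch of the shape-reconstruction step: GelSight maps each pixel's color to a surface gradient (the published systems use a calibrated lookup table, which is omitted here), and the gradients are then integrated into a height map. The `integrate_gradients` helper below uses a standard Frankot-Chellappa frequency-domain integrator as an assumed stand-in for the paper's solver.

```python
# Sketch: integrate per-pixel surface gradients (gx, gy) into a height
# map via the Frankot-Chellappa FFT method. The RGB-to-gradient lookup
# that produces (gx, gy) on a real GelSight is omitted.
import numpy as np

def integrate_gradients(gx, gy):
    """Return the height map whose gradients best match (gx, gy)."""
    H, W = gx.shape
    wx = np.fft.fftfreq(W) * 2 * np.pi
    wy = np.fft.fftfreq(H) * 2 * np.pi
    u, v = np.meshgrid(wx, wy)
    denom = u**2 + v**2
    denom[0, 0] = 1.0                  # avoid divide-by-zero at DC
    Z = (-1j * u * np.fft.fft2(gx) - 1j * v * np.fft.fft2(gy)) / denom
    Z[0, 0] = 0.0                      # height is defined up to a constant
    return np.real(np.fft.ifft2(Z))
```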


Computer Vision and Pattern Recognition | 2017

Connecting Look and Feel: Associating the Visual and Tactile Properties of Physical Materials

Wenzhen Yuan; Shaoxiong Wang; Siyuan Dong; Edward H. Adelson

For machines to interact with the physical world, they must understand the physical properties of the objects and materials they encounter. We use fabrics as an example of a deformable material with a rich set of mechanical properties. A thin, flexible fabric, when draped, tends to look different from a heavy, stiff fabric; it also feels different when touched. Using a collection of 118 fabric samples, we captured color and depth images of draped fabrics along with tactile data from a high-resolution touch sensor. We then sought to associate the information from vision and touch by jointly training CNNs across the three modalities. Through the CNN, each input, regardless of modality, generates an embedding vector that encodes the fabric's physical properties. By comparing embedding vectors, our system is able to look at a fabric image and predict how it will feel, and vice versa. We also show that a system jointly trained on vision and touch data can outperform a similar system trained only on visual data when tested purely with visual inputs.
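
A minimal PyTorch sketch of the joint-embedding idea: embeddings of the same fabric from different modalities are pulled together, and embeddings of different fabrics are pushed apart. The `contrastive_loss` below is a generic contrastive formulation under that assumption, not necessarily the paper's exact loss or backbone.

```python
# Sketch: a contrastive loss over paired vision/touch embeddings, where
# vis_emb[i] and tac_emb[i] come from the SAME fabric sample. Margin and
# formulation are illustrative.
import torch
import torch.nn.functional as F

def contrastive_loss(vis_emb, tac_emb, margin=1.0):
    vis = F.normalize(vis_emb, dim=1)
    tac = F.normalize(tac_emb, dim=1)
    dists = torch.cdist(vis, tac)       # all cross-modal pair distances
    pos = dists.diag()                  # matching (same-fabric) pairs
    # Every non-matching pair in the batch serves as a negative.
    neg_mask = ~torch.eye(len(vis), dtype=torch.bool, device=vis.device)
    neg = dists[neg_mask]
    return pos.mean() + F.relu(margin - neg).mean()
```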


Intelligent Robots and Systems | 2016

Estimating object hardness with a GelSight touch sensor

Wenzhen Yuan; Mandayam A. Srinivasan; Edward H. Adelson

Hardness sensing is a valuable capability for a robot touch sensor. We describe a novel method of hardness sensing that does not require accurate control of contact conditions. A GelSight sensor is a tactile sensor that provides high-resolution tactile images, which enable a robot to infer object properties such as geometry and fine texture, as well as contact force and slip conditions. The sensor is pressed onto silicone samples by a human or a robot, and we measure the sample hardness using only data from the sensor, without a separate force sensor and without precise knowledge of the contact trajectory. We describe the features that indicate object hardness. For hemispherical objects, we develop a model to measure sample hardness, with an estimation error of about 4% over the range of 8 Shore 00 to 45 Shore A. With this technology, a robot can more easily infer the hardness of touched objects, thereby improving its object recognition as well as its manipulation strategy.
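
As a loose illustration of why hemispherical contacts reveal hardness: a soft sample flattens, so its contact area grows quickly relative to the peak indentation depth, while a stiff sample holds its shape. The `hardness_feature` helper and the threshold below are hypothetical, not the paper's model.

```python
# Sketch: a hardness-related feature from a press sequence of indentation
# depth maps. Mapping this slope to Shore hardness would need a
# calibration the paper derives and this sketch does not include.
import numpy as np

def hardness_feature(height_maps, contact_thresh=0.05):
    """Slope of peak indentation depth vs. contact area over a press."""
    depths, areas = [], []
    for h in height_maps:              # h: 2-D indentation depth map (mm)
        mask = h > contact_thresh
        if mask.any():
            depths.append(h[mask].max())
            areas.append(mask.sum())
    # Soft samples show little added depth per unit of added area.
    return np.polyfit(areas, depths, 1)[0]
```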


International Conference on Robotics and Automation | 2016

Fast localization and tracking using event sensors

Wenzhen Yuan; Srikumar Ramalingam

The success of many robotics applications hinges on the speed at which the underlying sensing and inference tasks are carried out. Many high-speed applications, such as autonomous driving and evasive maneuvering of quadrotors, require runtime performance that traditional cameras can seldom provide. In this paper, we develop a fast localization and tracking algorithm using an event sensor, which produces on the order of a million asynchronous events per second at pixels where luminance changes. Events usually fire at high-gradient pixels (edges), where luminance changes occur as the sensor moves. We develop a fast spatio-temporal binning scheme to detect lines from these events at the edges. We represent the 3D model of the world using vertical lines, and the sensor pose can be estimated from correspondences between 2D event lines and 3D world lines. The inherent simplicity of our method enables a runtime performance of 1000 Hz.
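
A minimal sketch of the binning idea: over a short time window, a vertical line in the scene produces a column of the image with many events, so voting events into x-bins exposes line candidates. The `detect_vertical_lines` helper and its thresholds are illustrative; the paper's spatio-temporal binning scheme is more elaborate and much faster.

```python
# Sketch: detect near-vertical line candidates by binning a short window
# of events along the x axis. Thresholds are illustrative.
import numpy as np

def detect_vertical_lines(events, width, n_bins=64, min_votes=50):
    """events: array of rows (x, y, t, polarity) from one time window.

    Returns x-positions of columns with enough event votes to be
    candidate vertical lines."""
    x = events[:, 0]
    hist, edges = np.histogram(x, bins=n_bins, range=(0, width))
    centers = (edges[:-1] + edges[1:]) / 2
    return centers[hist > min_votes]
```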


International Conference on Intelligent Robotics and Applications | 2010

A novel hand-gesture recognition method based on finger state projection for control of robotic hands

Wenzhen Yuan; Wenzeng Zhang

This paper proposes a novel method for hand-gesture recognition based on finger state projection, called the FSP Method, for controlling robotic hands. A control system using the FSP Method can considerably simplify the control of a robotic hand while avoiding the drawbacks of traditional hand-gesture recognition methods. The FSP Method measures the projected length of the fingers using monocular vision to infer the state of each finger, and thereby the angle of each finger joint. This information is used to control the motors of a robotic hand and make it assume the desired pose. Experimental results show that the FSP Method is effective. Unlike traditional methods, the FSP Method requires little training, and it offers greater adaptability and a wider recognition range.
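
As a rough illustration of the projection geometry: if the palm is held roughly parallel to the image plane, a finger flexing out of that plane shortens its projected length by the cosine of the flexion angle. The single-joint `flexion_angle` helper below is a simplification I am assuming; the paper handles multiple joints per finger.

```python
# Sketch: recover a single flexion angle from the ratio of a finger's
# projected length to its fully extended length. Hypothetical
# single-joint model, not the paper's full formulation.
import numpy as np

def flexion_angle(projected_len, extended_len):
    """Joint angle in radians: 0 = fully extended, pi/2 = fully bent."""
    ratio = np.clip(projected_len / extended_len, 0.0, 1.0)
    return np.arccos(ratio)
```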


Archive | 2017

Development of GelSight: A High-resolution Tactile Sensor for Measuring Geometry and Force

Wenzhen Yuan; Siyuan Dong; Edward H. Adelson

Tactile sensing is an important perception mode for robots, but existing tactile technologies have multiple limitations. What kind of tactile information robots need, and how to use that information, remain open questions. We believe that a soft sensor surface and high-resolution sensing of geometry are important components of a competent tactile sensor. In this paper, we introduce the development of a vision-based optical tactile sensor, GelSight. Unlike traditional tactile sensors, which measure contact force, GelSight primarily measures geometry, with a high spatial resolution of 1 to 30 microns. The sensor has a contact surface of soft elastomer and directly measures its deformation, both vertical and lateral, which corresponds to the exact object shape and the tension on the contact surface. Contact force and slip can be inferred from the sensor's deformation as well. In particular, we focus on the hardware and software that support GelSight's application on robot hands. This paper reviews the development of GelSight, with an emphasis on the sensing principle and sensor design. We introduce the design of the sensor's optical system; the algorithms for shape, force, and slip measurement; and the hardware design and fabrication of different sensor versions. We also present an experimental evaluation of GelSight's performance on geometry and force measurement. With its high-resolution measurement of shape and contact force, the sensor has successfully assisted multiple robotic tasks, including material perception and recognition and in-hand localization for robot manipulation.


Intelligent Robots and Systems | 2017

Improved GelSight tactile sensor for measuring geometry and slip

Siyuan Dong; Wenzhen Yuan; Edward H. Adelson

Collaboration


Dive into Wenzhen Yuan's collaborations.

Top Co-Authors

Edward H. Adelson, Massachusetts Institute of Technology
Mandayam A. Srinivasan, Massachusetts Institute of Technology
Siyuan Dong, Massachusetts Institute of Technology
Andrew Owens, Massachusetts Institute of Technology
Rui Li, Massachusetts Institute of Technology
Shaoxiong Wang, Massachusetts Institute of Technology
Sergey Levine, University of California
Robert Platt, Northeastern University
Roberto Calandra, Technische Universität Darmstadt
Chenzhuo Zhu, Massachusetts Institute of Technology