
Publication


Featured research published by Hesheng Wang.


IEEE Transactions on Robotics | 2006

Uncalibrated visual servoing of robots using a depth-independent interaction matrix

Yun-Hui Liu; Hesheng Wang; Chengyou Wang; Kin Kwan Lam

This paper presents a new adaptive controller for image-based dynamic control of a robot manipulator using a fixed camera whose intrinsic and extrinsic parameters are not known. To map the visual signals onto the joints of the robot manipulator, this paper proposes a depth-independent interaction matrix, which differs from the traditional interaction matrix in that it does not depend on the depths of the feature points. Using the depth-independent interaction matrix makes the unknown camera parameters appear linearly in the closed-loop dynamics, so a new algorithm can be developed to estimate their values on-line. This adaptive algorithm combines the Slotine-Li method with on-line minimization of the errors between the real and estimated projections of the feature points on the image plane. Based on the nonlinear robot dynamics, we prove asymptotic convergence of the image errors to zero by the Lyapunov theory. Experiments have been conducted to verify the performance of the proposed controller, and the results demonstrate good convergence of the image errors.
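The core trick can be illustrated in a few lines. For a 1-D pinhole camera the projection u = fx·x/z + cx depends nonlinearly on the depth z, but multiplying through by z gives z·u = fx·x + cx·z, which is linear in the unknown intrinsics. A minimal sketch (ours, not the paper's full dynamic controller; the parameter values and the plain LMS-style update are illustrative stand-ins for the Slotine-Li adaptive law):

```python
import numpy as np

# Sketch only: 1-D pinhole model u = fx*x/z + cx. Multiplying by the depth z
# gives z*u = fx*x + cx*z, linear in the unknowns (fx, cx), so a gradient
# update on the projection error converges. The paper embeds an analogous
# linear parameterization in the closed-loop robot dynamics.
rng = np.random.default_rng(0)
true_fx, true_cx = 500.0, 320.0        # ground-truth intrinsics (assumed values)
theta = np.array([300.0, 200.0])       # initial estimate [fx, cx]
gain = 0.01                            # adaptation gain (illustrative)

for _ in range(5000):
    x = rng.uniform(-1.0, 1.0)         # feature coordinate in the camera frame
    z = rng.uniform(2.0, 4.0)          # true depth (never needed by the update)
    u = true_fx * x / z + true_cx      # measured image coordinate
    phi = np.array([x, z])             # regressor of z*u = fx*x + cx*z
    err = phi @ theta - z * u          # depth-independent projection error
    theta -= gain * err * phi          # gradient step toward the true values

print(theta)
```

Because the parameterization is linear and the feature motion is persistently exciting, the estimate converges to the true intrinsics without ever measuring depth.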


IEEE Transactions on Robotics | 2008

Adaptive Visual Servoing Using Point and Line Features With an Uncalibrated Eye-in-Hand Camera

Hesheng Wang; Yun-Hui Liu; Dongxiang Zhou

This paper presents a novel approach for image-based visual servoing of a robot manipulator with an eye-in-hand camera when the camera parameters are not calibrated and the 3-D coordinates of the features are not known. Both point and line features are considered. This paper extends the concept of the depth-independent interaction (or image Jacobian) matrix, developed in earlier work for visual servoing using point features and fixed cameras, to the problem of eye-in-hand cameras with point and line features. By using the depth-independent interaction matrix, it is possible to linearly parameterize the closed-loop dynamics of the system by the unknown camera parameters and the unknown coordinates of the features. A new algorithm is developed to estimate the unknown parameters online by combining the Slotine-Li method with the idea of structure from motion in computer vision. By minimizing the errors between the real and estimated projections of the features on multiple images captured during the motion of the robot, this adaptive algorithm guarantees the convergence of the estimated parameters to the real values up to a scale. On the basis of the nonlinear robot dynamics, we prove asymptotic convergence of the image errors by the Lyapunov theory. Experiments have been conducted to demonstrate the performance of the proposed controller.


IEEE Transactions on Robotics | 2007

Dynamic Visual Tracking for Manipulators Using an Uncalibrated Fixed Camera

Hesheng Wang; Yun-Hui Liu; Dongxiang Zhou

This paper presents a new controller for controlling a number of feature points on a robot manipulator to trace desired trajectories specified on the image plane of a fixed camera. It is assumed that the intrinsic and extrinsic parameters of the camera are not calibrated. A new adaptive algorithm is developed to estimate the unknown parameters online, based on three original ideas. First, we use the pseudoinverse of the depth-independent interaction matrix to map the image errors onto the joint space of the manipulator. By eliminating the depths in the interaction matrix, we can linearly parameterize the closed-loop dynamics of the manipulator. Second, to guarantee the existence of the pseudoinverse, the adaptive algorithm introduces a potential force to drive the estimated parameters away from the values that result in a singular Jacobian matrix. Third, to ensure that the estimated parameters converge to their true values up to a scale, we combine the Slotine-Li method with an online algorithm for minimizing the error between the estimated projections and the real image coordinates of the feature points. We have proved asymptotic convergence of the image errors to zero by the Lyapunov theory based on the nonlinear robot dynamics. Experiments have been carried out to verify the performance of the proposed controller.
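The second idea, repelling the estimate from the singular set with a potential force, can be sketched in one dimension. This toy (our construction; the matrix J(theta), the gains, and the potential are assumptions, not the paper's exact law) shows an error gradient pulling the estimate toward a singular configuration while the repulsive potential keeps the matrix invertible:

```python
# Toy 1-D sketch of the "potential force" idea (illustrative, not the
# paper's exact adaptation law). A hypothetical estimated interaction
# matrix J(theta) = [[1, theta], [theta, 1]] has det = 1 - theta**2 and
# becomes singular at |theta| = 1. The potential V = k_pot / det**2 blows
# up on the singular set, so its gradient repels the estimate from it.

def adapt(theta, err_grad, eta=0.01, k_pot=1e-4):
    d = 1.0 - theta**2                    # det of the estimated 2x2 matrix
    dV = 4.0 * k_pot * theta / d**3       # gradient of V = k_pot / d**2
    return theta - eta * (err_grad + dV)  # error descent + repulsive force

theta = 0.5
for _ in range(2000):
    theta = adapt(theta, err_grad=-1.0)   # constant pull toward theta = +1

print(theta)                              # settles strictly below 1
```

Even under a persistent pull toward the singularity, the update equilibrates where the potential gradient balances the error gradient, so the pseudoinverse always exists.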


IEEE/ASME Transactions on Mechatronics | 2011

A New Approach to Dynamic Eye-in-Hand Visual Tracking Using Nonlinear Observers

Hesheng Wang; Yun-Hui Liu; Weidong Chen; Zhongli Wang

This paper presents a new controller for locking a moving object in 3-D space at a particular position (for example, the center) on the image plane of a camera mounted on a robot by actively moving the camera. The controller is designed to cope with both the highly nonlinear robot dynamics and unknown motion of the object. Based on the fact that the unknown position of the moving object appears linearly in the closed-loop dynamics of the system if the depth-independent image Jacobian is used, we developed a nonlinear observer to estimate the 3-D motion of the object online. With a full consideration of dynamic responses of the robot manipulator, we employ the Lyapunov method to prove asymptotic convergence of the image errors. Experimental results are used to demonstrate the performance of the proposed approach.
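The observer idea can be illustrated with the simplest case (our simplification: an alpha-beta / Luenberger observer on a constant-velocity target model with assumed gains; the paper's nonlinear observer plays this role for the 3-D object motion, built on the depth-independent Jacobian and the full robot dynamics):

```python
# Minimal observer sketch: only the object's position is measured, yet the
# innovation (measured minus estimated position) drives both estimates, so
# the unknown velocity is recovered without differentiating the measurement.
dt = 0.01
x, v = 0.0, 0.3            # true object position and (unknown) velocity
x_hat, v_hat = 0.0, 0.0    # observer estimates
l1, l2 = 2.0, 20.0         # observer gains (poles of s**2 + l1*s + l2)

for _ in range(2000):
    x += v * dt                      # object moves; only x is observed
    e = x - x_hat                    # innovation
    x_hat += (v_hat + l1 * e) * dt   # position estimate driven by innovation
    v_hat += l2 * e * dt             # velocity estimate integrates innovation

print(x_hat, v_hat)
```

The gains place the error poles of s² + l1·s + l2 in the left half-plane, so both the position and velocity estimation errors decay to zero.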


IEEE Transactions on Control Systems Technology | 2010

Uncalibrated Visual Tracking Control Without Visual Velocity

Hesheng Wang; Yun-Hui Liu; Weidong Chen

This paper presents a new adaptive controller for image-based tracking of a robot manipulator without using visual velocity when the intrinsic and extrinsic parameters of the camera are not calibrated. Most existing controllers require the measurement of the visual velocity, which is in general subject to significant noise due to the low sampling rate of the vision loop. To avoid the performance degradation caused by measurement errors of the visual velocity, we propose a new image-based tracking controller built on an estimator of the visual velocity. With full consideration of the dynamic responses of the robot manipulator, we prove the convergence of the image errors of the trajectory to zero and the convergence of the estimated parameters to the real values up to a scale by the Lyapunov method. Experiments have been conducted to demonstrate good convergence of the trajectory errors under the proposed method.


IEEE/RSJ International Conference on Intelligent Robots and Systems | 2013

Visual servo control of cable-driven soft robotic manipulator

Hesheng Wang; Weidong Chen; Xiaojin Yu; Tao Deng; Xiaozhou Wang; Rolf Pfeifer

Aiming at dexterous and safe operation in unstructured environments, this paper presents a cable-driven soft robotic manipulator. Owing to the soft material it is made of and its nearly infinite degrees of freedom, the soft manipulator offers higher safety and dexterity than a traditional rigid-link manipulator, making it suitable for tasks in narrow, confined, and unstructured environments. Despite these advantages, precise position control of a soft manipulator is difficult. To address this problem, a kinematic model based on the piecewise-constant-curvature hypothesis is proposed: by constructing three spaces and two mappings between them, the relationship between the length variables of the four cables and the position and orientation of the manipulator's end-effector is obtained. A depth-independent image Jacobian matrix is then introduced and an image-based visual servo controller is presented. With an adaptive algorithm, the controller estimates the unknown position of the feature point online, and Lyapunov theory is used to prove the stability of the proposed controller. Finally, experiments demonstrate the validity of the kinematic model and the adaptive visual servo controller.
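The constant-curvature hypothesis gives a closed-form map for each segment: with curvature kappa, bending-plane angle phi, and arc length L, the segment is a circular arc of radius 1/kappa bent by angle kappa·L. A single-segment sketch (function and variable names are ours for illustration; the cable-to-curvature mapping is omitted):

```python
import numpy as np

def pcc_tip(kappa, phi, length):
    """Tip position of one constant-curvature segment: an arc of radius
    1/kappa bent by angle kappa*length, rotated by phi about the base axis.
    (Single-segment sketch; a multi-section arm chains such maps.)"""
    if abs(kappa) < 1e-9:                       # straight-segment limit
        return np.array([0.0, 0.0, length])
    r = 1.0 / kappa                             # bending radius
    theta = kappa * length                      # total bending angle
    return np.array([r * (1.0 - np.cos(theta)) * np.cos(phi),
                     r * (1.0 - np.cos(theta)) * np.sin(phi),
                     r * np.sin(theta)])

# A quarter-circle bend (theta = pi/2) of unit arc length:
print(pcc_tip(np.pi / 2, 0.0, 1.0))             # x = z = 2/pi, y = 0
```

Note the explicit straight-segment limit: as kappa → 0 the arc formula is 0/0, so the limit (0, 0, L) must be handled separately for numerical robustness.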


IEEE Transactions on Control Systems Technology | 2015

Adaptive Image-Based Trajectory Tracking Control of Wheeled Mobile Robots With an Uncalibrated Fixed Camera

Xinwu Liang; Hesheng Wang; Weidong Chen; Dejun Guo; Tao Liu

In this paper, the uncalibrated image-based trajectory tracking control problem of wheeled mobile robots is studied. The motion of the wheeled mobile robot is observed by an uncalibrated fixed camera on the ceiling. Unlike traditional vision-based control strategies for wheeled mobile robots in the fixed-camera configuration, the camera image plane is not required to be parallel to the motion plane of the robot, and the camera can be placed at a general position. To guarantee that the wheeled mobile robot efficiently tracks its desired trajectory, specified as the desired image trajectory of a feature point on the forward axis of the robot, we propose a new adaptive image-based trajectory tracking control approach that requires neither the exact camera intrinsic and extrinsic parameters nor the position parameter of the feature point. To eliminate the nonlinear dependence on the unknown parameters in the closed-loop system, a depth-independent image Jacobian matrix framework for wheeled mobile robots is developed so that the unknown parameters in the closed-loop system can be linearly parameterized. In this way, adaptive laws can be designed to estimate the unknown parameters online, and the depth of the feature point is allowed to be time varying. A Lyapunov stability analysis shows asymptotic convergence of the image position and velocity tracking errors. Simulation results based on a two-wheeled mobile robot illustrate the performance of the proposed approach, and experimental results on a real wheeled mobile robot further validate it.
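The role of the feature point on the forward axis can be seen in a standard kinematic sketch (our simplification using the well-known unicycle look-ahead point; the paper's controller additionally works in uncalibrated image space with adaptive laws). A point at distance d ahead of the wheel axle has an invertible input map, so simple feedback makes it track a reference trajectory:

```python
import numpy as np

# Unicycle with a look-ahead point p = (x + d*cos(th), y + d*sin(th)).
# Its velocity is p_dot = B(th) @ [v, w], with B invertible for d != 0,
# so u = B^-1 (p_dot_ref + k*(p_ref - p)) gives error dynamics e_dot = -k*e.
d, k, dt = 0.1, 2.0, 0.01
x, y, th = 0.0, 0.0, 0.0                                 # robot pose

for i in range(4000):
    t = i * dt
    p_ref = np.array([np.cos(t), np.sin(t)])             # reference: unit circle
    pd_ref = np.array([-np.sin(t), np.cos(t)])           # reference velocity
    p = np.array([x + d * np.cos(th), y + d * np.sin(th)])
    B = np.array([[np.cos(th), -d * np.sin(th)],
                  [np.sin(th),  d * np.cos(th)]])        # invertible for d != 0
    v, w = np.linalg.solve(B, pd_ref + k * (p_ref - p))  # forward, turn rates
    x += v * np.cos(th) * dt                             # integrate unicycle
    y += v * np.sin(th) * dt
    th += w * dt
```

Choosing the point off the axle (d ≠ 0) is what makes B invertible; the axle center itself cannot move sideways, which is the nonholonomic constraint the look-ahead trick sidesteps.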


IEEE/ASME Transactions on Mechatronics | 2017

Visual Servoing of Soft Robot Manipulator in Constrained Environments With an Adaptive Controller

Hesheng Wang; Bohan Yang; Yuting Liu; Weidong Chen; Xinwu Liang; Rolf Pfeifer

It is unavoidable for a soft manipulator to interact with its environment during some tasks. These interactions may affect the soft manipulator and make its kinematic model differ from the one in free space; for example, the manipulator's effective length and the target positions might change. In order to apply the soft manipulator in constrained environments, this paper proposes an adaptive visual servo controller based on piecewise-constant-curvature kinematics that does not require the true values of the manipulator's length or the target positions. Experimental results in free space, a constrained environment, and a gravity-influenced environment demonstrate the convergence of the image errors under the proposed controller.


IEEE International Conference on Robotics and Automation | 2006

Uncalibrated visual tracking control without visual velocity

Hesheng Wang; Yun-Hui Liu

This paper presents a new adaptive controller for dynamic tracking of a robot manipulator without visual velocity when the intrinsic and extrinsic parameters of the camera are not calibrated. Most previous controllers require the measurement of the visual velocity or differentiation of the visual position; the measurement of the visual velocity is in general subject to significant noise due to the low sampling rate of the vision loop. To avoid the performance degradation caused by measurement errors of the visual velocity, the controller developed here requires only an estimated visual velocity. With full consideration of the dynamic responses of the robot manipulator, we employ the Lyapunov method to prove the convergence of the image errors of the trajectory to zero and the convergence of the estimated parameters to the real values up to a scale. Experiments have been conducted to demonstrate good convergence of the trajectory errors of the robot under the proposed method.


IEEE Transactions on Systems, Man, and Cybernetics | 2016

Adaptive Task-Space Cooperative Tracking Control of Networked Robotic Manipulators Without Task-Space Velocity Measurements

Xinwu Liang; Hesheng Wang; Yun-Hui Liu; Weidong Chen; Guoqiang Hu; Jie Zhao

In this paper, the task-space cooperative tracking control problem of networked robotic manipulators without task-space velocity measurements is addressed. To cope with the absence of task-space velocity measurements, a novel task-space position observer is designed that updates the estimated task-space position and simultaneously provides an estimated task-space velocity. Based on this observer, an adaptive cooperative tracking controller without task-space velocity measurements is presented by introducing new estimated task-space reference velocities and accelerations. Furthermore, adaptive laws are provided to cope with uncertain kinematics and dynamics, and a rigorous stability analysis shows asymptotic convergence of the task-space tracking and synchronization errors in the presence of communication delays under strongly connected directed graphs. Simulation results demonstrate the performance of the proposed approach.
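The tracking-plus-synchronization structure can be sketched with first-order kinematic agents (a toy of ours: no manipulator dynamics, no delays, no observer; the graph, gains, and reference are assumed values). Each agent combines a tracking term toward the common reference with a coupling term toward its neighbors on a strongly connected digraph:

```python
import numpy as np

# Toy task-space synchronization: q_i_dot = -k*(q_i - q_d) + sum_j a_ij*(q_j - q_i).
# The coupling term is L-consensus over the digraph 1 -> 2 -> 3 -> 1; the
# tracking term pulls everyone to the reference q_d, and the two cooperate.
A = np.array([[0, 1, 0],          # adjacency of a strongly connected digraph
              [0, 0, 1],
              [1, 0, 0]], float)
q = np.array([1.0, -2.0, 0.5])    # agents' task-space positions
q_d, k, dt = 3.0, 1.0, 0.01       # common reference, tracking gain, step

for _ in range(3000):
    sync = A @ q - A.sum(axis=1) * q       # sum_j a_ij * (q_j - q_i)
    q += dt * (-k * (q - q_d) + sync)      # tracking + synchronization

print(q)                                   # all agents near the reference 3.0
```

Because the graph Laplacian annihilates the consensus direction, the synchronized state q_i = q_d is an equilibrium, and the tracking gain makes it attractive.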

Collaboration


Top co-authors of Hesheng Wang:

Weidong Chen (Shanghai Jiao Tong University)
Yun-Hui Liu (The Chinese University of Hong Kong)
Xinwu Liang (Shanghai Jiao Tong University)
Jingchuan Wang (Shanghai Jiao Tong University)
Zhe Liu (Shanghai Jiao Tong University)
Dongxiang Zhou (National University of Defense Technology)
Jun Guo Lu (Shanghai Jiao Tong University)
Xiaozhou Wang (Shanghai Jiao Tong University)