IEEE/ASME Transactions on Mechatronics | 2019

A Human-Following Mobile Robot Providing Natural and Universal Interfaces for Control With Wireless Electronic Devices

Abstract


This article presents the development of a mobile robot that uses a human-following algorithm based on human path prediction and provides natural gesture interaction for an operator to control wireless electronic devices using a low-cost red-green-blue-depth (RGB-D) sensor. The experimental setup consists of a skid-steered mobile robot, a Kinect sensor, a laptop, a red-green-blue (RGB) camera, and two lamps. OpenNI middleware is used to process the depth data from the Kinect sensor, and OpenCV is used to process data from the RGB camera. The human-following control system consists of two feedback control loops for linear and rotational motion, respectively. A lead-lag controller and a proportional-integral-derivative (PID) controller are developed for the linear and rotational motion control loops, respectively. The system's response exhibits small delays (0.3 s for linear motion and 0.2 s for rotational motion); however, these delays are acceptable since they do not push the tracking distance or angle outside the desired range (±0.05 m and ±10° of the reference input). A new human-position prediction algorithm based on human orientation is proposed for human following. Experimental results show that the tracking algorithm reduces the distance and angular errors by 40% and 50%, respectively. Four gestures are designed for the operator to control the robot, and gesture recognition success rates exceed 90% within the detectable range of the Kinect sensor.
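As a rough illustration of the two-loop structure described in the abstract (not the authors' implementation), the sketch below pairs a lead-lag compensator for the distance (linear) loop with a PID controller for the bearing (rotational) loop. All gains, the control period, and the target following distance are assumed values chosen only to make the example runnable.

```python
# Illustrative sketch of a two-loop human-following controller:
# a lead-lag compensator regulates the following distance (linear velocity)
# and a PID controller regulates the bearing angle (rotational velocity).
# Gains, setpoints, and sample time are assumptions, not the paper's values.

class LeadLag:
    """Discrete lead-lag compensator from K*(s+a)/(s+b) via Tustin discretization."""
    def __init__(self, K, a, b, dt):
        c = 2.0 / dt  # bilinear-transform constant
        self.b0 = K * (c + a) / (c + b)
        self.b1 = K * (a - c) / (c + b)
        self.a1 = (b - c) / (c + b)
        self.prev_e = 0.0
        self.prev_u = 0.0

    def update(self, e):
        u = self.b0 * e + self.b1 * self.prev_e - self.a1 * self.prev_u
        self.prev_e, self.prev_u = e, u
        return u


class PID:
    """Simple discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_e = 0.0

    def update(self, e):
        self.integral += e * self.dt
        deriv = (e - self.prev_e) / self.dt
        self.prev_e = e
        return self.kp * e + self.ki * self.integral + self.kd * deriv


DT = 0.05                      # assumed control period [s]
TARGET_DISTANCE = 1.5          # assumed desired following distance [m]
linear_ctrl = LeadLag(K=1.2, a=0.5, b=5.0, dt=DT)   # assumed compensator parameters
angular_ctrl = PID(kp=2.0, ki=0.1, kd=0.05, dt=DT)  # assumed PID gains


def control_step(measured_distance, measured_angle):
    """One control cycle: returns (linear, angular) velocity commands."""
    v = linear_ctrl.update(measured_distance - TARGET_DISTANCE)  # close distance error
    w = angular_ctrl.update(0.0 - measured_angle)                # drive bearing error to zero
    return v, w
```

In practice the measured distance and angle would come from the skeleton tracking provided by the RGB-D sensor; this sketch only shows how the two error signals feed their respective controllers each cycle.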

Volume 24
Pages 2377-2385
DOI 10.1109/TMECH.2019.2936395
Language English
Journal IEEE/ASME Transactions on Mechatronics

Full Text