2021 Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO)

An Overview of Hand Gesture Languages for Autonomous UAV Handling


Abstract


Camera-equipped Unmanned Aerial Vehicles (UAVs, or drones) have revolutionized several application domains, with the steadily increasing degree of cognitive autonomy in commercial drones paving the way for unprecedented robotization of daily life. Dynamic cooperation between UAVs and human collaborators is typically necessary during a mission, a fact that has led to various solutions for high-level UAV-operator interaction. Hand gestures are an effective way of facilitating such remote drone handling, giving rise to new gesture languages for visual communication between operators and autonomous UAVs. This paper reviews the available languages that could be used, or have been created, for this purpose, as well as relevant gesture recognition datasets for training machine learning models. Moreover, a novel, generic, base gesture language for handling camera-equipped UAVs is proposed, along with a corresponding, large-scale, publicly available video dataset. The presented language can easily and consistently be extended in the future to more specific scenarios/profiles, tailored to particular application domains and/or additional UAV equipment (e.g., aerial manipulators/arms). Finally, we evaluate: a) the performance of state-of-the-art gesture recognition algorithms on the proposed dataset, in a quantitative and objective manner, and b) the intuitiveness, effectiveness and completeness of the proposed gesture language, in a qualitative and subjective manner.
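
To make the abstract's notion of high-level, gesture-based UAV handling concrete, the following is a minimal Python sketch of how recognized gesture classes might be mapped to drone commands. The gesture names, command set, and confidence threshold here are illustrative assumptions only; they are not the gesture language or dataset labels defined in the paper.

from enum import Enum, auto
from typing import Optional

class UavCommand(Enum):
    """Hypothetical high-level UAV commands (illustrative only)."""
    TAKE_OFF = auto()
    LAND = auto()
    HOVER = auto()
    MOVE_CLOSER = auto()
    MOVE_AWAY = auto()
    START_RECORDING = auto()
    STOP_RECORDING = auto()

# Illustrative mapping from recognized gesture labels to UAV commands.
# Both the gesture names and the command set are assumptions, not the
# paper's actual gesture language.
GESTURE_TO_COMMAND = {
    "palm_up_raise": UavCommand.TAKE_OFF,
    "palm_down_lower": UavCommand.LAND,
    "fist_hold": UavCommand.HOVER,
    "beckon": UavCommand.MOVE_CLOSER,
    "push_away": UavCommand.MOVE_AWAY,
    "frame_fingers": UavCommand.START_RECORDING,
    "cut_gesture": UavCommand.STOP_RECORDING,
}

def dispatch(gesture_label: str, confidence: float,
             threshold: float = 0.8) -> Optional[UavCommand]:
    """Map a recognized gesture to a command, rejecting low-confidence predictions."""
    if confidence < threshold:
        return None  # ignore uncertain recognitions to avoid spurious commands
    return GESTURE_TO_COMMAND.get(gesture_label)

if __name__ == "__main__":
    # Example: a gesture recognizer reports "beckon" with 0.93 confidence.
    print(dispatch("beckon", 0.93))  # -> UavCommand.MOVE_CLOSER

In a deployed system, the dispatch step would typically sit between the video-based gesture recognizer (such as the state-of-the-art algorithms evaluated in the paper) and the flight controller; the confidence gate is one simple way to suppress spurious commands from uncertain predictions.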

Pages 1-7
DOI 10.1109/AIRPHARO52252.2021.9571027
Language English
Journal 2021 Aerial Robotic Systems Physically Interacting with the Environment (AIRPHARO)
