Human Robot Interaction on Gesture Control Drone: Methods of Gesture Action Interaction


Date

2018

Abstract

Today, the interaction between robots and humans is mostly based on remote controllers. However, this interaction could be more natural, much like humans interact with each other through speech, body movements, facial expressions, and so on. We propose gestures and body language as an alternative way to interact with robots, particularly Unmanned Aerial Vehicles (UAVs), also known as drones. In this dissertation, we developed action recognition methods for interaction with drones. Specifically, we developed approaches to recognize human gestures for communication with drones. Automatic detection and classification of dynamic actions in a real-world system intended for human-robot interaction is challenging because: 1) there is large variation in how people perform actions, making detection and classification difficult; 2) the system must work online in order to avoid a noticeable delay in the interaction between the human and the drone. In this work, we address these challenges by combining a real-time skeleton detection library with deep learning techniques. Our methods classify dynamic actions or gestures from skeleton data.
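
The abstract describes classifying gestures from skeleton data produced by a real-time pose library. As a minimal, hypothetical sketch of that idea (not the dissertation's actual model), the snippet below feeds a sequence of 2D skeleton keypoints per frame to a small LSTM classifier in PyTorch; the keypoint count, layer sizes, and gesture classes are illustrative assumptions.

```python
# Illustrative sketch only: an LSTM that classifies a gesture from a sequence
# of skeleton keypoints. The keypoint count (18), hidden size, and gesture
# classes are assumptions, not values from the dissertation.
import torch
import torch.nn as nn

NUM_KEYPOINTS = 18               # e.g. an OpenPose-style 2D skeleton
FEATURES = NUM_KEYPOINTS * 2     # (x, y) per keypoint, flattened per frame
GESTURES = ["takeoff", "land", "move_left", "move_right", "hover"]  # hypothetical

class SkeletonGestureLSTM(nn.Module):
    def __init__(self, features=FEATURES, hidden=128, num_classes=len(GESTURES)):
        super().__init__()
        self.lstm = nn.LSTM(features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        # x: (batch, time, features) -- one flattened skeleton per frame
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # classify from the last time step

if __name__ == "__main__":
    model = SkeletonGestureLSTM()
    clip = torch.randn(1, 30, FEATURES)  # a 30-frame skeleton sequence
    logits = model(clip)
    print(GESTURES[logits.argmax(dim=1).item()])
```

In practice, such a classifier would run online on a sliding window of recent frames so the drone can react without a noticeable delay, which is the second challenge the abstract highlights.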
