UGV direction control by human arm gesture recognition via deterministic learning

Document Type

Conference Proceeding

Date of Original Version

7-1-2019

Abstract

In this paper, we present a novel method for controlling an unmanned ground vehicle (UGV) using a new machine learning technique, called deterministic learning [1], to learn and recognize four specifically designed arm gestures, which represent the four corresponding moving directions (i.e., left, right, up, and down) of the controlled UGV. The Microsoft Kinect sensor is employed to collect human body skeleton data, from which four specifically designed features are extracted for neural network training. The discrete-time deterministic learning algorithm is then utilized to train radial basis function neural networks (RBFNNs). The dynamics of the human arm waving motion is guaranteed to be accurately identified, represented, and stored as an RBFNN model with converged constant NN weights. In the testing phase, a set of estimators is built from the database established in the learning phase to conduct rapid real-time recognition of incoming gesture commands. The smallest-error principle is used to decode the human intention, and the decoded result is then sent to the UGV over TCP/IP to control its moving direction. A fully integrated graphical user interface (GUI) has been developed in MATLAB to demonstrate the effectiveness of the proposed approach and visualize the experimental results.
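The recognition step described above reduces to comparing how well each stored constant-weight RBFNN model explains a newly observed feature sequence and picking the gesture with the smallest error, after which the decoded direction is transmitted over TCP/IP. The sketch below is a minimal, hypothetical illustration of that smallest-error decision and command dispatch, not the paper's implementation (which is MATLAB-based): the feature dimension, RBF centers and widths, stored weights, host address, and command format are all placeholder assumptions, and the paper's dynamic estimators are simplified here to plain one-step prediction errors.

```python
import socket
import numpy as np

def rbf_vector(x, centers, width=0.5):
    """Gaussian RBF regressor vector S(x) over a fixed grid of centers (assumed setup)."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def residual_errors(features, models, centers):
    """Average one-step prediction error of each stored RBFNN model over a test sequence.

    features : (T, n) array of gesture features extracted from Kinect skeleton data
    models   : dict mapping direction name -> (N, n) constant weight matrix W,
               where the learned dynamics is approximated by W^T S(x)
    """
    errors = {}
    for name, W in models.items():
        pred = np.array([W.T @ rbf_vector(x, centers) for x in features[:-1]])
        errors[name] = float(np.mean(np.linalg.norm(features[1:] - pred, axis=1)))
    return errors

def recognize(features, models, centers):
    """Smallest-error principle: choose the gesture whose stored model fits best."""
    errors = residual_errors(features, models, centers)
    return min(errors, key=errors.get)

def send_command(direction, host="192.168.1.10", port=9000):
    """Send the decoded direction to the UGV over TCP/IP (address and format are placeholders)."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(direction.encode("ascii"))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_features, n_centers, T = 4, 32, 50          # illustrative sizes only
    centers = rng.uniform(-1, 1, size=(n_centers, n_features))
    models = {d: rng.normal(size=(n_centers, n_features))
              for d in ("left", "right", "up", "down")}
    test_sequence = rng.uniform(-1, 1, size=(T, n_features))
    direction = recognize(test_sequence, models, centers)
    print("decoded direction:", direction)
    # send_command(direction)  # uncomment with a real UGV endpoint
```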

Publication Title

Chinese Control Conference, CCC

Volume

2019-July
