UGV direction control by human arm gesture recognition via deterministic learning
In this paper, we present a novel method for controlling an unmanned ground vehicle (UGV) using a new machine learning technique, called deterministic learning, to learn and recognize four specifically designed arm gestures, which represent the four corresponding moving directions (i.e., left, right, up, and down) of the controlled UGV. The Microsoft Kinect sensor is employed to collect human body skeleton data, from which four specifically designed features are extracted for neural network training. A discrete-time deterministic learning algorithm is then utilized to train radial basis function neural networks (RBFNNs). The dynamics of the human arm-waving motion is guaranteed to be accurately identified, represented, and stored as an RBFNN model with converged constant NN weights. In the testing phase, a set of estimators is built from the database established in the learning phase to conduct rapid, real-time recognition of incoming gesture commands. The smallest-error principle is used to decode the human intention; the decoded result is then sent to the UGV through TCP/IP to control its moving direction. A fully integrated graphical user interface (GUI) has been developed in Matlab to demonstrate the effectiveness of the proposed approach and to visualize the experimental results.
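The learn-then-recognize pipeline described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the RBF width, learning gain, grid of centers, and the synthetic circular "gesture" trajectories below are all assumptions standing in for the paper's Kinect-derived features and tuning.

```python
import numpy as np

def rbf(x, centers, width=0.3):
    """Gaussian RBF regressor vector S(x) over a fixed grid of centers."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_deterministic_learning(traj, centers, gamma=0.2, epochs=100):
    """Identify the one-step gesture dynamics x[k+1] - x[k] ~ W^T S(x[k])
    along a (roughly periodic) feature trajectory with a discrete-time
    gradient update; W converges to constant weights near the trajectory."""
    W = np.zeros((len(centers), traj.shape[1]))
    for _ in range(epochs):
        for k in range(len(traj) - 1):
            s = rbf(traj[k], centers)
            err = W.T @ s - (traj[k + 1] - traj[k])  # prediction error
            W -= gamma * np.outer(s, err)            # weight adaptation
    return W

def recognize(test_traj, models, centers):
    """Smallest-error principle: pick the stored model whose predicted
    dynamics best matches the observed test trajectory."""
    avg_err = {
        label: np.mean([
            np.linalg.norm(W.T @ rbf(test_traj[k], centers)
                           - (test_traj[k + 1] - test_traj[k]))
            for k in range(len(test_traj) - 1)])
        for label, W in models.items()
    }
    return min(avg_err, key=avg_err.get)

# Demo with two synthetic "gestures": the same circular feature
# trajectory traversed in opposite directions, standing in for two of
# the paper's four arm-gesture commands.
grid = np.linspace(-1.2, 1.2, 9)
centers = np.array([[cx, cy] for cx in grid for cy in grid])
t = np.arange(200)
gesture = lambda w: np.column_stack((np.cos(w * t), np.sin(w * t)))
models = {"left": train_deterministic_learning(gesture(0.2), centers),
          "right": train_deterministic_learning(gesture(-0.2), centers)}
command = recognize(gesture(0.2)[50:150], models, centers)
```

The opposite-direction circles make a useful sanity check: both gestures visit the same states, so only the learned dynamics (the direction of motion), not the states themselves, can separate them, which is exactly what the stored RBFNN weight models capture.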
Chinese Control Conference, CCC
Chen, Xiaotian, Xiaonan Dong, Wei Zeng, Chengzhi Yuan, and Paolo Stegagno. "UGV direction control by human arm gesture recognition via deterministic learning." In 2019 Chinese Control Conference (CCC), 7688-7693, 2019. doi:10.23919/ChiCC.2019.8865788.