Deterministic learning from sampling data
In this paper, based on the deterministic learning mechanism, we present an alternative systematic scheme for dynamics identification from sampling data sequences. The proposed scheme belongs to a dynamical machine learning framework grounded in Lyapunov stability theory rather than optimization-based estimation approaches. Given a sampling data sequence collected from an unknown deterministic nonlinear dynamical system, we show that the inherent dynamics of the sequence can be locally and accurately captured and represented by a converged constant neural network. This accurate identification result follows from the exponential stability of a class of linear time-varying (LTV) error systems that frequently arise in adaptive control and identification. However, few existing results address the exponential stability problem for the discrete-time LTV case. To this end, using the small gain theorem of input-to-state stability (ISS), we provide a rigorous proof establishing exponential stability of the LTV system under the persistent excitation (PE) condition. Based on these exponential stability results, the exponential convergence of the neural weights to their optimal values is analytically guaranteed, which yields accurate identification of the system dynamics. The significance of the paper is a preliminary exploration of the essential problem of machine learning of dynamical systems, carried out by transferring tools and methodologies from dynamical system stability and control theory. Numerical simulation results validate the effectiveness of the proposed scheme in comparison with the aforementioned optimization-based approaches.
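The identification scheme described above can be illustrated with a minimal sketch: a sampled trajectory of an unknown nonlinear map is fed to a localized RBF network whose weights are adapted by a gradient-type (Lyapunov-motivated) law, with the recurrent orbit supplying the persistent excitation of the regressors along the trajectory. All names, gains, and the example map below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def rbf_features(x, centers, width):
    """Gaussian RBF regressor vector S(x)."""
    return np.exp(-((x - centers) ** 2) / (2.0 * width ** 2))

# Unknown discrete-time dynamics x_{k+1} = f(x_k); used only to
# generate the sampling data sequence (here: a chaotic logistic map,
# chosen as an illustrative recurrent orbit).
r = 3.6
f = lambda x: r * x * (1.0 - x)

N = 2000
x = np.empty(N)
x[0] = 0.3
for k in range(N - 1):
    x[k + 1] = f(x[k])

centers = np.linspace(0.0, 1.0, 51)   # RBF centers covering the state space
width = 0.05
W = np.zeros_like(centers)            # neural weights to be learned
gamma = 0.5                           # adaptation gain (illustrative)

# Gradient-type adaptive law: drive the one-step prediction error
# e_k = W^T S(x_k) - x_{k+1} toward zero (normalized for stability).
for _ in range(30):                   # repeated passes over the sequence
    for k in range(N - 1):
        S = rbf_features(x[k], centers, width)
        e = W @ S - x[k + 1]
        W -= gamma * e * S / (1.0 + S @ S)

# The converged constant weights W represent f locally along the orbit.
pred_err = np.mean([abs(W @ rbf_features(x[k], centers, width) - x[k + 1])
                    for k in range(N - 1)])
print(f"mean one-step prediction error: {pred_err:.4f}")
```

Note that only the weights of RBF nodes whose centers lie near the visited orbit converge meaningfully; this reflects the locally accurate nature of the identification claimed in the abstract.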
Wu, Weiming, Cong Wang, and Chengzhi Yuan. "Deterministic learning from sampling data." Neurocomputing 358 (2019): 456-466. doi:10.1016/j.neucom.2019.05.044.