A new discrete-continuous algorithm for radial basis function networks construction

Document Type

Article

Date of Original Version

6-28-2013

Abstract

The construction of a radial basis function (RBF) network involves the determination of the model size, hidden nodes, and output weights. Least-squares-based subset selection methods can determine an RBF model's size and parameters simultaneously. Although these methods are robust, they may not achieve optimal results. Alternatively, gradient methods are widely used to optimize all the parameters, but most of these algorithms may converge slowly because they treat hidden nodes and output weights separately and ignore their correlation. In this paper, a new discrete-continuous algorithm is proposed for the construction of an RBF model. First, orthogonal least squares (OLS)-based forward stepwise selection constructs an initial model by selecting model terms one by one from a candidate term pool. Then, a new Levenberg-Marquardt (LM)-based parameter optimization is proposed to further optimize the hidden nodes and output weights in the continuous space. To speed up convergence, the proposed parameter optimization exploits the correlation between the hidden nodes and output weights by treating the output weights as dependent parameters computed with the OLS method. This correlation is also used by the previously proposed continuous forward algorithm (CFA); unlike the CFA, however, the new method optimizes all the parameters simultaneously. In addition, an equivalent recursive sum of squared errors is derived to reduce the computational demand of the first derivatives used in the LM method. A computational complexity analysis confirms that the new method is much more computationally efficient than the CFA. Several numerical examples are presented to illustrate the effectiveness of the proposed method. Further, Friedman statistical tests on 13 classification problems demonstrate that RBF networks built by the new method are highly competitive with some popular classifiers. © 2012 IEEE.
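
To make the two-stage construction described in the abstract concrete, the sketch below (not the authors' code) shows a simplified version of the idea in Python: a greedy forward selection of Gaussian RBF centers from a candidate pool stands in for the OLS-based stepwise selection, and a trust-region least-squares refinement of the centers and width stands in for the LM stage, with the output weights treated as dependent parameters recomputed by linear least squares inside the residual function. The helper names (gaussian_design, forward_select, refine) and the use of scipy.optimize.least_squares are illustrative assumptions, not part of the paper.

import numpy as np
from scipy.optimize import least_squares


def gaussian_design(X, centers, width):
    """N x M design matrix of Gaussian basis responses."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))


def forward_select(X, y, n_terms, width):
    """Discrete stage (sketch): greedily add the candidate center (a training
    point) that most reduces the residual sum of squares at each step."""
    selected = []
    for _ in range(n_terms):
        best_err, best_idx = np.inf, None
        for i in range(len(X)):
            if i in selected:
                continue
            Phi = gaussian_design(X, X[selected + [i]], width)
            w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            err = np.sum((y - Phi @ w) ** 2)
            if err < best_err:
                best_err, best_idx = err, i
        selected.append(best_idx)
    return X[selected].copy()


def refine(X, y, centers, width):
    """Continuous stage (sketch): optimize centers and width jointly with a
    trust-region least-squares solver; the output weights are dependent
    parameters obtained by linear least squares inside the residual."""
    n, d = centers.shape

    def residuals(theta):
        C = theta[:n * d].reshape(n, d)
        s = abs(theta[-1]) + 1e-6
        Phi = gaussian_design(X, C, s)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return y - Phi @ w

    theta0 = np.concatenate([centers.ravel(), [width]])
    sol = least_squares(residuals, theta0)  # LM-style nonlinear least squares
    C = sol.x[:n * d].reshape(n, d)
    s = abs(sol.x[-1]) + 1e-6
    Phi = gaussian_design(X, C, s)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return C, s, w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
    centers = forward_select(X, y, n_terms=8, width=1.0)
    C, s, w = refine(X, y, centers, 1.0)
    pred = gaussian_design(X, C, s) @ w
    print("training RMSE:", np.sqrt(np.mean((y - pred) ** 2)))

Recomputing the output weights inside the residual function is what ties the linear and nonlinear parameters together during refinement; it mirrors, in simplified form, the abstract's point about treating the output weights as dependent parameters rather than optimizing them separately.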

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

Volume

24

Issue

11
