You Guoqiao, Liu Manxi, Ke Yilong
The radial basis function neural network (RBFNN) is a method for interpolation and classification prediction. In this article, we propose an improved RBFNN algorithm based on the singular value decomposition (SVD), which greatly simplifies the network structure. In particular, the proposed algorithm automatically selects the core neurons in the hidden layer and deletes redundant ones, thereby saving CPU memory and computational cost. Meanwhile, we use $K$-fold cross validation to determine the radial parameter $\varepsilon$ of the RBF, so as to maintain accuracy. More importantly, there is no need to load all the sample data into memory at once: based on the approximate SVD algorithm proposed by Halko in [2], the sample data are loaded and processed row by row. Numerical experiments show that, compared with the traditional RBFNN, the proposed algorithm greatly improves computational efficiency and simplifies the network structure without losing accuracy.
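To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of how an SVD can prune an RBF model: every training sample is first treated as a candidate hidden neuron with a Gaussian basis function, the resulting kernel matrix is factored, and singular values below a relative tolerance are discarded, which plays the role of deleting redundant neurons. The function names, the tolerance `tol`, and the fixed shape parameter `eps` are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def rbf_kernel(X, centers, eps):
    # Gaussian RBF: phi(r) = exp(-(eps * r)^2), pairwise over rows.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(eps * d) ** 2)

def fit_rbf_svd(X, y, eps, tol=1e-8):
    # Full kernel matrix: every sample is a candidate hidden neuron.
    Phi = rbf_kernel(X, X, eps)
    U, s, Vt = np.linalg.svd(Phi, full_matrices=False)
    # Keep only singular values above tol * s[0]; this rank
    # truncation stands in for discarding redundant neurons.
    k = int(np.sum(s > tol * s[0]))
    # Least-squares weights via the truncated pseudoinverse.
    w = Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])
    return w, k

def predict_rbf(Xnew, X, w, eps):
    return rbf_kernel(Xnew, X, eps) @ w

# Usage: interpolate a smooth 1-D function on [0, 1].
X = np.linspace(0.0, 1.0, 40)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0])
w, rank = fit_rbf_svd(X, y, eps=3.0)
yhat = predict_rbf(X, X, w, eps=3.0)
```

In this sketch `eps` is fixed by hand; in the paper it would instead be chosen by $K$-fold cross validation, and the SVD would be replaced by Halko's randomized approximation so the kernel matrix never has to reside in memory in full.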