ARTICLE
Algorithms, Vol. 16, Issue 10 (2023)

Separable Gaussian Neural Networks: Structure, Analysis, and Function Approximations

Siyuan Xing and Jian-Qiao Sun    

Abstract

The Gaussian radial basis function neural network (GRBFNN) has been a popular choice for interpolation and classification. However, it is computationally intensive when the dimension of the input vector is high. To address this issue, we propose a new feedforward network, the separable Gaussian neural network (SGNN), which takes advantage of the separable property of Gaussian radial basis functions by splitting the input data into multiple columns and sequentially feeding them into parallel layers formed by univariate Gaussian functions. This structure reduces the number of neurons from O(N^d) in GRBFNN to O(dN), which exponentially improves the computational speed of SGNN and makes it scale linearly as the input dimension increases. In addition, SGNN can preserve the dominant subspace of the Hessian matrix of GRBFNN in gradient-descent training, leading to a similar level of accuracy. It is experimentally demonstrated that SGNN can achieve a 100-fold speedup over GRBFNN with a similar level of accuracy on trivariate function approximations. SGNN also has better trainability and is more tuning-friendly than DNNs with ReLU and sigmoid activation functions. For approximating functions with complex geometry, SGNN can produce results three orders of magnitude more accurate than those of a ReLU-DNN with twice the number of layers and twice the number of neurons per layer.
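The reduction from O(N^d) to O(dN) rests on the separability of the Gaussian kernel: exp(-||x - mu||^2 / (2 sigma^2)) factors into a product of d univariate Gaussians, one per input coordinate, so a grid of N centers per dimension can be represented with dN univariate neurons arranged in sequential layers instead of N^d multivariate ones. The following minimal Python sketch illustrates that layered product structure; the function names, shapes, and the exact layer coupling are illustrative assumptions, not the paper's reference implementation.

import numpy as np

def univariate_gaussian(x_k, mu, sigma):
    # N Gaussian activations of a single scalar coordinate x_k
    return np.exp(-(x_k - mu) ** 2 / (2.0 * sigma ** 2))

def sgnn_forward(x, mus, sigmas, weights, w_out):
    # Hypothetical SGNN forward pass: one layer per input dimension.
    # mus[k], sigmas[k]: length-N centers/widths for coordinate k
    # weights[k]: (N, N) matrix coupling layer k to layer k + 1
    # w_out: length-N output weights
    h = univariate_gaussian(x[0], mus[0], sigmas[0])  # first layer
    for k in range(1, len(x)):
        # Each later layer gates a weighted sum of the previous layer
        # with univariate Gaussians of the next coordinate, building
        # the product structure of the separable Gaussian kernel.
        h = univariate_gaussian(x[k], mus[k], sigmas[k]) * (weights[k - 1] @ h)
    return w_out @ h

# Illustrative use on a trivariate input: 3 * 10 = 30 univariate
# neurons here, versus 10^3 = 1000 multivariate neurons for a
# grid-based GRBFNN at the same per-dimension resolution.
rng = np.random.default_rng(0)
d, N = 3, 10
mus = [np.linspace(-1.0, 1.0, N) for _ in range(d)]
sigmas = [np.full(N, 0.5) for _ in range(d)]
weights = [rng.standard_normal((N, N)) for _ in range(d - 1)]
w_out = rng.standard_normal(N)
print(sgnn_forward(np.array([0.1, -0.3, 0.7]), mus, sigmas, weights, w_out))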
