A feedforward neural network model with nonlinear expansion is proposed, the convergence of the GGBP algorithm is analyzed, and a dynamic learning rate for the algorithm is obtained. Simulation results show that this neural network model is better suited to multi-input, multi-output problems: its convergence speed and its ability to approximate nonlinear functions are superior to those of the functional-link network and the ordinary feedforward network. Using the dynamic learning rate not only guarantees convergence of the network but also makes the error decrease at close to the fastest possible rate.
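To make the training scheme concrete, the sketch below is a minimal illustration rather than the paper's exact GGBP formulation: a feedforward network whose inputs are first passed through a fixed nonlinear expansion, trained by batch gradient descent on a toy multi-input, multi-output regression problem, with a simple adaptive ("bold driver") learning-rate heuristic standing in for the paper's derived dynamic rate. The expansion basis, network sizes, target function, and adaptation constants are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's GGBP derivation): feedforward
# network on nonlinearly expanded inputs, batch gradient descent, and a
# heuristic dynamic learning rate that grows while the error falls and
# shrinks when it rises.
import numpy as np

rng = np.random.default_rng(0)

def expand(X):
    """Nonlinear input expansion: augment each sample with sin/cos terms (illustrative basis)."""
    return np.hstack([X, np.sin(np.pi * X), np.cos(np.pi * X)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy multi-input, multi-output regression data (2 inputs -> 2 outputs).
X = rng.uniform(-1.0, 1.0, size=(200, 2))
Y = np.column_stack([np.sin(np.pi * X[:, 0]) * X[:, 1],
                     X[:, 0] ** 2 - X[:, 1]])

Phi = expand(X)                                     # expanded inputs
n_in, n_hid, n_out = Phi.shape[1], 10, Y.shape[1]
W1 = rng.normal(0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)

eta, prev_err = 0.05, np.inf
for epoch in range(2000):
    # Forward pass: sigmoid hidden layer, linear output layer.
    H = sigmoid(Phi @ W1 + b1)
    Y_hat = H @ W2 + b2
    E = Y_hat - Y
    err = 0.5 * np.mean(np.sum(E ** 2, axis=1))

    # Dynamic learning rate (bold-driver heuristic, stand-in for the paper's rule):
    # enlarge the step while the error keeps decreasing, cut it sharply otherwise.
    eta = eta * 1.05 if err < prev_err else eta * 0.5
    prev_err = err

    # Backward pass: batch gradients of the squared-error cost.
    dY = E / len(X)
    gW2, gb2 = H.T @ dY, dY.sum(axis=0)
    dH = (dY @ W2.T) * H * (1 - H)
    gW1, gb1 = Phi.T @ dH, dH.sum(axis=0)

    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

print(f"final training error: {err:.5f}, final learning rate: {eta:.4f}")
```

The bold-driver adaptation is only a common stand-in; the paper's dynamic rate is the one derived from its convergence analysis of the GGBP algorithm, which is what guarantees the near-fastest error decrease claimed in the abstract.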