In this paper, we propose an enhancement to the beetle antennae search (BAS) algorithm, called BAS-ADAM, to smooth its convergence behavior and avoid trapping in local minima for highly non-convex objective functions. We achieve this by adaptively adjusting the step size in each iteration using the adaptive moment estimation (ADAM) update rule. The proposed algorithm also increases the convergence rate in narrow valleys. A key feature of the ADAM update rule is its ability to adjust the step size for each dimension separately instead of using a single shared step size. Since ADAM is traditionally used with gradient-based optimization algorithms, we first propose a gradient-estimation model that does not require differentiating the objective function. As a result, the algorithm demonstrates excellent performance and a fast convergence rate when searching for the optimum of non-convex functions. The efficiency of the proposed algorithm was tested on three different benchmark problems, including the training of a high-dimensional neural network, and its performance is compared with the particle swarm optimizer (PSO) and the original BAS algorithm.
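The abstract names the algorithm's two ingredients: a gradient estimate obtained by probing the objective rather than differentiating it, and a per-dimension ADAM step. The sketch below illustrates that combination in Python, assuming a standard two-point antenna probe along a random unit direction; the function names and hyperparameter values (`d`, `lr`, `beta1`, `beta2`) are illustrative choices for exposition, not the paper's exact formulation.

```python
import numpy as np

def bas_adam(f, x0, n_iter=200, d=0.5, lr=0.1,
             beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Minimal BAS-ADAM sketch: gradient-free antennae probe + ADAM step.

    Hypothetical illustration of the idea described in the abstract,
    not the authors' reference implementation.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (momentum) estimate
    v = np.zeros_like(x)  # second-moment estimate
    for t in range(1, n_iter + 1):
        # Random unit direction models the beetle's antennae orientation.
        b = rng.standard_normal(x.shape)
        b /= np.linalg.norm(b)
        # Two-point (left/right antenna) probe approximates the directional
        # derivative, so f itself is never differentiated.
        g = (f(x + d * b) - f(x - d * b)) / (2.0 * d) * b
        # ADAM moment updates with bias correction; dividing by sqrt(v_hat)
        # effectively adapts the step size separately for each dimension.
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Example: minimize a standard non-convex benchmark (2-D Rastrigin).
if __name__ == "__main__":
    rastrigin = lambda z: 10 * z.size + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
    print(bas_adam(rastrigin, x0=[2.0, -1.5]))
```

The per-dimension scaling by the second-moment estimate is what distinguishes this update from the fixed step size of the original BAS: dimensions with consistently large estimated gradients take smaller effective steps, which is what smooths progress through narrow valleys.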