The KL-divergence-based large-scale variational Gaussian process classification algorithm (KLSP) must optimize the mean vector and the covariance matrix of the inducing variables simultaneously, which makes the model difficult to solve. An improved algorithm is established based on the Laplace method: first, a tractable lower bound is constructed for the posterior distribution of the inducing variables; then the Laplace method is used to compute a Gaussian approximation of this lower bound, which serves as an approximate expression for the posterior distribution of the inducing variables. This converts the problem into a convex optimization problem that depends only on the mean vector, thereby reducing the difficulty of solving the model. Simulation results show that the proposed algorithm clearly improves on the original algorithm in both speed and accuracy.
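The key step above is the Laplace method: run Newton's method to the mode of the (log of the) lower bound, then take a Gaussian whose mean is that mode and whose covariance is the inverse Hessian of the negative log density there. A minimal sketch follows; note that the logistic-regression objective, the data, and all variable names here are illustrative stand-ins, not the paper's actual KLSP bound:

```python
import numpy as np

def laplace_gaussian_approx(grad, hess, m0, iters=100, tol=1e-10):
    """Laplace approximation: Newton's method finds the mode of the
    negative log density; the Gaussian approximation then uses
    mean = mode, covariance = inverse Hessian at the mode."""
    m = m0.astype(float).copy()
    for _ in range(iters):
        step = np.linalg.solve(hess(m), grad(m))
        m -= step
        if np.linalg.norm(step) < tol:
            break
    S = np.linalg.inv(hess(m))  # covariance of the Gaussian approximation
    return m, S

# Toy convex target: Bayesian logistic regression with a N(0, I) prior,
# standing in for the paper's lower bound on the inducing-variable posterior.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = (rng.random(40) < 1.0 / (1.0 + np.exp(-X @ np.array([1.5, -1.0])))).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w):
    # Gradient of the negative log posterior (log-loss + Gaussian prior).
    return X.T @ (sigmoid(X @ w) - y) + w

def hess(w):
    # Hessian of the negative log posterior; positive definite everywhere,
    # so the objective is convex and Newton's method reaches the unique mode.
    p = sigmoid(X @ w)
    return X.T @ (X * (p * (1.0 - p))[:, None]) + np.eye(2)

mean, cov = laplace_gaussian_approx(grad, hess, np.zeros(2))
```

Because the Hessian is positive definite everywhere, the target is convex, which mirrors the abstract's point: once the covariance is fixed by the Laplace approximation, only a convex problem in the mean vector remains.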