The quadratic program that arises in training support vector machines admits several variant formulations. For the linear problem, starting from one such variant, a Lagrangian duality technique transforms the high-dimensional quadratic program in feature space into a low-dimensional, unconstrained, differentiable convex dual program in input space. Exploiting the piecewise-quadratic structure of the dual objective, combined with a fast and exact one-dimensional line search, a conjugate-gradient support vector machine is proposed to solve this problem. By factoring the kernel matrix with a Cholesky or incomplete Cholesky decomposition, kernel-based nonlinear classification is achieved with only a small increase in algorithmic complexity. The algorithm can rapidly solve linear training problems with millions of samples, as well as fairly large nonlinear training problems, on an ordinary computer. Extensive numerical experiments and complexity analysis show that the algorithm is effective compared with similar methods such as ASVM and LSVM.
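The conjugate-gradient core of the method can be illustrated with a minimal sketch. The abstract's objective is piecewise quadratic and the kernel factorization may be incomplete; this sketch simplifies to a plain strictly convex quadratic (an RBF Gram matrix plus a ridge term, an illustrative choice not taken from the paper), where the exact line search has a closed form, and cross-checks the result against a direct solve through a full Cholesky factor.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def conjugate_gradient(A, b, tol=1e-8, max_iter=500):
    """Minimize f(x) = 1/2 x^T A x - b^T x for symmetric positive definite A.
    On a quadratic, the exact line search has the closed form
    alpha = r^T r / (p^T A p), so no iterative search is needed."""
    x = np.zeros_like(b)
    r = b - A @ x                  # residual = negative gradient
    p = r.copy()                   # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # conjugate-direction update
        rs = rs_new
    return x

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = rbf_kernel(X)
A = K + 1e-3 * np.eye(50)          # ridge term keeps A strictly positive definite
b = rng.standard_normal(50)

x_cg = conjugate_gradient(A, b)

# Cross-check against a direct solve through the Cholesky factor A = L L^T
L = np.linalg.cholesky(A)
x_chol = np.linalg.solve(L.T, np.linalg.solve(L, b))
print(np.max(np.abs(x_cg - x_chol)))
```

The piecewise-quadratic case the paper addresses works analogously: within each "piece" the objective is quadratic, so the one-dimensional search along a conjugate direction can still be carried out exactly and cheaply.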