Sequential minimal optimization (SMO) is currently a highly effective method for training support vector machines (SVMs) on large data sets, but its feasible-direction strategy for selecting the working set degrades cache efficiency. This paper gives a feasible-direction interpretation of SMO and, building on it, proposes a benefit-cost balanced working-set selection method that jointly considers the decrease in the objective function associated with a candidate working set and the computational cost it incurs, so as to improve cache efficiency. Experimental results show that the method improves the performance of the SMO algorithm and shortens the training time of SVM classifiers; it is particularly well suited to problems with many samples, many support vectors, and many unbounded support vectors.
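The selection idea described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm: it assumes a maximal-violating-pair style criterion for the objective decrease, models computational cost simply as the number of kernel rows missing from the cache, and uses a hypothetical trade-off weight `beta`; the function name `select_working_set` and all parameters are likewise illustrative.

```python
import math

def select_working_set(grad, y, alpha, C, cached, beta=0.5):
    """Pick a working set (i, j) scoring each candidate pair by
    (KKT violation, a proxy for the objective decrease) minus
    beta * (number of kernel rows not in the cache).

    grad   -- gradient of the dual objective at the current alpha
    y      -- labels in {+1, -1}
    alpha  -- current dual variables
    C      -- box constraint
    cached -- set of indices whose kernel rows are already cached
    """
    n = len(y)
    # I_up: alpha[t] may increase along y[t]; I_low: it may decrease.
    up = [t for t in range(n)
          if (y[t] > 0 and alpha[t] < C) or (y[t] < 0 and alpha[t] > 0)]
    low = [t for t in range(n)
           if (y[t] > 0 and alpha[t] > 0) or (y[t] < 0 and alpha[t] < C)]
    best, best_score = None, -math.inf
    for i in up:
        for j in low:
            # Pairwise KKT violation; positive means progress is possible.
            gain = (-y[i] * grad[i]) - (-y[j] * grad[j])
            if gain <= 0:
                continue
            # Cache-miss cost: each uncached kernel row must be computed.
            cost = (0 if i in cached else 1) + (0 if j in cached else 1)
            score = gain - beta * cost
            if score > best_score:
                best, best_score = (i, j), score
    return best  # None means the KKT conditions hold (up to the criterion)
```

With `beta = 0`, this reduces to a pure maximal-violation selection; larger `beta` biases the choice toward pairs whose kernel rows are already cached, trading a smaller per-step objective decrease for cheaper iterations.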