To address the limitation that traditional feature selection algorithms relying on a single metric struggle to balance generalization performance and dimensionality reduction, a new feature selection algorithm, LS-SVM-FSC (least squares support vector machines and fuzzy supplementary criterion), is proposed. The samples of each individual feature are classified by a kernelized least squares support vector machine (LS-SVM); a new fuzzy membership function assigns each sample a fuzzy membership degree to its own class; and the fuzzy supplementary criterion then selects a feature subset with minimal redundancy and maximal relevance. Experiments show that, compared with 10 other feature selection methods and 7 membership determination methods, the proposed algorithm achieves high classification accuracy and strong dimensionality reduction on all 9 data sets, while its learning speed remains fast on high-dimensional data sets.
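The following is a minimal sketch of the selection pipeline outlined above, under stated assumptions: an RBF-kernel LS-SVM is trained on each feature alone and its training accuracy serves as the relevance score, while mean absolute Pearson correlation with already-selected features stands in for the redundancy term. The abstract does not detail the paper's specific fuzzy membership function or fuzzy supplementary criterion, so those components are not reproduced here; all function names (rbf_kernel, lssvm_fit, greedy_mrmr, etc.) are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def lssvm_fit(X, y, C=10.0, gamma=1.0):
    """Binary LS-SVM classifier (labels in {-1, +1}): solve the linear
    KKT system  [[0, y^T], [y, yy^T*K + I/C]] [b; alpha] = [0; 1]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    Omega = K * np.outer(y, y) + np.eye(n) / C
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b, X, y

def lssvm_predict(model, Xnew, gamma=1.0):
    """Decision function sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    alpha, b, Xtr, ytr = model
    K = rbf_kernel(Xnew, Xtr, gamma)
    return np.sign(K @ (alpha * ytr) + b)

def feature_relevance(X, y, C=10.0, gamma=1.0):
    """Relevance of each feature = training accuracy of an LS-SVM that
    sees only that single feature (stand-in for the per-feature
    classification step described in the abstract)."""
    rel = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xj = X[:, [j]]
        model = lssvm_fit(Xj, y, C, gamma)
        rel[j] = np.mean(lssvm_predict(model, Xj, gamma) == y)
    return rel

def greedy_mrmr(X, y, k, C=10.0, gamma=1.0):
    """Greedy min-redundancy / max-relevance selection. Redundancy is
    the mean absolute correlation with already-selected features; the
    paper's fuzzy supplementary criterion would replace this score."""
    rel = feature_relevance(X, y, C, gamma)
    corr = np.abs(np.corrcoef(X, rowvar=False))
    selected = [int(np.argmax(rel))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            score = rel[j] - corr[j, selected].mean()
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

if __name__ == "__main__":
    # Synthetic check: 2 informative features followed by 8 noise features.
    rng = np.random.default_rng(0)
    n = 80
    y = rng.choice([-1.0, 1.0], size=n)
    informative = y[:, None] + 0.3 * rng.normal(size=(n, 2))
    noise = rng.normal(size=(n, 8))
    X = np.hstack([informative, noise])
    print("selected features:", greedy_mrmr(X, y, k=3))
```

On this synthetic example the two informative columns (indices 0 and 1) should rank among the selected features; the correlation-based redundancy term keeps the greedy step from repeatedly picking features that carry the same information, which is the minimal-redundancy, maximal-relevance behavior the abstract attributes to the fuzzy supplementary criterion.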