Feature subset selection using naive Bayes for text classification

Pattern Recognition Letters (2015)

Citations 105 | Views 79
Abstract
Highlights: The global selection index can be determined from the local selection indexes. The local selection index can be calculated in its own dimension. The prediction function can be factorized. The NB models can be selectively pruned by thresholding the LSIs. Feature selection and weighting work hand in hand to improve classification.

Feature subset selection is known to improve the text classification performance of various classifiers. The model using the selected features is often regarded as if it had generated the data. By taking its uncertainty into account, the discrimination capabilities can be measured by a global selection index (GSI), which can be used in the prediction function. In this paper, we propose a latent selection augmented naive (LSAN) Bayes classifier. By introducing a latent feature selection indicator, the GSI can be factorized into local selection indexes (LSIs). Using conjugate priors, the LSI for feature evaluation can be calculated in closed form. The feature subset selection models can then be pruned by thresholding the LSIs, and the LSAN classifier can be obtained as the product of a small percentage of single-feature model averages. Numerical results on several real datasets show that the proposed method outperforms the competing feature weighting methods and is competitive with other commonly used classifiers such as SVM.
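As a rough illustration of the idea, not the paper's exact formulation, the sketch below scores each binary feature with a Bayes-factor-style local selection index: the Beta-Bernoulli marginal likelihood of a class-conditional model against a class-independent one, using conjugate Beta(1, 1) priors so the score has a closed form. Features whose evidence for class dependence is weak are pruned by thresholding, mirroring the pruning step described in the abstract. All function names and the threshold are illustrative assumptions.

```python
import math
import numpy as np

def log_beta(a, b):
    # log of the Beta function, computed via log-gamma
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def log_evidence(n1, n0, a=1.0, b=1.0):
    # Beta-Bernoulli marginal likelihood of n1 ones and n0 zeros
    # under a conjugate Beta(a, b) prior on the success probability
    return log_beta(a + n1, b + n0) - log_beta(a, b)

def local_selection_indices(X, y):
    # One score per feature: log Bayes factor of a class-conditional
    # Bernoulli model against a class-independent one. Large values
    # mark discriminative features (simplified stand-in for the LSI).
    n, d = X.shape
    classes = np.unique(y)
    lsi = np.empty(d)
    for j in range(d):
        col = X[:, j]
        dep = sum(
            log_evidence(col[y == c].sum(),
                         (y == c).sum() - col[y == c].sum())
            for c in classes
        )
        indep = log_evidence(col.sum(), n - col.sum())
        lsi[j] = dep - indep
    return lsi

# Toy data: feature 0 tracks the label perfectly, feature 1 is noise.
rng = np.random.default_rng(0)
y = np.array([0] * 50 + [1] * 50)
X = np.column_stack([(y == 1).astype(int),
                     rng.integers(0, 2, size=100)])
lsi = local_selection_indices(X, y)
selected = lsi > 0.0  # prune features with weak evidence of class dependence
```

In this sketch the discriminative feature receives a strongly positive score while the noise feature is penalized by the extra parameters of the class-conditional model, so thresholding at zero keeps only the informative dimension. The paper's actual LSI is derived within its latent-indicator model; this is only the same conjugate-prior evidence idea in miniature.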
Keywords
Bayesian model averaging, Global selection index, Latent selection augmented naive Bayes, Local selection index, Text classification