Feature Transformation with Class Conditional Decorrelation
ICDM (2013)
Abstract
The well-known feature transformation model of Fisher linear discriminant analysis (FDA) can be decomposed into an equivalent two-step approach: whitening followed by principal component analysis (PCA) in the whitened space. By proving that whitening is the optimal linear transformation to the Euclidean space in the sense of minimum log-determinant divergence, we propose a transformation model called class conditional decorrelation (CCD). The objective of CCD is to diagonalize the covariance matrices of different classes simultaneously, which is efficiently optimized using a modified Jacobi method. CCD is effective at finding the common principal components among multiple classes. After CCD, the variables become class conditionally uncorrelated, which benefits subsequent classification tasks. Combining CCD with the nearest class mean (NCM) classification model can significantly improve classification accuracy. Experiments on 15 small-scale datasets and one large-scale dataset (with 3755 classes) demonstrate the scalability of CCD across different applications. We also discuss potential applications of CCD to other problems such as Gaussian mixture models and classifier ensemble learning.
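The core step described in the abstract, simultaneously diagonalizing the per-class covariance matrices with Jacobi rotations, can be sketched as follows. This is a minimal illustration using the Cardoso-Souloumiac joint-diagonalization angles; the paper's "modified Jacobi method" may differ in its details, and all function names here are illustrative.

```python
import numpy as np

def joint_diagonalize(covs, sweeps=20, tol=1e-12):
    """Approximately diagonalize a list of symmetric matrices with one
    shared orthogonal transform Q, via Jacobi (Givens) rotations.
    Returns Q and the rotated matrices Q^T C_k Q.

    Sketch of simultaneous diagonalization (Cardoso-Souloumiac angles);
    not necessarily the exact variant used in the CCD paper."""
    covs = [c.copy() for c in covs]
    d = covs[0].shape[0]
    Q = np.eye(d)
    for _ in range(sweeps):
        changed = False
        for p in range(d - 1):
            for q in range(p + 1, d):
                # 2x2 subproblem: one row [c_pp - c_qq, 2*c_pq] per class
                h = np.array([[c[p, p] - c[q, q], 2.0 * c[p, q]]
                              for c in covs])
                G = h.T @ h
                if np.trace(G) < tol:       # pair already "diagonal"
                    continue
                # principal eigenvector of G encodes the optimal angle
                _, V = np.linalg.eigh(G)
                x, y = V[:, -1]
                if x < 0:                   # fix eigenvector sign
                    x, y = -x, -y
                cos = np.sqrt(0.5 + 0.5 * x)
                sin = 0.5 * y / cos
                if abs(sin) > tol:
                    changed = True
                    J = np.eye(d)           # Givens rotation in plane (p, q)
                    J[p, p] = J[q, q] = cos
                    J[p, q], J[q, p] = -sin, sin
                    Q = Q @ J
                    covs = [J.T @ c @ J for c in covs]
        if not changed:                     # converged: no rotation applied
            break
    return Q, covs
```

After this transform the features are (approximately) uncorrelated within every class, so a nearest-class-mean classifier can use simple per-dimension variances instead of full covariance matrices.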
Keywords
optimisation, euclidean space, classifier ensemble learning, equivalent two-step approach, nearest class mean classification model, pattern classification, modified jacobi method, covariance matrices, class conditional decorrelation, ccd, fisher linear discriminant analysis, simultaneous diagonalization, feature transformation model, large-scale dataset, principal component analysis, feature transformation, decorrelation, jacobian matrices, gaussian mixture model, minimum log-determinant divergence, optimal linear transformation