Joint learning adaptive metric and optimal classification hyperplane

Neural Networks (2022)

Cited by 10 | Views 10
Abstract
Metric learning has attracted much interest in classification tasks due to its strong performance. Most traditional metric learning methods rely on k-nearest neighbor (kNN) classifiers to make decisions, and the choice of k affects generalization. In this work, we propose an end-to-end metric learning framework. Specifically, a new linear metric learning model (LMML) is first proposed to jointly learn an adaptive metric and the optimal classification hyperplane, where dissimilar samples are separated by maximizing the classification margin. A nonlinear metric learning model (RLMML) is then developed, based on a bounded nonlinear kernel function, to extend LMML. The non-convexity of the proposed models makes them difficult to optimize, so half-quadratic optimization algorithms are developed to solve the problems iteratively, alternately optimizing the classification hyperplane and the adaptive metric. Moreover, the resulting algorithms are proved to converge theoretically. Numerical experiments on different types of data sets show the effectiveness of the proposed algorithms, and the Wilcoxon test further confirms the feasibility and effectiveness of the proposed models.
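To make the alternating scheme concrete, here is a minimal sketch of the general idea of jointly learning a metric and a max-margin hyperplane. It is NOT the paper's LMML or its half-quadratic algorithm: it assumes a simple diagonal metric `w`, uses plain gradient steps on a hinge loss, and alternates between updating the hyperplane `(v, b)` with the metric fixed and updating the metric with the hyperplane fixed.

```python
import numpy as np

def joint_metric_hyperplane(X, y, n_iters=50, lr=0.1, lam=0.01):
    """Toy alternating sketch (not the paper's exact LMML / half-quadratic
    algorithm): alternately take gradient steps on a hinge loss, first on
    the hyperplane (v, b) with the metric fixed, then on a diagonal
    metric w with the hyperplane fixed. Labels y are in {-1, +1}."""
    n, d = X.shape
    w = np.ones(d)                  # diagonal metric (feature weights): assumed form
    v = np.zeros(d)                 # hyperplane normal
    b = 0.0                         # hyperplane offset
    for _ in range(n_iters):
        Z = X * w                   # data seen through the current metric
        margins = y * (Z @ v + b)
        active = margins < 1        # samples violating the margin
        # (1) metric fixed -> max-margin hyperplane update (hinge + L2 reg)
        grad_v = lam * v
        grad_b = 0.0
        if active.any():
            grad_v = grad_v - (y[active, None] * Z[active]).mean(axis=0)
            grad_b = -y[active].mean()
        v -= lr * grad_v
        b -= lr * grad_b
        # (2) hyperplane fixed -> metric update, clipped to stay positive
        if active.any():
            grad_w = -(y[active, None] * X[active] * v).mean(axis=0)
            w = np.clip(w - lr * grad_w, 1e-2, 1e2)
    return w, v, b
```

The paper's models additionally involve a bounded (correntropy-style) kernel and non-convex objectives, which is why the authors resort to half-quadratic optimization with a convergence proof rather than plain gradient descent.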
Keywords
Metric learning, Optimal classification hyperplane, Maximum margin classification, Correntropy, Half-quadratic optimization algorithm