Pareto-Path Multitask Multiple Kernel Learning

IEEE Transactions on Neural Networks and Learning Systems (2015)

Cited by 24 | Views 44
Abstract
A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.
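The abstract's central idea, that minimizing a single conic (nonnegatively weighted) combination of task objectives yields one point on the Pareto front, while sweeping the weights traces a path along it, can be illustrated with a toy sketch. This is not the paper's SVM-based algorithm; the two quadratic objectives and the closed-form minimizer below are purely hypothetical stand-ins for the task losses:

```python
# Hedged illustration (not the paper's MT-MKL method): scalarize a
# two-task multiobjective problem with conic weights w and (1 - w).
# Each weight choice yields one Pareto-optimal solution; sweeping w
# traces a path on the Pareto front. The plain average of objectives
# (equal weights, w = 0.5) is just one point on that path.

def f1(x):
    """Toy convex objective for task 1."""
    return (x - 1.0) ** 2

def f2(x):
    """Toy convex objective for task 2."""
    return (x + 1.0) ** 2

def argmin_conic(w):
    """Minimizer of w*f1(x) + (1-w)*f2(x), in closed form:
    setting d/dx [w(x-1)^2 + (1-w)(x+1)^2] = 0 gives x = 2w - 1."""
    return 2.0 * w - 1.0

# Sweep the weight: each solution is Pareto-optimal for (f1, f2).
for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    x = argmin_conic(w)
    print(f"w={w:.2f}  x*={x:+.2f}  (f1, f2)=({f1(x):.2f}, {f2(x):.2f})")
```

As `w` moves from 0 to 1, the minimizer moves from favoring task 2 to favoring task 1, and no point on the resulting path can improve one objective without worsening the other; the averaged objective corresponds to the single point `w = 0.5`.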
Keywords
Pareto optimisation, learning (artificial intelligence), pattern classification, support vector machines (SVM), MT-MKL method, MTL problem, Pareto front (PF), Pareto-path multitask multiple kernel learning, concurrent optimization, information sharing, multiobjective optimization problem, partially shared kernel function, support vector machine MT-MKL framework, machine learning, optimization methods, pattern recognition, supervised learning