Unifying distillation and privileged information

International Conference on Learning Representations (2015)

Abstract
Distillation (Hinton et al., 2015) and privileged information (Vapnik & Izmailov, 2015) are two techniques that enable machines to learn from other machines. This paper unifies the two into generalized distillation, a framework for learning from multiple machines and data representations. We provide theoretical and causal insight into the inner workings of generalized distillation, extend it to unsupervised, semi-supervised, and multitask learning scenarios, and illustrate its efficacy in a variety of numerical simulations on both synthetic and real-world data.
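The abstract summarizes a three-step recipe: train a teacher on the privileged representation, soften its predictions with a temperature, and train a student on the regular representation against a blend of hard labels and the teacher's soft labels. Below is a minimal sketch of that recipe, not code from the paper: it assumes logistic models, binary labels, and synthetic data chosen purely for illustration, with `temperature` and `imitation` standing in for the paper's T and lambda.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic task: the privileged features x_star determine
# the label directly, while the regular features x are a noisy view.
n, d = 500, 10
x_star = rng.normal(size=(n, d))                  # privileged representation
w_true = rng.normal(size=d)
y = (x_star @ w_true > 0).astype(float)           # labels fixed by x_star
x = x_star + rng.normal(scale=2.0, size=(n, d))   # noisy regular features

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_logistic(features, targets, lr=0.1, epochs=500):
    # Gradient descent on the cross-entropy; for logistic regression the
    # gradient with respect to the weights is X^T (p - t) / n, and it is
    # valid for soft targets t in [0, 1] as well as hard labels.
    w = np.zeros(features.shape[1])
    for _ in range(epochs):
        p = sigmoid(features @ w)
        w -= lr * features.T @ (p - targets) / len(targets)
    return w

# Step 1: the teacher learns from the privileged representation.
w_teacher = fit_logistic(x_star, y)

# Step 2: soften the teacher's predictions with temperature T.
temperature = 2.0
soft = sigmoid((x_star @ w_teacher) / temperature)

# Step 3: the student sees only the regular features x and imitates a
# lambda-blend of the hard labels and the teacher's soft labels.
imitation = 0.5
targets = imitation * y + (1 - imitation) * soft
w_student = fit_logistic(x, targets)

acc = ((sigmoid(x @ w_student) > 0.5) == y).mean()
print(f"student accuracy: {acc:.3f}")

Blending the targets in step 3 matches the paper's weighted sum of two imitation losses because cross-entropy is linear in its target argument: lambda * L(y, p) + (1 - lambda) * L(s, p) = L(lambda * y + (1 - lambda) * s, p).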
Keywords

distillation, privileged information