Knowledge distillation for secondary pulmonary tuberculosis classification ensemble.

UCC Companion (2021)

Abstract
This paper presents a teacher-student scheme for knowledge distillation of a secondary pulmonary tuberculosis classification ensemble. Because ensemble learning combines multiple neural networks, inference typically requires a forward pass through every base network, so the size and inference cost of the ensemble are a key challenge. We propose distilling the ensemble via a teacher-student scheme in which a single noised student learns the concatenated representations produced by the base networks. Comparing the ensemble of teacher networks with the single student, we show that, at the cost of a performance penalty, the ensemble's size and computational cost are significantly reduced.
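To make the scheme concrete, the following is a minimal PyTorch sketch of the idea described in the abstract: a single noised student is trained to regress the concatenated feature representations of the teacher base networks while also learning the classification task. The architectures, feature dimension, additive input noise, and the combined MSE-plus-cross-entropy loss are illustrative assumptions, not the paper's exact configuration.

# Sketch of ensemble-to-student distillation via concatenated teacher
# representations. All sizes and loss weights are assumed for illustration.
import torch
import torch.nn as nn

FEAT_DIM = 128        # assumed per-teacher representation size
NUM_TEACHERS = 3      # assumed ensemble size
NUM_CLASSES = 2       # assumed binary TB classification

class BaseNet(nn.Module):
    """Stand-in for one pretrained teacher; emits a feature vector."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, FEAT_DIM),
        )
    def forward(self, x):
        return self.backbone(x)

class Student(nn.Module):
    """Single student predicting the concatenated teacher features."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, FEAT_DIM * NUM_TEACHERS),
        )
        self.classifier = nn.Linear(FEAT_DIM * NUM_TEACHERS, NUM_CLASSES)
    def forward(self, x):
        feats = self.backbone(x)
        return feats, self.classifier(feats)

teachers = [BaseNet().eval() for _ in range(NUM_TEACHERS)]  # pretrained in practice
student = Student()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)
mse, ce = nn.MSELoss(), nn.CrossEntropyLoss()

def distill_step(images, labels, noise_std=0.1, alpha=0.5):
    # Teachers see the clean input; their representations are concatenated
    # to form the distillation target.
    with torch.no_grad():
        target = torch.cat([t(images) for t in teachers], dim=1)
    # The student is "noised": here via additive input noise (an assumption).
    noisy = images + noise_std * torch.randn_like(images)
    feats, logits = student(noisy)
    # Weighted sum of representation matching and the classification loss.
    loss = alpha * mse(feats, target) + (1 - alpha) * ce(logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example step on a dummy batch of single-channel images (assumed 64x64).
x = torch.randn(8, 1, 64, 64)
y = torch.randint(0, NUM_CLASSES, (8,))
print(distill_step(x, y))

At inference time only the student is kept, which is where the size and compute savings over running every base network come from.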
Keywords
Knowledge distillation, Teacher-student, Ensemble learning, Secondary pulmonary tuberculosis