Knowledge Distillation via Instance Relationship Graph Supplementary Document

Semantic Scholar (2019)

Abstract
Although we have verified the effectiveness of L_IRG-t by comparing the performance of L_IRG and L_MTK, the performance of L_IRG-t alone was not reported due to space limitations. In this section, we therefore train the student network with L_IRG-t only and analyze its performance. In the experiment, ResNet20 and ResNet20-x0.5 are adopted as the teacher network and the student network, respectively, and CIFAR10 is used for training and validation. In addition, besides L_IRG, FSP [2] is selected as a competing method, since FSP also distills knowledge from the overall inference procedure.
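For context, below is a minimal PyTorch sketch of how an IRG-t style loss could be computed: the instance relationship graph (pairwise distances between instances in a batch) is built at two layers of both networks, and the student is penalized for deviating from the teacher's layer-to-layer change of that graph. The function names, the choice of Euclidean distance, and the exact form of the transformation are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


def instance_relation_graph(feats: torch.Tensor) -> torch.Tensor:
    """Pairwise Euclidean distances between instances in a batch.

    feats: (batch, ...) features from one network layer.
    Returns a (batch, batch) relationship matrix (the IRG edge weights).
    """
    flat = feats.flatten(start_dim=1)
    return torch.cdist(flat, flat, p=2)


def irg_t_loss(teacher_l1, teacher_l2, student_l1, student_l2):
    """Hypothetical IRG transformation loss.

    Matches how the instance relationship graph changes between two layers
    of the teacher versus the student (an assumed simplification of L_IRG-t).
    """
    # Teacher graphs are detached: the teacher is frozen during distillation.
    t_trans = (instance_relation_graph(teacher_l2)
               - instance_relation_graph(teacher_l1)).detach()
    s_trans = (instance_relation_graph(student_l2)
               - instance_relation_graph(student_l1))
    return F.mse_loss(s_trans, t_trans)

In training, this loss would be added to the usual task loss (e.g., cross-entropy on CIFAR10 labels) with a weighting coefficient, which is how the student described here would be optimized with L_IRG-t only plus the supervised objective.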