Boosting the Cross-Architecture Generalization of Dataset Distillation through an Empirical Study
CoRR (2023)
Abstract
The poor cross-architecture generalization of dataset distillation greatly
weakens its practical significance. This paper attempts to mitigate the issue
through an empirical study, which suggests that synthetic datasets carry an
inductive bias toward the distillation model; as a result, the evaluation
model is effectively confined to architectures similar to that of the
distillation model. We propose a novel method, EvaLuation with distillation
Feature (ELF), which utilizes features from intermediate layers of the
distillation model for cross-architecture evaluation. In this manner, the
evaluation model learns from bias-free knowledge, so its architecture is no
longer constrained while performance is retained. Extensive experiments show
that ELF substantially enhances the cross-architecture generalization of
current dataset distillation (DD) methods. Code for this project is at
\url{https://github.com/Lirui-Zhao/ELF}.
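
To make the mechanism concrete, below is a minimal PyTorch sketch of an
ELF-style training step, assuming the evaluation model is trained on the
synthetic dataset with an auxiliary loss that aligns one of its intermediate
feature maps to the corresponding features of the frozen distillation model.
All function and variable names (e.g. `elf_step`, `proj`, the choice of
layers) are hypothetical illustrations, not the official implementation; see
the repository linked above for the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def grab_features(layer: nn.Module) -> dict:
    """Register a forward hook that stores the output of `layer` on each
    forward pass. This is one generic way to read intermediate features;
    the official code may expose them differently."""
    store = {}
    layer.register_forward_hook(lambda m, inp, out: store.update(feat=out))
    return store

def elf_step(eval_model, distill_model, eval_store, distill_store,
             proj, images, labels, optimizer, alpha=1.0):
    """One training step on the synthetic set: cross-entropy on the
    synthetic labels plus feature alignment against the distillation
    model. `alpha` (assumed hyperparameter) weights the alignment term."""
    distill_model.eval()
    with torch.no_grad():
        # Bias-free targets: intermediate features of the frozen
        # distillation model (fills distill_store["feat"] via the hook).
        distill_model(images)

    logits = eval_model(images)  # fills eval_store["feat"] via the hook
    ce = F.cross_entropy(logits, labels)

    # `proj` (e.g. a 1x1 conv plus pooling, supplied by the caller) maps
    # the evaluation features into the distillation feature space, since
    # the two architectures may produce differently shaped feature maps.
    fm = F.mse_loss(proj(eval_store["feat"]), distill_store["feat"])
    loss = ce + alpha * fm

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the evaluation model is supervised by the distillation
model's intermediate features rather than only by the synthetic labels,
which is how the abstract describes freeing the evaluation architecture
from the distillation model's inductive bias.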