Distilling Object Detectors with Task Adaptive Regularization

Ruoyu Sun
Fuhui Tang
Xiaopeng Zhang

Abstract:

Current state-of-the-art object detectors come at the expense of high computational costs and are hard to deploy on low-end devices. Knowledge distillation, which aims at training a smaller student network by transferring knowledge from a larger teacher model, is one of the promising solutions for model miniaturization. In this paper, we...
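
For context, the sketch below illustrates the generic soft-target knowledge distillation the abstract refers to, written in PyTorch. The function name `distillation_loss`, the temperature `T`, and the weight `alpha` are illustrative assumptions, not values from the paper, and the paper's task adaptive regularization for detectors is not reproduced here.

```python
# A minimal sketch of generic knowledge distillation with soft targets
# (illustrative only; not the paper's task adaptive regularization).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with a soft-target KL term.

    T (temperature) and alpha (loss weight) are illustrative hyperparameters,
    not values reported by the paper.
    """
    # Soft targets from the (frozen) teacher, softened by temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between student and teacher distributions,
    # rescaled by T^2 as in standard distillation practice.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Ordinary supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, targets)
    return alpha * kd + (1.0 - alpha) * ce

# Example usage with random tensors standing in for detector head outputs.
student_logits = torch.randn(8, 21)   # e.g. 20 classes + background
teacher_logits = torch.randn(8, 21)
targets = torch.randint(0, 21, (8,))
loss = distillation_loss(student_logits, teacher_logits, targets)
```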
