Distilling Image Dehazing With Heterogeneous Task Imitation

CVPR, pp. 3459-3468, 2020.

Abstract:

State-of-the-art deep dehazing models are often difficult to train. Knowledge distillation paves a way to train a student network assisted by a teacher network. However, most knowledge distillation methods are applied to image classification, segmentation, and object detection, and few investigate distilling image restoration and use...
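To make the teacher-assisted training idea from the abstract concrete, below is a minimal, generic sketch of feature-imitation knowledge distillation for an image restoration student network. The toy network, the single imitation point, and the loss weight alpha are illustrative assumptions; this is not the paper's heterogeneous task imitation scheme or architecture.

# A minimal, generic sketch of feature-imitation knowledge distillation for an
# image restoration student. Module names, loss weights, and the single
# feature-imitation point are assumptions for illustration only.
import torch
import torch.nn as nn

class SmallDehazeNet(nn.Module):
    """Toy encoder-decoder standing in for a dehazing network."""
    def __init__(self, channels=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, x):
        feat = self.encoder(x)    # intermediate feature used for imitation
        out = self.decoder(feat)  # restored (dehazed) image
        return out, feat

def distillation_step(student, teacher, hazy, clear, optimizer, alpha=0.5):
    """One training step: pixel reconstruction loss + feature-imitation loss."""
    teacher.eval()
    with torch.no_grad():
        _, t_feat = teacher(hazy)   # teacher features act as soft supervision
    s_out, s_feat = student(hazy)
    rec_loss = nn.functional.l1_loss(s_out, clear)       # restoration loss
    imit_loss = nn.functional.mse_loss(s_feat, t_feat)   # imitate teacher features
    loss = rec_loss + alpha * imit_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    student, teacher = SmallDehazeNet(), SmallDehazeNet()
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)
    hazy = torch.rand(2, 3, 64, 64)   # stand-in hazy inputs
    clear = torch.rand(2, 3, 64, 64)  # stand-in ground-truth clear images
    print(distillation_step(student, teacher, hazy, clear, opt))

In practice the teacher would be a larger pretrained restoration network and the imitation loss would typically be applied at several feature levels; the single-layer version above is kept deliberately small.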
