Knowledge Squeezed Adversarial Network Compression

    Changyong Shu
    Peng Li
    Longquan Dai
    Lizhuang Ma

    arXiv preprint, 2019.


    Abstract:

    Deep network compression has achieved notable progress via knowledge distillation, where a teacher-student learning scheme is adopted using a predetermined loss. Recently, the focus has shifted to employing adversarial training to minimize the discrepancy between the output distributions of the two networks. However, they...
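    The "predetermined loss" the abstract refers to is, in the standard teacher-student setup, a temperature-scaled KL divergence between the teacher's and student's softened outputs (Hinton-style distillation). A minimal sketch of that baseline loss, with an illustrative temperature and function names of my own choosing:

    ```python
    import numpy as np

    def softmax(logits, T=1.0):
        """Temperature-scaled softmax; T > 1 softens the distribution."""
        z = np.asarray(logits, dtype=float) / T
        z -= z.max()                      # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def distillation_loss(student_logits, teacher_logits, T=4.0):
        """KL(teacher || student) on temperature-softened outputs, scaled by
        T^2 as is conventional so gradients keep a comparable magnitude.
        T=4.0 is only an illustrative choice, not a value from the paper."""
        p = softmax(teacher_logits, T)    # teacher's soft targets
        q = softmax(student_logits, T)
        return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

    # A student that matches the teacher exactly incurs zero loss.
    print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
    ```

    The adversarial variants the abstract contrasts with replace this fixed divergence by a learned discriminator that tries to tell teacher outputs from student outputs, so the discrepancy measure is trained rather than predetermined.
    
    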
