BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models
European Conference on Computer Vision (ECCV), pp. 702-717, 2020.
We presented a novel paradigm for neural architecture search: train a single-stage model from which high-quality child models of different sizes can be induced for instant deployment, without retraining or finetuning.
Neural architecture search (NAS) has shown promising results in discovering models that are both accurate and fast. For NAS, training a one-shot model has become a popular strategy to rank the relative quality of different architectures (child models) using a single set of shared weights. However, while one-shot model weights can effectively rank different architectures, the absolute accuracies obtained from these shared weights are typically far below those of stand-alone training.
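The core idea of inducing child models from one set of shared weights can be illustrated with a small sketch. The code below is our own illustration, not the paper's implementation: it shows a single supernet layer at its widest configuration, from which a narrower child is obtained simply by slicing the leading channels of the shared weight matrix, with no retraining.

```python
import numpy as np

rng = np.random.default_rng(0)

# Supernet layer at its widest configuration: 16 inputs -> 32 units.
# (Shapes are arbitrary choices for this sketch.)
W_super = rng.standard_normal((32, 16))
b_super = rng.standard_normal(32)

def child_forward(x, out_width, in_width=16):
    """Run a child model of the requested width using sliced supernet weights.

    The child shares the supernet's weights directly: no retraining,
    finetuning, or other post-processing.
    """
    W = W_super[:out_width, :in_width]  # leading rows/columns of shared weights
    b = b_super[:out_width]
    return np.maximum(W @ x[:in_width] + b, 0.0)  # ReLU activation

x = rng.standard_normal(16)
big = child_forward(x, out_width=32)    # largest child (full supernet layer)
small = child_forward(x, out_width=8)   # smaller child, same shared weights

# The small child's outputs coincide with the first 8 units of the big one,
# since both were computed from the same shared parameters.
assert np.allclose(small, big[:8])
```

In the actual method, many such slicing choices (widths, depths, kernel sizes, resolutions) are trained jointly so that every slice reaches deployable accuracy on its own; the sketch only shows the weight-sharing mechanism.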