NASRec: Weight Sharing Neural Architecture Search for Recommender Systems
WWW 2023 (2022)
Abstract
The rise of deep neural networks offers new opportunities in optimizing
recommender systems. However, optimizing recommender systems using deep neural
networks requires delicate architecture fabrication. We propose NASRec, a
paradigm that trains a single supernet and efficiently produces abundant
models/sub-architectures by weight sharing. To overcome the data multi-modality
and architecture heterogeneity challenges in the recommendation domain, NASRec
establishes a large supernet (i.e., search space) to search over full
architectures. The supernet incorporates a versatile choice of operators and
dense connectivity to minimize the human effort needed to find priors. The
scale and heterogeneity of NASRec impose several challenges, such as training
inefficiency, operator imbalance, and degraded rank correlation. We tackle
these challenges by proposing single-operator any-connection sampling,
operator-balancing interaction modules, and post-training fine-tuning. Our
crafted models, NASRecNet, show promising results on three Click-Through Rate
(CTR) prediction benchmarks, where NASRec outperforms both manually designed
models and existing NAS methods, achieving state-of-the-art performance. Our
work is publicly available at https://github.com/facebookresearch/NasRec.
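To make the "single-operator any-connection sampling" idea concrete, here is a minimal illustrative sketch: each supernet block holds several candidate operators, and each sampled sub-architecture activates exactly one operator per block while connections to earlier blocks are drawn freely. All names (`OPERATORS`, `sample_subnet`) are hypothetical and not taken from the paper's released code.

```python
import random

# Hypothetical candidate operators for a recommender supernet block.
OPERATORS = ["mlp", "dot_product", "attention", "crossnet", "fm"]

def sample_subnet(num_blocks, rng=random):
    """Sample one sub-architecture from the supernet: per block, exactly one
    operator (single-operator) and a non-empty subset of incoming connections
    from the raw input (-1) or any earlier block (any-connection)."""
    subnet = []
    for block_idx in range(num_blocks):
        op = rng.choice(OPERATORS)  # single operator per block
        candidates = list(range(-1, block_idx))  # raw input + earlier blocks
        inputs = [c for c in candidates if rng.random() < 0.5]
        if not inputs:
            inputs = [rng.choice(candidates)]  # guarantee one connection
        subnet.append({"op": op, "inputs": inputs})
    return subnet

if __name__ == "__main__":
    for i, blk in enumerate(sample_subnet(4, random.Random(0))):
        print(i, blk["op"], blk["inputs"])
```

In a weight-sharing setup, each sampled sub-architecture would reuse the supernet's parameters for its chosen operators, so training one supernet amortizes the cost of evaluating many candidate models.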