DailyMAE: Towards Pretraining Masked Autoencoders in One Day
arXiv (2024)
Abstract
Recently, masked image modeling (MIM), an important self-supervised learning
(SSL) method, has drawn attention for its effectiveness in learning data
representations from unlabeled data. Numerous studies underscore the advantages
of MIM, highlighting how models pretrained on extensive datasets can enhance
the performance of downstream tasks. However, the high computational demands of
pretraining pose significant challenges, particularly within academic
environments, thereby impeding progress in SSL research. In this study, we
propose efficient training recipes for MIM-based SSL that focus on mitigating
data-loading bottlenecks and employ progressive training and other techniques
to closely maintain pretraining performance. Our library enables training an
MAE-Base/16 model on the ImageNet-1K dataset for 800 epochs within just 18
hours on a single machine equipped with 8 A100 GPUs. By achieving speedups of
up to 5.8x, this work not only demonstrates the feasibility of high-efficiency
SSL training but also paves the way for broader accessibility and promotes
advancement in SSL research, particularly for prototyping and initial testing
of SSL ideas. The code is available at https://github.com/erow/FastSSL.
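
The abstract does not spell out the recipe, but the two levers it names are faster data loading and progressive training. Below is a minimal sketch, assuming a standard PyTorch/torchvision setup, of what a progressive-resolution schedule combined with a tuned DataLoader might look like. The function names, the 128-to-224 resolution range, and the loader settings are illustrative assumptions, not the paper's actual implementation; see the FastSSL repository for the real recipe.

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def resolution_for_epoch(epoch, total_epochs=800, low=128, high=224):
    # Linearly grow the input resolution over training: small early images
    # cut per-step compute, and full resolution at the end preserves final
    # representation quality. The 128->224 schedule is hypothetical.
    frac = epoch / max(total_epochs - 1, 1)
    res = int(low + frac * (high - low))
    return (res // 16) * 16  # keep divisible by the Base/16 patch size

def make_loader(root, epoch):
    res = resolution_for_epoch(epoch)
    transform = transforms.Compose([
        transforms.RandomResizedCrop(res, scale=(0.2, 1.0)),  # MAE-style crop
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
    ])
    dataset = datasets.ImageFolder(root, transform=transform)
    # Settings aimed at the data-loading bottleneck: many worker processes,
    # pinned memory for fast host-to-GPU copies, persistent workers so they
    # are not respawned every epoch, and deeper prefetching per worker.
    return DataLoader(dataset, batch_size=256, shuffle=True,
                      num_workers=16, pin_memory=True,
                      persistent_workers=True, prefetch_factor=4)

Shrinking the input resolution early in training is one common way such speedups are realized; the actual library may combine this with pre-decoded data formats or caching to keep the GPUs saturated.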