JaxPruner: A concise library for sparsity research

Joo Hyung Lee, Wonpyo Park, Nicole Mitchell, Jonathan Pilault, Johan Obando-Ceron, Han-Byul Kim, Namhoon Lee, Elias Frantar, Yun Long, Amir Yazdanbakhsh, Shivani Agrawal, Suvinay Subramanian, Xin Wang, Sheng-Chun Kao, Xingyao Zhang, Trevor Gale, Aart Bik, Woohyun Han, Milen Ferev, Zhonglin Han, Hong-Seok Kim, Yann Dauphin, Karolina Dziugaite, Pablo Samuel Castro, Utku Evci

CoRR (2023)

Abstract
This paper introduces JaxPruner, an open-source JAX-based pruning and sparse training library for machine learning research. JaxPruner aims to accelerate research on sparse neural networks by providing concise implementations of popular pruning and sparse training algorithms with minimal memory and latency overhead. Algorithms implemented in JaxPruner use a common API and work seamlessly with the popular optimization library Optax, which, in turn, enables easy integration with existing JAX-based libraries. We demonstrate this ease of integration by providing examples in four different codebases: Scenic, t5x, Dopamine, and FedJAX, and we provide baseline experiments on popular benchmarks.
Keywords
concise library