The Case for Storage Optimization Decoupling in Deep Learning Frameworks

2021 IEEE International Conference on Cluster Computing (CLUSTER)

Abstract
Deep Learning (DL) training requires efficient access to large collections of data, leading DL frameworks to implement individual I/O optimizations to take full advantage of storage performance. However, these optimizations are intrinsic to each framework, limiting their applicability and portability across DL solutions, while making them inefficient for scenarios where multiple applications compete…
Keywords
Training, Deep learning, Limiting, Conferences, Prototypes, Computer architecture, Cluster computing
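
The abstract's central idea, moving I/O optimizations out of individual DL frameworks and into a shared, framework-agnostic storage layer, can be illustrated with a minimal sketch. The paper's actual prototype is not reproduced here; the class and method names below (SharedStorageLayer, read, prefetch) are hypothetical, and the sketch assumes a simple LRU cache with background prefetching as the shared optimization.

```python
# Hypothetical sketch of storage optimization decoupling: caching and
# prefetching live in one shared layer that any DL framework can call,
# instead of being reimplemented inside each framework. All names here
# are illustrative assumptions, not the paper's prototype API.
import threading
from collections import OrderedDict

class SharedStorageLayer:
    """Framework-agnostic reader with an LRU cache and async prefetch."""

    def __init__(self, capacity_bytes=1 << 30):
        self._cache = OrderedDict()   # path -> bytes, in LRU order
        self._capacity = capacity_bytes
        self._used = 0
        self._lock = threading.Lock()

    def read(self, path):
        with self._lock:
            data = self._cache.get(path)
            if data is not None:              # cache hit: refresh LRU order
                self._cache.move_to_end(path)
                return data
        with open(path, "rb") as f:           # cache miss: read from storage
            data = f.read()
        with self._lock:
            if path not in self._cache:       # another thread may have filled it
                self._cache[path] = data
                self._used += len(data)
                while self._used > self._capacity:   # evict least recently used
                    _, old = self._cache.popitem(last=False)
                    self._used -= len(old)
        return data

    def prefetch(self, paths):
        """Warm the cache in the background for an upcoming batch."""
        t = threading.Thread(target=lambda: [self.read(p) for p in paths],
                             daemon=True)
        t.start()
        return t
```

Under this assumed design, any framework's input pipeline (for example, a PyTorch Dataset's __getitem__) could delegate reads to one shared instance, so the caching and prefetching logic is implemented once and shared across training jobs that compete for the same storage, rather than duplicated per framework.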