Exploring the Design Space of Efficient Deep Neural Networks

arXiv (2020)

Abstract
This paper gives an overview of our ongoing work on the design space exploration of efficient deep neural networks (DNNs), focusing on novel optimization perspectives that past work has largely overlooked. We cover two complementary aspects of efficient DNN design: (1) static architecture design efficiency and (2) dynamic model execution efficiency. For static architecture design, one of the major challenges of NAS is its low search efficiency. Unlike the current mainstream approach of optimizing the search algorithm itself, we identify a new perspective: efficient search space design. For dynamic model execution, current optimization methods still target model structure redundancy, e.g., weight/filter pruning and connection pruning. We instead identify a new dimension: DNN feature map redundancy. By exposing these new perspectives, we suggest that further gains can potentially be attained by integrating existing optimizations with the perspectives proposed here.
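To make the feature-map-redundancy idea concrete, below is a minimal sketch, not the paper's algorithm, of one plausible way to quantify redundancy among activation channels: score each channel by its maximum cosine similarity to any other channel, so near-duplicate channels become candidates for cheap runtime skipping. The function name, threshold, and tensor shapes are illustrative assumptions.

```python
# Illustrative sketch only: measures pairwise channel similarity in an
# activation tensor as a proxy for feature-map redundancy. Not taken from
# the paper; names and the 0.9 threshold are assumptions.
import torch
import torch.nn.functional as F

def channel_redundancy(feature_map: torch.Tensor, threshold: float = 0.9):
    """feature_map: (N, C, H, W) activations from one layer.

    Returns per-channel scores (max cosine similarity to any other channel,
    averaged over the batch) and a boolean redundancy mask.
    """
    n, c, h, w = feature_map.shape
    flat = feature_map.reshape(n, c, h * w)        # (N, C, HW)
    flat = F.normalize(flat, dim=-1)               # unit-norm each channel
    sim = torch.bmm(flat, flat.transpose(1, 2))    # (N, C, C) cosine sims
    sim = sim - torch.eye(c, device=sim.device)    # zero out self-similarity
    scores = sim.max(dim=-1).values.mean(dim=0)    # (C,) batch-averaged
    return scores, scores > threshold

# Usage: random activations stand in for a real layer output; channel 1 is
# planted as a near-duplicate of channel 0 and should be flagged.
x = torch.randn(8, 32, 14, 14)
x[:, 1] = x[:, 0] + 0.01 * torch.randn(8, 14, 14)
scores, mask = channel_redundancy(x)
print("redundant channels:", mask.nonzero().flatten().tolist())
```

A dynamic-execution scheme could use such scores to gate redundant channels at inference time, in contrast to weight/filter pruning, which removes structure statically.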
Keywords
efficient deep neural networks, design space exploration, efficient DNN design, efficient search space design, NAS, search algorithm optimization, dynamic model execution efficiency, static architecture design efficiency