Efficient multi-objective neural architecture search framework via policy gradient algorithm

Bo Lyu, Yin Yang, Yuting Cao, Pengcheng Wang, Jian Zhu, Jingfei Chang, Shiping Wen

Information Sciences (2024)

Abstract
Differentiable architecture search plays a prominent role in Neural Architecture Search (NAS) and exhibits better efficiency than traditional heuristic NAS methods, including those based on evolutionary algorithms (EA) and reinforcement learning (RL). However, differentiable NAS methods struggle with non-differentiable objectives such as energy efficiency, resource constraints, and other non-differentiable metrics, especially in multi-objective search scenarios. While multi-objective NAS research addresses these challenges, the individual training required for each candidate architecture demands significant computational resources. To bridge this gap, this work combines the efficiency of differentiable NAS with the metric compatibility of multi-objective NAS. Architectures are discretely sampled via the architecture parameter α within the differentiable NAS framework, and α is directly optimized by the policy gradient algorithm. This approach eliminates the need to learn a separate sampling controller and accommodates non-differentiable metrics. We provide an efficient NAS framework that can be readily customized for real-world multi-objective NAS (MNAS) scenarios, encompassing factors such as resource limitations and platform specialization. Notably, compared with other multi-objective NAS methods, our framework markedly reduces the computational burden (just 1/6 of the cost of NSGA-Net). The search framework is also compatible with other efficiency- and performance-improvement strategies under the differentiable NAS framework.
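The core idea in the abstract, sampling discrete architectures from the architecture parameter α and updating α directly with a policy gradient on a multi-objective reward, can be illustrated with a toy sketch. Everything below (the edge/op counts, the per-op accuracy and cost tables, the reward weighting, and the moving-average baseline) is a hypothetical stand-in for illustration, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical toy search space: 4 edges, 3 candidate ops per edge.
# The per-op "accuracy" and "cost" lookup tables are invented proxies;
# in a real MNAS setting these would come from evaluating the sampled subnet.
N_EDGES, N_OPS = 4, 3
acc = rng.uniform(0.5, 1.0, size=(N_EDGES, N_OPS))   # proxy accuracy contribution
cost = rng.uniform(0.1, 1.0, size=(N_EDGES, N_OPS))  # proxy resource cost

alpha = np.zeros((N_EDGES, N_OPS))  # architecture parameters (the policy logits)
lr, lam = 0.5, 0.5                  # learning rate, accuracy/cost trade-off weight
baseline = 0.0                      # moving-average reward baseline

for step in range(300):
    probs = np.apply_along_axis(softmax, 1, alpha)
    # Discretely sample one op per edge from the categorical policy over α.
    ops = np.array([rng.choice(N_OPS, p=p) for p in probs])
    # Scalarized multi-objective reward: the metrics only need to be
    # evaluable on the sampled architecture, not differentiable w.r.t. α.
    idx = np.arange(N_EDGES)
    reward = acc[idx, ops].mean() - lam * cost[idx, ops].mean()
    baseline = 0.9 * baseline + 0.1 * reward
    # REINFORCE update: ∇_α log π(ops) = onehot(ops) − probs per edge.
    grad = -probs
    grad[idx, ops] += 1.0
    alpha += lr * (reward - baseline) * grad

# Final discretization: pick the most probable op on each edge.
best = probs.argmax(axis=1)
```

Because the reward is computed on a sampled discrete architecture, any metric that can be measured (latency, energy, parameter count) can enter the reward, which is exactly why no differentiability, and no separately learned controller network, is required.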
Keywords
Neural architecture search, Reinforcement learning, Non-differentiable, Supernetwork