Select High-Level Features: Efficient Experts from a Hierarchical Classification Network

André Kelm, Niels Hannemann, Bruno Heberle, Lucas Schmidt, Tim Rolff, Christian Wilms, Ehsan Yaghoubi, Simone Frintrop

arXiv (2024)

Abstract
This study introduces a novel expert generation method that dynamically reduces task and computational complexity without compromising predictive performance. It is based on a new hierarchical classification network topology that combines sequential processing of generic low-level features with parallelism and nesting of high-level features. This structure enables a novel extraction technique: the ability to select only the high-level features of task-relevant categories. In certain cases, almost all unneeded high-level features can be skipped, which significantly reduces inference cost and is highly beneficial under resource-constrained conditions. We believe this method paves the way for future network designs that are lightweight and adaptable, making them suitable for a wide range of applications, from compact edge devices to large-scale clouds. In terms of dynamic inference, our methodology can exclude up to 88.7 % of parameters and reduce giga multiply-accumulate (GMAC) operations by 73.4 %; analysis against comparative baselines shows an average reduction of 47.6 % in parameters and 5.8 % in GMACs across the cases we evaluated.
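To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of the general pattern the abstract describes: a shared sequential low-level backbone feeding parallel per-category high-level expert branches, where at inference only the experts for task-relevant categories are evaluated. All class names, layer sizes, and the `active` argument are hypothetical illustrations.

```python
# Hypothetical sketch: shared low-level features + selectable high-level experts.
import torch
import torch.nn as nn


class HierarchicalExpertNet(nn.Module):
    def __init__(self, num_categories: int, low_dim: int = 64, high_dim: int = 128):
        super().__init__()
        # Sequential extractor of generic low-level features, shared by all categories.
        self.low_level = nn.Sequential(
            nn.Conv2d(3, low_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(low_dim, low_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Parallel high-level experts, one per category (sizes are illustrative).
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(low_dim, high_dim, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(high_dim, 1),  # per-category score
            )
            for _ in range(num_categories)
        )

    def forward(self, x, active=None):
        """Run only the experts whose indices appear in `active`; None runs all."""
        feats = self.low_level(x)
        if active is None:
            active = list(range(len(self.experts)))
        # Unselected experts are never executed, so their parameters and
        # multiply-accumulate operations are skipped at inference time.
        return {i: self.experts[i](feats) for i in active}


if __name__ == "__main__":
    net = HierarchicalExpertNet(num_categories=10)
    imgs = torch.randn(2, 3, 64, 64)
    # Task-relevant subset: only the experts for categories 2 and 7 are evaluated.
    scores = net(imgs, active=[2, 7])
    print({k: v.shape for k, v in scores.items()})
```

The savings reported in the abstract arise from exactly this kind of selective execution: the cost of the shared low-level stage is fixed, while the high-level cost scales with the number of selected experts rather than the total number of categories.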