Dual-DIANet: A sharing-learnable multi-task network based on dense information aggregation

Neurocomputing (2024)

Abstract
Designing neural networks that balance shared features and task-specific ones is a major challenge in multi-task learning. To address this issue, we aggregate information from both multi-scale and multi-level perspectives for a more comprehensive understanding of the multi-task features, which enables the network to learn a better sharing strategy. Specifically, we first introduce a basic Multi-scale-focused Dense Information Aggregation Network, in which learnable fusion modules connect two task-biased subnets and control feature sharing. The fusion modules, consisting of densely cascaded dilated convolutions and scale-wise attention blocks, adaptively aggregate multi-scale information for each task. To further exploit multi-level information, the modules at different levels are densely interconnected and guided by auxiliary supervision. By combining these two aspects of information aggregation, we finally propose a Dual Dense Information Aggregation Network with a strong ability to learn appropriate sharing. Comprehensive experiments on NYUDv2, SUN RGB-D, and Mini-Taskonomy demonstrate the effectiveness of our method.
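The scale-wise attention described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function below, its name, and its pooling/softmax choices are assumptions for illustration only. It takes feature maps produced at several dilation scales, derives one attention weight per scale via global average pooling and a softmax, and returns their weighted sum.

```python
import numpy as np

def scale_wise_attention(features):
    """Aggregate multi-scale feature maps with scale-wise attention.

    Illustrative sketch only (not the paper's implementation): each
    entry of `features` has shape (C, H, W) and stands for the output
    of one dilated convolution in the cascaded fusion module.
    """
    stacked = np.stack(features)                  # (S, C, H, W)
    # One scalar descriptor per scale: global average pooling.
    descriptors = stacked.mean(axis=(1, 2, 3))    # (S,)
    # Softmax over the scale axis gives the attention weights.
    weights = np.exp(descriptors - descriptors.max())
    weights /= weights.sum()
    # Weighted sum over scales yields the aggregated feature map.
    return np.tensordot(weights, stacked, axes=1)  # (C, H, W)

# Toy usage: three "scales" of a 4-channel 8x8 feature map.
feats = [np.random.rand(4, 8, 8) for _ in range(3)]
fused = scale_wise_attention(feats)
```

In the paper's network such a block sits inside each fusion module; a learned variant would replace the fixed pooling/softmax with trainable projections.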
Keywords
Deep learning,Multi-task learning,Information aggregation