Feature Attention Parallel Aggregation Network for Single Image Haze Removal

IEEE Access (2022)

Citations: 2 | Views: 0
Abstract
Images captured in hazy weather often suffer from color distortion and texture blur due to turbid media suspended in the atmosphere. In this paper, we propose a Feature Attention Parallel Aggregation Network (FAPANet) to restore a clear image directly from the corresponding hazy input. It adopts an encoder-decoder structure and incorporates residual learning and attention mechanisms. FAPANet consists of two key modules: a novel feature attention aggregation module (FAAM) and an adaptive feature fusion module (AFFM). FAAM recalibrates features by applying channel attention and pixel attention in parallel, emphasizing informative features and suppressing redundant ones. Because the shallow and deep layers of a neural network tend to capture low-level and high-level semantic features, respectively, AFFM is introduced to fuse these two kinds of features adaptively. Meanwhile, a joint loss function, composed of L1 loss, perceptual loss, and structural similarity (SSIM) loss, is employed during training to produce results with more vivid colors and richer details. Comprehensive experiments on both synthetic and real-world images demonstrate the strong performance of the proposed approach.
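The abstract names two concrete ingredients: parallel channel/pixel attention with residual aggregation (FAAM), and a joint L1 + perceptual + SSIM training loss. The following is a minimal PyTorch sketch of those two ideas only, assuming common building blocks; the layer sizes, reduction ratio, loss weights, and the `vgg_features`/`ssim` callables are illustrative placeholders, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FAAM(nn.Module):
    """Sketch of a feature attention aggregation module: channel attention and
    pixel attention are computed in parallel on the same input, aggregated, and
    added back via a residual connection. Hyperparameters are assumptions."""

    def __init__(self, channels, reduction=8):
        super().__init__()
        # Channel attention branch: global average pooling -> 1x1 convs -> sigmoid
        self.channel_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Pixel (spatial) attention branch: per-pixel attention map
        self.pixel_att = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        self.fuse = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        ca = x * self.channel_att(x)    # recalibrate channel responses
        pa = x * self.pixel_att(x)      # recalibrate spatial positions
        return x + self.fuse(ca + pa)   # aggregate both branches, residual learning


def joint_loss(pred, target, vgg_features, ssim, w_perc=0.04, w_ssim=0.5):
    """L1 + perceptual + SSIM loss. `vgg_features` and `ssim` are user-supplied
    callables; the weights here are placeholders, not the paper's values."""
    l1 = F.l1_loss(pred, target)
    perceptual = F.l1_loss(vgg_features(pred), vgg_features(target))
    ssim_term = 1.0 - ssim(pred, target)  # ssim returns similarity in [0, 1]
    return l1 + w_perc * perceptual + w_ssim * ssim_term
```

In this reading, the two attention maps are applied to the same input rather than chained, which is one plausible interpretation of "parallel" aggregation; the residual connection keeps the module easy to stack inside an encoder-decoder.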
Keywords
Atmospheric modeling, Image restoration, Image color analysis, Scattering, Meteorology, Degradation, Visualization, Attention mechanism, deep learning, feature fusion, single image dehazing