WeatherProof: Leveraging Language Guidance for Semantic Segmentation in Adverse Weather

Blake Gella, Howard Zhang, Rishi Upadhyay, Tiffany Chang, Nathan Wei, Matthew Waliman, Yunhao Bao, Celso de Melo, Alex Wong, Achuta Kadambi

arXiv (2024)

Abstract
We propose a method to infer semantic segmentation maps from images captured under adverse weather conditions. We begin by examining existing models on images degraded by weather conditions such as rain, fog, or snow, and find that they exhibit a large performance drop compared to models evaluated on images captured under clear weather. To control for changes in scene structure, we propose WeatherProof, the first semantic segmentation dataset with accurately paired clear and adverse weather images that share an underlying scene. Using this dataset, we analyze the error modes of existing models and find that they are sensitive to the highly complex combination of different weather effects induced on the image during capture. To improve robustness, we propose using language as guidance: we identify the contributions of individual adverse weather conditions and inject them as "side information" into the model. Models trained with our language guidance exhibit performance gains of up to 10.2% on WeatherProof, up to 8.44% over standard training techniques, and up to 6.21% over previous SOTA methods.
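The abstract describes injecting language-derived weather information as "side information" into a segmentation model, but does not spell out the mechanism. Below is a minimal, hypothetical PyTorch sketch of one plausible realization: text embeddings of weather descriptions (e.g. produced by a CLIP-style text encoder, stubbed here with random tensors) are fused into a backbone feature map via cross-attention. All module names, dimensions, and the fusion scheme are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch (not the authors' code): fuse language-derived weather
# "side information" into segmentation backbone features via cross-attention.
import torch
import torch.nn as nn


class LanguageGuidedFusion(nn.Module):
    """Cross-attends image features to weather-prompt text embeddings.

    Shapes and layer choices are assumptions made for this sketch.
    """

    def __init__(self, feat_dim: int = 256, text_dim: int = 512, num_heads: int = 8):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, feat_dim)  # project text to feature dim
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(feat_dim)

    def forward(self, image_feats: torch.Tensor, text_embeds: torch.Tensor) -> torch.Tensor:
        # image_feats: (B, C, H, W) backbone feature map
        # text_embeds: (B, K, text_dim) embeddings of K weather prompts,
        #              e.g. "an image degraded by rain", "... by fog", "... by snow"
        b, c, h, w = image_feats.shape
        queries = image_feats.flatten(2).transpose(1, 2)    # (B, H*W, C)
        keys = self.text_proj(text_embeds)                  # (B, K, C)
        attended, _ = self.cross_attn(queries, keys, keys)  # (B, H*W, C)
        fused = self.norm(queries + attended)               # residual fusion
        return fused.transpose(1, 2).reshape(b, c, h, w)


# Usage with dummy tensors standing in for a text encoder and a backbone.
fusion = LanguageGuidedFusion()
img_feats = torch.randn(2, 256, 32, 32)   # backbone feature map
txt_embeds = torch.randn(2, 3, 512)       # 3 weather prompts per image
out = fusion(img_feats, txt_embeds)
print(out.shape)  # torch.Size([2, 256, 32, 32])
```

The fused features would then be passed to the segmentation head in place of the original backbone features, so the decoder can condition on the estimated weather composition.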