CrossNorm and SelfNorm for Generalization under Distribution Shifts

2021 IEEE/CVF International Conference on Computer Vision (ICCV)

Abstract
Traditional normalization techniques (e.g., Batch Normalization and Instance Normalization) generally and simplistically assume that training and test data follow the same distribution. Since distribution shifts are inevitable in real-world applications, models trained with these normalization methods can perform poorly in new environments. Can we develop new normalization methods to improve generalization ...
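The paper's core idea involves manipulating channel-wise feature statistics (mean and standard deviation) to broaden the training distribution. Below is a minimal, illustrative sketch of that statistic-exchange idea in plain NumPy; the function and variable names are assumptions of this sketch, not the authors' released code, and the official implementation may differ in details such as where in the network the operation is applied.

```python
import numpy as np

def cross_norm_pair(x_a, x_b, eps=1e-5):
    """Swap channel-wise mean/std between two feature maps of shape (C, H, W)."""
    def stats(x):
        mean = x.mean(axis=(1, 2), keepdims=True)
        std = x.std(axis=(1, 2), keepdims=True) + eps
        return mean, std

    mean_a, std_a = stats(x_a)
    mean_b, std_b = stats(x_b)

    # Normalize each map with its own statistics, then re-style it
    # with the other map's statistics.
    a_styled_as_b = (x_a - mean_a) / std_a * std_b + mean_b
    b_styled_as_a = (x_b - mean_b) / std_b * std_a + mean_a
    return a_styled_as_b, b_styled_as_a

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, b = rng.normal(size=(2, 16, 8, 8))
    out_a, out_b = cross_norm_pair(a, b)
    print(out_a.shape, out_b.shape)  # (16, 8, 8) (16, 8, 8)
```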
Key words
Training, Bridges, Computer vision, Codes, Robustness, Task analysis