Cyclic Self-Training with Proposal Weight Modulation for Cross-Supervised Object Detection

IEEE Transactions on Image Processing (2023)

Weakly-supervised object detection (WSOD), which requires only image-level annotations for training detectors, has gained enormous attention. Despite recent rapid advances in WSOD, a large performance gap remains compared with fully-supervised object detection. To narrow this gap, we study cross-supervised object detection (CSOD), where existing classes (base classes) have instance-level annotations while newly added classes (novel classes) need only image-level annotations. To improve localization accuracy, we propose a Cyclic Self-Training (CST) method that introduces instance-level supervision into a commonly used WSOD method, online instance classifier refinement (OICR). CST consists of forward pseudo labeling and backward pseudo labeling. Specifically, OICR exploits the forward pseudo labeling to generate pseudo ground-truth bounding boxes for all classes, enabling instance classifier training. The backward pseudo labeling then generates higher-quality pseudo ground-truth bounding boxes for novel classes by fusing the predictions of the instance classifiers. As a result, both novel and base classes have bounding-box annotations for training, alleviating the supervision inconsistency between base and novel classes. In the forward pseudo labeling, the generated pseudo ground-truths may be misaligned with objects and thus introduce poor-quality examples for training the instance classifiers. To reduce the impact of these poor-quality training examples, we propose a Proposal Weight Modulation (PWM) module, learned in a class-agnostic and contrastive manner by exploiting the bounding-box annotations of base classes. Experiments on the PASCAL VOC and MS COCO datasets demonstrate the superiority of the proposed method.
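The cyclic labeling scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: proposal scores are plain nested lists, forward pseudo labeling picks the top-scoring proposal per present class, backward pseudo labeling fuses the K refinement classifiers by simple score averaging, and the learned PWM module is replaced by a hand-crafted IoU-based weight purely as a stand-in for down-weighting misaligned proposals.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def forward_pseudo_labels(proposals, scores, image_labels):
    """Forward pseudo labeling: for each class present at the image level,
    take the highest-scoring proposal as the pseudo ground-truth box."""
    pseudo = {}
    for c in image_labels:
        best = max(range(len(proposals)), key=lambda i: scores[i][c])
        pseudo[c] = proposals[best]
    return pseudo

def backward_pseudo_labels(proposals, classifier_scores, novel_labels):
    """Backward pseudo labeling: fuse the K instance classifiers' scores
    (here by averaging) and re-select pseudo boxes for the novel classes."""
    K = len(classifier_scores)
    num_classes = len(classifier_scores[0][0])
    fused = [[sum(classifier_scores[k][i][c] for k in range(K)) / K
              for c in range(num_classes)]
             for i in range(len(proposals))]
    return forward_pseudo_labels(proposals, fused, novel_labels)

def proposal_weights(proposals, pseudo_box):
    """Hand-crafted stand-in for PWM: down-weight proposals poorly aligned
    with the pseudo ground-truth (the paper instead learns these weights
    contrastively from base-class box annotations)."""
    return [iou(p, pseudo_box) for p in proposals]
```

In the full method these steps alternate each refinement round: forward labeling supervises the instance classifiers, and their fused outputs feed back as cleaner boxes for the novel classes, so both base and novel classes end up with box-level training targets.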