Cohesion Intensive Hash Code Book Coconstruction for Efficiently Localizing Sketch Depicted Scenes

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING (2022)

Abstract
We investigate the problem of efficiently localizing sketch-depicted scenes in a remote sensing image dataset. We pose the problem as one of remote sensing image retrieval with sketch queries and explore the use of hashing techniques to achieve efficient retrieval. Given two training datasets of sketches and remote sensing images that share a common set of class labels, we develop a hashing strategy that coconstructs two hash code books, one for the sketches and one for the remote sensing images. The hash code book coconstruction strategy encourages hash codes for sketches and remote sensing images from different classes to be far from one another and those from the same class to be close. This property is maintained by two cohesion-intensive cues: 1) an interclass pairwise disperse cue (InterPDC) and 2) an intraclass pairwise balance cue (IntraPBC). We use the two coconstructed hash code books to train two linear mapping models that generate hash codes for sketches and remote sensing images separately. Sorting the Hamming distances between the sketch hash codes and the remote sensing image hash codes yields efficient remote sensing image retrieval with sketch queries, which enables localizing the sketch-depicted scenes in the remote sensing image dataset. In addition, our method can also be used to quickly localize sketch-depicted scenes within a single large remote sensing image. Extensive experiments on public datasets validate the effectiveness and efficiency of our method.
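As a rough illustration of the retrieval step described above, the sketch below (hypothetical Python; the projection matrices and random features are placeholders for the learned linear mapping models and real sketch/image features, not the authors' implementation) shows how binary hash codes can be produced by a sign-thresholded linear mapping and ranked by Hamming distance:

```python
import numpy as np

rng = np.random.default_rng(0)

feat_dim, code_len, num_images = 512, 64, 1000

# Assumed linear mapping models; in the paper these are learned from the
# co-constructed hash code books. Random matrices are used here only to
# make the example runnable.
W_sketch = rng.standard_normal((feat_dim, code_len))
W_image = rng.standard_normal((feat_dim, code_len))

def hash_codes(features, W):
    """Map real-valued features to {0, 1} hash codes via a linear projection."""
    return (features @ W > 0).astype(np.uint8)

sketch_feat = rng.standard_normal((1, feat_dim))            # one sketch query
image_feats = rng.standard_normal((num_images, feat_dim))   # image dataset

query_code = hash_codes(sketch_feat, W_sketch)
db_codes = hash_codes(image_feats, W_image)

# Hamming distance = number of differing bits; sorting in ascending order
# retrieves the remote sensing images closest to the sketch query.
hamming = np.count_nonzero(db_codes != query_code, axis=1)
ranking = np.argsort(hamming)

print("Top-5 retrieved image indices:", ranking[:5])
```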
Keywords
Remote sensing, Image retrieval, Codes, Sensors, Task analysis, Convolutional neural networks, Bibliographies, Cohesion intensive, hash code books, remote sensing image retrieval with sketch queries