Sampling Informative Positive Pairs in Contrastive Learning

2023 International Conference on Sampling Theory and Applications (SampTA)

Abstract
Contrastive Learning is a paradigm for learning representation functions that recover useful similarity structure in a dataset based on samples of positive (similar) and negative (dissimilar) instances. The quality of the learned representations depends crucially on the degree to which the strategies for sampling positive and negative instances reflect useful structure in the data. Typically, positive instances are sampled by randomly perturbing an anchor point using some form of data augmentation. However, not all randomly sampled positive instances are equally effective. In this paper, we analyze strategies for sampling more effective positive instances. We consider a setting where class structure in the observed data derives from analogous structure in an unobserved latent space. We propose active sampling approaches for positive instances and investigate their role in effectively learning representation functions which recover the class structure in the underlying latent space.
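
To make the setting described in the abstract concrete, the following is a minimal PyTorch sketch of a standard InfoNCE contrastive loss combined with an active positive-sampling step: several augmented views of an anchor are drawn, and the view the current encoder finds hardest (least similar to the anchor) is kept as the positive. The function names (encoder, augment, sample_informative_positive) and the hardest-positive criterion are illustrative assumptions, not the authors' actual procedure, which the abstract does not specify.

import torch
import torch.nn.functional as F

def info_nce_loss(anchor_z, positive_z, negative_z, temperature=0.1):
    # Standard InfoNCE loss for a batch of anchors:
    # one positive per anchor (index 0 in the logits) and K negatives.
    pos_sim = F.cosine_similarity(anchor_z, positive_z, dim=-1) / temperature          # (B,)
    neg_sim = F.cosine_similarity(anchor_z.unsqueeze(1), negative_z, dim=-1) / temperature  # (B, K)
    logits = torch.cat([pos_sim.unsqueeze(1), neg_sim], dim=1)                          # (B, 1+K)
    labels = torch.zeros(anchor_z.size(0), dtype=torch.long, device=anchor_z.device)
    return F.cross_entropy(logits, labels)

def sample_informative_positive(encoder, anchor_x, augment, num_candidates=8):
    # Active positive sampling (illustrative heuristic): draw several augmented
    # views of each anchor and keep the one whose embedding is least similar to
    # the anchor under the current encoder, i.e. the "hardest" positive.
    with torch.no_grad():
        anchor_z = F.normalize(encoder(anchor_x), dim=-1)                                # (B, D)
        candidates = torch.stack(
            [augment(anchor_x) for _ in range(num_candidates)], dim=1                   # (B, C, ...)
        )
        B, C = candidates.shape[:2]
        cand_z = F.normalize(encoder(candidates.flatten(0, 1)), dim=-1).view(B, C, -1)   # (B, C, D)
        sims = torch.einsum("bd,bcd->bc", anchor_z, cand_z)                              # (B, C)
        hardest = sims.argmin(dim=1)                                                     # per-anchor hardest view
    return candidates[torch.arange(B), hardest]                                          # (B, ...)

In a SimCLR-style training loop, the selected positive would replace the single random augmentation: the anchor and its hardest candidate are encoded, negatives are taken from the rest of the batch, and info_nce_loss is minimized. Other selection criteria (e.g. sampling proportional to difficulty rather than taking the argmin) fit the same interface.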
Keywords
Contrastive Learning, Active Learning