Set-Encoder: Permutation-Invariant Inter-Passage Attention for Listwise Passage Re-Ranking with Cross-Encoders
arXiv (2024)
Abstract
Cross-encoders are effective passage re-rankers. However, when re-ranking
multiple passages at once, existing cross-encoders must inefficiently optimize
the output ranking over several input permutations, because their passage
interactions are not permutation-invariant. Moreover, their high memory
footprint limits the number of passages that can be processed during listwise
training. To address these issues, we propose the Set-Encoder, a new
cross-encoder architecture that (1) introduces inter-passage attention with
parallel passage processing to ensure permutation invariance between input
passages, and (2) uses fused-attention kernels to enable training with more
passages at a time. In experiments on TREC Deep Learning and TIREx, the
Set-Encoder is more effective than previous cross-encoders with a similar
number of parameters. Compared to larger models, the Set-Encoder is more
efficient and either on par or even more effective.
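The core architectural idea, attention among passages with no positional
information over the passage axis, can be illustrated in a few lines. The
following is a minimal, hypothetical PyTorch sketch, not the paper's actual
implementation: it assumes each query-passage pair has already been encoded
independently into a single embedding (e.g., a [CLS] vector), and it applies
one attention round over these embeddings without passage-order encodings, so
permuting the input passages simply permutes the output scores.

```python
# Hypothetical sketch of permutation-invariant inter-passage attention.
# All names (InterPassageAttentionSketch, d_model, etc.) are illustrative
# assumptions, not identifiers from the Set-Encoder codebase.
import torch
import torch.nn as nn


class InterPassageAttentionSketch(nn.Module):
    """One attention round among per-passage embeddings.

    No positional encoding is added over the passage axis, and attention
    is a weighted sum over an unordered set, so permuting the input
    passages permutes the outputs identically (permutation equivariance).
    The ranking induced by the scores is therefore permutation-invariant.
    """

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.score = nn.Linear(d_model, 1)

    def forward(self, passage_emb: torch.Tensor) -> torch.Tensor:
        # passage_emb: (batch, num_passages, d_model), assumed to be the
        # [CLS] embedding of each independently encoded query-passage pair.
        mixed, _ = self.attn(passage_emb, passage_emb, passage_emb)
        return self.score(mixed).squeeze(-1)  # (batch, num_passages)


if __name__ == "__main__":
    torch.manual_seed(0)
    model = InterPassageAttentionSketch().eval()
    x = torch.randn(1, 5, 256)
    perm = torch.randperm(5)
    with torch.no_grad():
        scores = model(x)
        scores_perm = model(x[:, perm])
    # Scores follow the permutation, so the output ranking is unchanged.
    assert torch.allclose(scores[:, perm], scores_perm, atol=1e-5)
```

Because attention over an unordered set is permutation-equivariant, the final
assertion holds: re-ordering the input passages re-orders the scores in
lockstep, which is the invariance property the abstract attributes to the
Set-Encoder's inter-passage attention.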