Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics
CoRR (2024)
Abstract
This study introduces a novel transformer model optimized for large-scale
point cloud processing in scientific domains such as high-energy physics (HEP)
and astrophysics. Addressing the limitations of graph neural networks and
standard transformers, our model integrates local inductive bias and achieves
near-linear complexity with hardware-friendly regular operations. One
contribution of this work is a quantitative analysis of the error-complexity
tradeoff of various sparsification techniques for building efficient
transformers. Our findings highlight the superiority of locality-sensitive
hashing (LSH), especially OR & AND-construction LSH, for kernel approximation
on large-scale point cloud data with local inductive bias. Based on this
finding, we propose the LSH-based Efficient Point Transformer (HEPT), which
combines E^2LSH with OR & AND constructions and is built upon regular
computations. HEPT demonstrates remarkable performance on two critical yet
time-consuming HEP tasks, significantly outperforming existing GNNs and
transformers in both accuracy and computational speed, and marking a notable
advance in geometric deep learning and large-scale scientific data processing.
Our code is available at
.
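To make the E^2LSH with OR & AND constructions mentioned above concrete, here is a minimal sketch of how such a scheme can bucket points for sparse attention. It is not the paper's implementation: the class name `E2LSH`, the methods `hash_points` and `buckets`, and the parameters `r`, `n_and`, and `n_or` are illustrative assumptions; only the underlying hash family h(x) = floor((a·x + b) / r) and the standard AND/OR amplification are from the literature.

```python
# Sketch of E^2LSH with AND/OR constructions (illustrative, not HEPT itself).
import numpy as np

class E2LSH:
    """E^2LSH hash family: h(x) = floor((a . x + b) / r), a ~ N(0, I), b ~ U[0, r).

    AND-construction: concatenate n_and hashes, so two points share a bucket
    only if all n_and hashes agree (raises precision).
    OR-construction: build n_or independent tables, so nearby points collide
    in at least one table with high probability (raises recall).
    """

    def __init__(self, dim, r=1.0, n_and=4, n_or=8, seed=0):
        rng = np.random.default_rng(seed)
        # One random projection per (OR table, AND hash): shape (n_or, n_and, dim).
        self.a = rng.normal(size=(n_or, n_and, dim))
        self.b = rng.uniform(0.0, r, size=(n_or, n_and))
        self.r = r

    def hash_points(self, x):
        # x: (n_points, dim) -> integer hash codes of shape (n_or, n_and, n_points).
        proj = np.einsum("otd,nd->otn", self.a, x)
        return np.floor((proj + self.b[..., None]) / self.r).astype(np.int64)

    def buckets(self, x):
        # Collapse the n_and codes of each table into a single bucket id per
        # point, yielding n_or candidate groupings (one per OR table).
        codes = self.hash_points(x)            # (n_or, n_and, n_points)
        out = []
        for table in codes:                    # table: (n_and, n_points)
            # Identical AND-concatenated code columns map to the same bucket id.
            _, bucket_ids = np.unique(table.T, axis=0, return_inverse=True)
            out.append(bucket_ids)
        return out                             # list of n_or arrays, each (n_points,)


if __name__ == "__main__":
    pts = np.random.default_rng(1).normal(size=(1000, 3))
    lsh = E2LSH(dim=3, r=2.0, n_and=4, n_or=8)
    tables = lsh.buckets(pts)
    print(len(tables), tables[0][:10])
```

In an efficient-transformer setting, attention would then be computed only among points that share a bucket in some table; since bucket sizes can be bounded, this restriction is what yields the near-linear complexity and regular, hardware-friendly computation pattern the abstract refers to.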