Lane Transformer: A High-Efficiency Trajectory Prediction Model

Zhibo Wang, Jiayu Guo, Zhengming Hu, Haiqiang Zhang, Junping Zhang, Jian Pu

IEEE Open Journal of Intelligent Transportation Systems (2023)

Abstract
Trajectory prediction is a crucial step in the autonomous-driving pipeline because it not only improves the planning of future routes but also ensures vehicle safety. Building on deep neural networks, numerous trajectory prediction models have been proposed and have achieved high performance on public datasets thanks to well-designed model structures and complex optimization procedures. However, the majority of these methods overlook the fact that vehicles have only limited computing resources for online real-time inference. To tackle this problem, we propose the Lane Transformer, which achieves both high accuracy and high efficiency in trajectory prediction. On the one hand, inspired by the well-known Transformer, we use attention blocks to replace the Graph Convolution Network (GCN) commonly used in trajectory prediction models, drastically reducing the time cost while maintaining accuracy. On the other hand, we construct our prediction model to be compatible with TensorRT, so that it can be further optimized and easily converted into a deployment-friendly TensorRT form. Experiments demonstrate that our model outperforms the baseline LaneGCN model in quantitative prediction accuracy on the Argoverse dataset while being 10× to 25× faster. Our 7 ms inference time is the fastest among all currently available open-source methods. Our code is publicly available at: https://github.com/mmdzb/Lane-Transformer.
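The abstract's key architectural idea is swapping graph convolution for multi-head attention over the scene's lane/agent nodes. The sketch below illustrates that substitution in a minimal, framework-free form: a set of N node features attends to itself with H heads, so every node aggregates information from every other node without an explicit adjacency matrix. All names, shapes, and weight matrices here are illustrative assumptions, not the authors' implementation (see the linked repository for the real code).

```python
import numpy as np

def multi_head_attention(x, num_heads, Wq, Wk, Wv, Wo):
    """Scaled dot-product multi-head self-attention over node features.

    x: (N, d) feature matrix, one row per lane/agent node.
    Wq, Wk, Wv, Wo: (d, d) projection weights (hypothetical parameters).
    Returns an updated (N, d) feature matrix, playing the role that a
    GCN message-passing layer would otherwise play.
    """
    N, d = x.shape
    dh = d // num_heads  # per-head feature width
    # Project and split into heads: (H, N, dh)
    q = (x @ Wq).reshape(N, num_heads, dh).transpose(1, 0, 2)
    k = (x @ Wk).reshape(N, num_heads, dh).transpose(1, 0, 2)
    v = (x @ Wv).reshape(N, num_heads, dh).transpose(1, 0, 2)
    # Attention scores between all node pairs: (H, N, N)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    # Weighted sum of values, then concatenate heads back to (N, d)
    out = (attn @ v).transpose(1, 0, 2).reshape(N, d)
    return out @ Wo

# Toy usage: 6 nodes, 8-dim features, 2 heads.
rng = np.random.default_rng(0)
N, d, H = 6, 8, 2
x = rng.standard_normal((N, d))
Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) * 0.1 for _ in range(4))
y = multi_head_attention(x, H, Wq, Wk, Wv, Wo)
print(y.shape)
```

Unlike a GCN layer, this block involves only dense matrix multiplies and softmaxes, all of which are standard operators with well-optimized TensorRT support, which is consistent with the deployment-friendliness the abstract claims.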
Keywords
Trajectory prediction, transformer, multi-head attention, TensorRT