MTMD: Multi-Scale Temporal Memory Learning and Efficient Debiasing Framework for Stock Trend Forecasting

arXiv (2022)

Abstract
Recently, machine learning methods have shown promise for stock trend forecasting. However, the volatile and dynamic nature of the stock market makes it difficult to apply machine learning techniques directly. Previous methods usually use the temporal information of historical stock price patterns to predict future stock trends, but the multi-scale temporal dependence of financial data and stable trading opportunities remain difficult to capture. The main problem can be ascribed to the challenge of recognizing the patterns of real profit signals amid noisy information. In this paper, we propose a framework called Multi-Scale Temporal Memory Learning and Efficient Debiasing (MTMD). Specifically, through self-similarity, we design a learnable embedding with external attention as a memory block, in order to reduce noise and enhance the temporal consistency of the model. This framework not only aggregates comprehensive local information at each timestamp, but also concentrates globally important historical patterns across the whole time stream. Meanwhile, we also design a graph network based on global and local information to adaptively fuse heterogeneous multi-scale information. Extensive ablation studies and experiments demonstrate that MTMD outperforms state-of-the-art approaches by a significant margin on the benchmark datasets. The source code of our proposed method is available at https://github.com/MingjieWang0606/MDMT-Public.
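The abstract describes a memory block built from a learnable embedding with external attention, shared across timestamps to suppress noise and stabilize temporal patterns. Below is a minimal PyTorch sketch of such an external-attention memory block, following the standard external-attention formulation (two linear layers acting on a learnable slot bank with double normalization). The class name, slot count, feature dimension, and residual connection are illustrative assumptions, not the authors' exact implementation from the linked repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExternalAttentionMemory(nn.Module):
    """Sketch of a memory block using external attention: every timestamp
    attends to a shared bank of learnable memory slots rather than to the
    sequence itself, so recurring patterns are consolidated in the bank."""

    def __init__(self, feat_dim: int = 64, num_slots: int = 32):
        super().__init__()
        # Learnable external memory: key and value projections are
        # independent of the input and shared across all timestamps.
        self.mem_key = nn.Linear(feat_dim, num_slots, bias=False)
        self.mem_value = nn.Linear(num_slots, feat_dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, feat_dim) stock features per timestamp.
        attn = self.mem_key(x)                                  # (batch, seq_len, num_slots)
        attn = F.softmax(attn, dim=1)                           # normalize over timestamps
        attn = attn / (attn.sum(dim=2, keepdim=True) + 1e-9)    # l1-normalize over slots (double normalization)
        out = self.mem_value(attn)                              # (batch, seq_len, feat_dim)
        return x + out                                          # residual keeps the original signal


if __name__ == "__main__":
    block = ExternalAttentionMemory(feat_dim=64, num_slots=32)
    feats = torch.randn(8, 20, 64)   # e.g. 8 stocks, 20-day lookback, 64-dim features
    print(block(feats).shape)        # torch.Size([8, 20, 64])
```

Because the slot bank is a fixed-size set of parameters learned over the whole training stream, it acts as a global memory that each local timestamp queries, which matches the abstract's description of aggregating local information while concentrating globally important historical patterns.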
Keywords
forecasting, efficient debiasing framework, memory, learning, trend, multi-scale