Learning Time Series Associated Event Sequences With Recurrent Point Process Networks.

IEEE Transactions on Neural Networks and Learning Systems (2019)

Cited by 92 | Views 840
Abstract
Real-world sequential data are often generated by complicated and latent mechanisms, which can be formulated as event sequences occurring in the continuous time domain. In addition, continuous signals may be associated with these event sequences and formulated as time series with fixed time lags. Traditionally, event sequences are modeled by parametric temporal point processes, which use explicitly defined conditional intensity functions to quantify the occurrence rates of events. However, these parametric models often take only one-sided information from the event sequences into account, ignoring the information in the concurrent time series, and their intensity functions are usually designed for specific tasks and depend on prior knowledge. To tackle these problems, we propose a model called recurrent point process networks (RPPNs), which instantiates temporal point process models with temporal recurrent neural networks (RNNs). In particular, the intensity functions of the proposed model are modeled by two RNNs: a temporal RNN capturing the relationships among events and a second RNN updating the intensity functions based on the time series. Furthermore, an attention mechanism is introduced that uncovers the influence strengths among events with good interpretability. Focusing on challenging tasks such as temporal event prediction and underlying relational network mining, we demonstrate the superiority of our model on both synthetic and real-world data.
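
The abstract describes an architecture in which two RNNs, one over the event sequence and one over the associated time series, feed a conditional intensity function, with attention over past events exposing influence strengths. Below is a minimal sketch of that idea in PyTorch; the layer choices (GRUs, a single-head attention layer, a softplus intensity head) and all names and sizes are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a recurrent point process intensity model in the spirit of the
# abstract: an event RNN, a time-series RNN, and attention over past events.
# Every design detail here is an assumption made for illustration.
import torch
import torch.nn as nn


class RecurrentPointProcessSketch(nn.Module):
    def __init__(self, num_event_types, series_dim, hidden=32):
        super().__init__()
        self.event_emb = nn.Embedding(num_event_types, hidden)
        # Event RNN: consumes event embeddings plus inter-event time gaps.
        self.event_rnn = nn.GRU(hidden + 1, hidden, batch_first=True)
        # Time-series RNN: consumes the concurrent continuous signals.
        self.series_rnn = nn.GRU(series_dim, hidden, batch_first=True)
        # Attention over past event states; its weights can be read as
        # influence strengths among events.
        self.attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)
        self.intensity_head = nn.Linear(2 * hidden, num_event_types)

    def forward(self, event_types, event_gaps, series):
        # event_types: (B, Le) long; event_gaps: (B, Le); series: (B, Ls, series_dim)
        ev_in = torch.cat([self.event_emb(event_types),
                           event_gaps.unsqueeze(-1)], dim=-1)
        ev_h, _ = self.event_rnn(ev_in)      # relationships among past events
        ts_h, _ = self.series_rnn(series)    # information from the time series
        ctx, attn_w = self.attn(ev_h, ev_h, ev_h)
        joint = torch.cat([ctx, ts_h[:, -1:].expand_as(ctx)], dim=-1)
        # Softplus keeps the per-type occurrence rates nonnegative.
        intensity = nn.functional.softplus(self.intensity_head(joint))
        return intensity, attn_w


if __name__ == "__main__":
    model = RecurrentPointProcessSketch(num_event_types=3, series_dim=2)
    lam, w = model(torch.randint(0, 3, (4, 5)), torch.rand(4, 5), torch.rand(4, 10, 2))
    print(lam.shape, w.shape)  # torch.Size([4, 5, 3]) torch.Size([4, 5, 5])
```

In this sketch, training would maximize the usual point-process log-likelihood built from the returned intensities, and the attention weights would be inspected for relational network mining; both steps are omitted here for brevity.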
Keywords
Attentional models, recurrent point process networks (RPPNs), relation discovery, temporal point process