Fully embedded time series generative adversarial networks

Neural Computing and Applications (2024)

Abstract
Generative adversarial networks should produce synthetic data that fits the underlying distribution of the data being modeled. For real-valued time series data, this implies the need to capture not only the static distribution of the data but also its full temporal distribution over any potential time horizon. This temporal element produces a more complex problem that can leave current solutions under-constrained, unstable during training, or prone to varying degrees of mode collapse. In the Fully Embedded Time Series GAN (FETSGAN), entire sequences are translated directly to the generator's sampling space using a seq2seq-style adversarial autoencoder, where adversarial training is used to match the training distribution in both the feature space and the lower-dimensional sampling space. This additional constraint provides a loose assurance that the temporal distribution of the synthetic samples will not collapse. In addition, the First Above Threshold operator is introduced to supplement the reconstruction of encoded sequences, which improves training stability and the overall quality of the generated synthetic data. These contributions demonstrate a significant improvement over the current state of the art for adversarial learners, in both qualitative measures of temporal similarity and the quantitative predictive ability of data generated through FETSGAN.
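The abstract's core idea — encoding whole sequences into a low-dimensional sampling space and applying adversarial matching in both the feature space and that latent space — can be illustrated with a minimal structural sketch. Everything below is an assumption for illustration only (layer sizes, the mean-pooled encoder, and all names such as `Seq2SeqAAE` are hypothetical stand-ins, not the paper's actual FETSGAN architecture), showing the forward-pass shapes of the two adversarial constraints rather than a trainable implementation:

```python
# Hypothetical structural sketch of a seq2seq adversarial autoencoder for
# time series: encode a whole sequence to one latent code z, decode z back
# to a sequence, and score realism with two discriminators -- one in the
# latent (sampling) space, one in the feature space.
import numpy as np

rng = np.random.default_rng(0)

class Seq2SeqAAE:
    def __init__(self, n_features=5, latent_dim=8, seq_len=24):
        self.seq_len = seq_len
        # Encoder: maps each timestep, then pools the sequence into z.
        self.enc_w = rng.normal(size=(n_features, latent_dim)) * 0.1
        # Decoder: expands z back into a full-length sequence.
        self.dec_w = rng.normal(size=(latent_dim, n_features)) * 0.1
        # Discriminator over the latent code (matches the sampling prior).
        self.dz_w = rng.normal(size=(latent_dim, 1)) * 0.1
        # Discriminator over sequences (matches the feature-space data).
        self.dx_w = rng.normal(size=(n_features, 1)) * 0.1

    def encode(self, x):
        # x: (seq_len, n_features) -> z: (latent_dim,)
        return np.tanh(x @ self.enc_w).mean(axis=0)

    def decode(self, z):
        # z: (latent_dim,) -> reconstruction: (seq_len, n_features)
        return np.tanh(np.tile(z, (self.seq_len, 1)) @ self.dec_w)

    def d_latent(self, z):
        # Probability-like score that z matches the sampling distribution.
        return 1.0 / (1.0 + np.exp(-(z @ self.dz_w)[0]))

    def d_feature(self, x):
        # Probability-like score that the sequence matches real data.
        return 1.0 / (1.0 + np.exp(-(x.mean(axis=0) @ self.dx_w)[0]))

model = Seq2SeqAAE()
x = rng.normal(size=(24, 5))      # one real sequence
z = model.encode(x)               # latent code in the sampling space
x_hat = model.decode(z)           # reconstructed sequence
print(z.shape, x_hat.shape)
```

In training, the encoder/decoder would minimize reconstruction error while the two discriminators enforce distribution matching in both spaces simultaneously; per the abstract, the latent-space constraint is what loosely guards against temporal mode collapse.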
Keywords
Generative adversarial networks (GANs), Adversarial autoencoder, Synthetic time series data