DRN: Bringing Greedy Layer-Wise Training into Time Dimension

2015 IEEE International Conference on Data Mining (ICDM 2015)

Cited 1 | Views 72
Abstract
Sequential data modeling has received growing interest due to its impact on real-world problems. Sequential data is ubiquitous: financial transactions, advertisement conversions, and disease evolution are all examples. A long-standing challenge in sequential data modeling is how to capture the strong hidden correlations among complex, high-volume features. The sparsity and skewness of the features extracted from sequential data further add to the complexity of the problem. In this paper, we address these challenges from both discriminative and generative perspectives, and propose novel stochastic learning algorithms to model nonlinear variances of static time frames and their transitions. The proposed model, the Deep Recurrent Network (DRN), can be trained in an unsupervised fashion to capture transitions, or in a discriminative fashion to perform sequential labeling. We analyze the conditional independence of each functional module and tackle the diminishing-gradient problem by developing a two-pass training algorithm. Extensive experiments on both simulated and real-world dynamic networks show that the trained DRN outperforms all baselines on the sequential classification task and achieves excellent performance on the regression task.
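To make the two-pass idea concrete, below is a minimal sketch in PyTorch of the general pattern the abstract describes: an unsupervised first pass that pretrains frame-level layers greedily (here with shallow autoencoders as a stand-in for the paper's generative modules), followed by a discriminative second pass that fine-tunes a recurrent layer through time for sequence labeling. All layer sizes, the autoencoder choice, the vanilla RNN, and the toy data are assumptions for illustration; this is not the paper's DRN implementation or its exact training algorithm.

```python
# Hypothetical sketch (not the paper's code): greedy layer-wise pretraining on
# static time frames, then discriminative fine-tuning through time.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy sequential data: (batch, time, features) with one class label per sequence.
B, T, D, H, C = 32, 10, 20, 16, 2
x = torch.randn(B, T, D)
y = torch.randint(0, C, (B,))

# --- Pass 1: unsupervised, greedy layer-wise pretraining on static frames ---
# Each layer is trained as a shallow autoencoder on the representations
# produced by the layers below it; all time frames are treated as i.i.d. samples.
dims = [D, H, H]
encoders = []
frames = x.reshape(B * T, D)
for d_in, d_out in zip(dims[:-1], dims[1:]):
    enc, dec = nn.Linear(d_in, d_out), nn.Linear(d_out, d_in)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-2)
    for _ in range(100):
        opt.zero_grad()
        recon = dec(torch.tanh(enc(frames)))
        nn.functional.mse_loss(recon, frames).backward()
        opt.step()
    encoders.append(enc)
    frames = torch.tanh(enc(frames)).detach()  # input to the next layer

# --- Pass 2: discriminative fine-tuning through time ---
# The pretrained encoders initialize the frame-level feature extractor; a
# recurrent layer on top models transitions and is trained on sequence labels.
feature = nn.Sequential(*[nn.Sequential(e, nn.Tanh()) for e in encoders])
rnn = nn.RNN(H, H, batch_first=True)
head = nn.Linear(H, C)
params = list(feature.parameters()) + list(rnn.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    feats = feature(x.reshape(B * T, D)).reshape(B, T, H)
    _, h_last = rnn(feats)                    # final hidden state: (1, B, H)
    loss = nn.functional.cross_entropy(head(h_last.squeeze(0)), y)
    loss.backward()
    opt.step()

print("final training loss:", loss.item())
```

Because the second pass starts from pretrained frame-level weights rather than random initialization, the recurrent fine-tuning has to propagate less of the error signal through deep stacks, which is the intuition behind using layer-wise pretraining to mitigate diminishing gradients.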
Keywords
sequential data modeling, financial transactions, advertisement conversions, disease evolution, complex feature extraction, generative perspectives, stochastic learning algorithms, nonlinear variances, static time frames, deep recurrent network, unsupervised training, discriminative perspectives, sequential labeling, two-pass training algorithm, simulated dynamic networks, real-world dynamic networks, trained DRN, sequential classification task, regression task