Mind the Remainder: Taylor’s Theorem View on Recurrent Neural Networks

IEEE Transactions on Neural Networks and Learning Systems (2022)

Abstract
Recurrent neural networks (RNNs) have gained tremendous popularity in almost every sequence modeling task. Despite these efforts, discrete unstructured data such as text, audio, and video remain difficult to embed in the feature space. Studies on improving neural networks have accelerated since the introduction of more complex or deeper architectures. The improvements ...
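The abstract is truncated before the method is described, so only the general idea named in the title and keywords can be illustrated. The following is a minimal, generic sketch, not the paper's formulation: a vanilla RNN step h_t = tanh(W h_{t-1} + U x_t + b) is approximated by its first-order Taylor expansion around a reference hidden state, and the leftover term is the remainder the title alludes to. All names here (W, U, b, h_ref, rnn_step) are assumptions introduced for illustration.

```python
# Generic illustration of a Taylor-series view of an RNN transition.
# Not the paper's method; dimensions and parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
d_h, d_x = 4, 3                      # hidden and input dimensions (assumed)
W = rng.normal(scale=0.5, size=(d_h, d_h))
U = rng.normal(scale=0.5, size=(d_h, d_x))
b = rng.normal(scale=0.1, size=d_h)

def rnn_step(h_prev, x):
    """Exact vanilla RNN transition."""
    return np.tanh(W @ h_prev + U @ x + b)

def rnn_step_taylor(h_prev, x, h_ref):
    """First-order Taylor approximation of the transition around h_ref."""
    f_ref = np.tanh(W @ h_ref + U @ x + b)
    # Jacobian of tanh(W h + U x + b) w.r.t. h, evaluated at h_ref
    J = np.diag(1.0 - f_ref ** 2) @ W
    return f_ref + J @ (h_prev - h_ref)

h_prev = rng.normal(size=d_h)
x = rng.normal(size=d_x)
h_ref = np.zeros(d_h)                # expand around the zero state

exact = rnn_step(h_prev, x)
approx = rnn_step_taylor(h_prev, x, h_ref)
# The gap between the exact step and its linearization is the Taylor remainder.
print("remainder norm:", np.linalg.norm(exact - approx))
```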
Keywords
Numerical models,Mathematical model,Taylor series,Recurrent neural networks,Training,Stochastic processes,Computer architecture