Toward Training Recurrent Neural Networks for Lifelong Learning
Neural Computation, pp. 1-35, 2019.
Catastrophic forgetting and capacity saturation are the central challenges of any parametric lifelong learning system. In this work, we study these challenges in the context of sequential supervised learning with an emphasis on recurrent neural…