Online Class Incremental Learning with One-Vs-All Classifiers for Resource Constrained Devices

Baptiste Wagner, Denis Pellerin, Serge Olympieff, Sylvain Huet

ISPA (2023)

Abstract
Online Class Incremental Learning (OCIL) aims to learn new classes from a data stream where samples arrive in batches, one after the other. The main challenge in OCIL is avoiding catastrophic forgetting, the phenomenon of forgetting old classes when learning new ones. Replay-based methods counteract catastrophic forgetting by storing around 10% of the data stream in a memory buffer. Upon learning new classes, the model is updated by replaying old-class images sampled from memory. OCIL holds significant promise for smart devices, such as home robots or smartphones, as incrementally learning new object instances enables personalized interactions with the environment. However, these devices have limited computing and storage capabilities for real-time on-device training. In this paper, we propose a novel replay-based method called ILOVA (Incremental Learning of One-Vs-All classifiers) and show that it achieves the best balance between accuracy, forgetting, computing time, and memory footprint on three benchmark datasets. Additionally, we conduct a comparative analysis of existing replay-based methods for OCIL with respect to embedded constraints. Specifically, in the studied scenarios, models can store only one to ten samples per class. In the most challenging configuration, where only one sample per class is stored, our method outperforms the second-best method by up to 16 points in accuracy with 2.5 times less computation time.
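To make the memory-constrained replay setting concrete, the following is a minimal sketch of a class-balanced replay buffer limited to a fixed number of samples per class (one to ten, as in the scenarios studied above). This is a generic illustration of replay-based OCIL, not the ILOVA method itself; the class name, the per-class reservoir-sampling policy, and all identifiers are assumptions for illustration.

```python
import random
from collections import defaultdict

class ClassBalancedReplayBuffer:
    """Illustrative buffer holding at most `per_class` samples per class.

    Per-class reservoir sampling keeps the stored samples an unbiased
    subset of each class's portion of the stream (hypothetical design
    choice, not taken from the paper).
    """

    def __init__(self, per_class=5, seed=0):
        self.per_class = per_class
        self.store = defaultdict(list)   # class label -> stored samples
        self.seen = defaultdict(int)     # class label -> samples observed
        self.rng = random.Random(seed)

    def add(self, sample, label):
        """Observe one (sample, label) pair from the stream."""
        self.seen[label] += 1
        slot = self.store[label]
        if len(slot) < self.per_class:
            slot.append(sample)
        else:
            # Reservoir step: replace a stored sample with probability
            # per_class / seen, so every observed sample is equally likely
            # to be retained.
            j = self.rng.randrange(self.seen[label])
            if j < self.per_class:
                slot[j] = sample

    def replay_batch(self, k):
        """Draw up to k stored (sample, label) pairs for rehearsal."""
        pool = [(s, c) for c, items in self.store.items() for s in items]
        self.rng.shuffle(pool)
        return pool[:k]
```

During training, each incoming batch would be interleaved with a `replay_batch` of old-class samples, so gradient updates on new classes are regularized by rehearsal on the stored exemplars.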
Keywords
Online learning, Incremental learning, Catastrophic forgetting, Replay