Enabling On-device Continual Learning with Binary Neural Networks
CoRR (2024)
Abstract
On-device learning remains a formidable challenge, especially when dealing
with resource-constrained devices that have limited computational capabilities.
This challenge is primarily rooted in two key issues: first, the memory
available on embedded devices is typically insufficient to accommodate the
memory-intensive back-propagation algorithm, which often relies on
floating-point precision. Second, the development of learning algorithms on
models with extreme quantization levels, such as Binary Neural Networks (BNNs),
is critical due to the drastic reduction in bit representation. In this study,
we propose a solution that combines recent advancements in the field of
Continual Learning (CL) and Binary Neural Networks to enable on-device training
while maintaining competitive performance. Specifically, our approach leverages
binary latent replay (LR) activations and a novel quantization scheme that
significantly reduces the number of bits required for gradient computation. The
experimental validation demonstrates a significant accuracy improvement in
combination with a noticeable reduction in memory requirement, confirming the
suitability of our approach in expanding the practical applications of deep
learning in real-world scenarios.
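The abstract's central memory-saving idea — storing binary latent replay (LR) activations instead of floating-point ones — can be illustrated with a minimal sketch. The class and function names below are hypothetical and not from the paper: intermediate activations are binarized to {-1, +1} with the sign function and bit-packed, so each stored latent value costs one bit rather than 32.

```python
import numpy as np

def binarize(x):
    """Binarize activations to {-1, +1} via the sign function
    (values of exactly 0 map to +1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

class BinaryLatentReplayBuffer:
    """Illustrative replay buffer holding bit-packed binary latents.

    Packing 8 binary values per byte yields a 32x memory reduction
    compared to storing float32 activations."""

    def __init__(self, capacity, feature_dim):
        self.capacity = capacity
        self.feature_dim = feature_dim
        # packed storage: ceil(feature_dim / 8) bytes per sample
        self.storage = np.zeros((capacity, (feature_dim + 7) // 8),
                                dtype=np.uint8)
        self.labels = np.zeros(capacity, dtype=np.int64)
        self.size = 0

    def add(self, latent, label):
        """Binarize a float latent vector and pack it into bits."""
        idx = self.size % self.capacity  # simple FIFO overwrite
        bits = (np.asarray(latent) >= 0).astype(np.uint8)  # +1 -> 1, -1 -> 0
        self.storage[idx] = np.packbits(bits, bitorder="little")
        self.labels[idx] = label
        self.size += 1

    def sample(self, n):
        """Return up to n unpacked {-1, +1} latent vectors for replay."""
        count = min(self.size, self.capacity)
        idx = np.random.choice(count, size=min(n, count), replace=False)
        bits = np.unpackbits(self.storage[idx], axis=1,
                             bitorder="little")[:, :self.feature_dim]
        return bits.astype(np.int8) * 2 - 1, self.labels[idx]
```

This sketch covers only the storage side of latent replay; the paper's contribution also includes a quantization scheme for the gradient computation during the backward pass, which is not reproduced here.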