CHA: A Caching Framework for Home-based Voice Assistant Systems

2020 IEEE/ACM Symposium on Edge Computing (SEC)

Abstract
Voice assistant systems are becoming pervasive in our daily lives. However, current voice assistant systems rely on the cloud for command understanding and fulfillment, resulting in unstable performance and unnecessarily frequent network transmission. In this paper, we introduce CHA, an edge-based caching framework for voice assistant systems, especially for smart homes where resource-restricted edge devices can be deployed. Located between the voice assistant device and the cloud, CHA introduces a layered architecture with a modular design in each layer. With an understanding module and adaptive learning, CHA interprets the user's intent with high accuracy. By maintaining a cache, CHA reduces interaction with the cloud and provides fast, stable responses in a smart home. Targeting resource-constrained edge devices, CHA applies joint classification and model pruning to a pre-trained language model to achieve both accuracy and system efficiency. We compare CHA to the status quo solution for voice assistant systems and show that CHA benefits them. We evaluate CHA on three edge devices that differ in hardware configuration and demonstrate its ability to meet latency and accuracy demands with efficient resource utilization. Our evaluation shows that, compared to the current solution for voice assistant systems, CHA provides at least a 70% speedup in responses to frequently asked voice commands with less than 13% CPU consumption and less than 9% memory consumption when running on a Raspberry Pi.
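As a rough illustration of the cache-then-cloud workflow the abstract describes, the sketch below shows a minimal edge-side handler that answers frequently asked commands from a local cache and falls back to the cloud otherwise. All names here (CommandCache, resolve_intent_locally, query_cloud) are illustrative assumptions, not CHA's actual components or API.

```python
# Minimal sketch of the edge-side caching idea: answer repeated commands
# locally, go to the cloud only on a miss. Names are hypothetical.
import time
from collections import OrderedDict


class CommandCache:
    """Small LRU cache mapping a recognized intent to its fulfillment result."""

    def __init__(self, capacity: int = 128):
        self.capacity = capacity
        self._store: "OrderedDict[str, str]" = OrderedDict()

    def get(self, intent: str):
        if intent in self._store:
            self._store.move_to_end(intent)   # mark as recently used
            return self._store[intent]
        return None

    def put(self, intent: str, response: str) -> None:
        self._store[intent] = response
        self._store.move_to_end(intent)
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)   # evict least recently used entry


def resolve_intent_locally(command: str) -> str:
    """Placeholder for the on-device understanding module (e.g. a pruned
    pre-trained language model classifying the command into an intent)."""
    return command.strip().lower()


def query_cloud(intent: str) -> str:
    """Placeholder for the round trip to the cloud voice-assistant backend."""
    time.sleep(0.5)                           # simulate network latency
    return f"cloud response for '{intent}'"


def handle_command(command: str, cache: CommandCache) -> str:
    """Serve from the local cache when possible; otherwise ask the cloud
    and remember the answer for next time."""
    intent = resolve_intent_locally(command)
    cached = cache.get(intent)
    if cached is not None:
        return cached                         # fast, stable local response
    response = query_cloud(intent)
    cache.put(intent, response)
    return response


if __name__ == "__main__":
    cache = CommandCache()
    print(handle_command("Turn on the living room light", cache))  # cloud path
    print(handle_command("Turn on the living room light", cache))  # cache hit
```

This is only a conceptual sketch under the stated assumptions; the paper's framework additionally involves a layered, modular architecture and adaptive learning, which are not modeled here.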
Keywords
Edge Computing, Voice Assistant Systems, Caching