Privacy-Preserving DNN Training with Prefetched Meta-Keys on Heterogeneous Neural Network Accelerators.

DAC (2023)

Abstract
Embedded software often offloads collected data to a server to accelerate DNN computation, which may compromise privacy. We propose a DNN computation framework that combines a trusted execution environment (TEE) with a neural network accelerator (NNA) to address this privacy leakage problem. We design an NNA-friendly encryption method that enables the NNA to compute correctly on encrypted inputs to linear layers. To mitigate the overhead of TEE-NNA interaction, we design a pipeline-based prefetch mechanism that reduces the cost of TEE interactions. Experiments show that our approach is compatible with a wide range of NPUs and TPUs, and it improves performance by 8-19 times over the TEE scheme.
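
The ability of an untrusted accelerator to compute correctly on an encrypted linear input usually rests on linearity: for a linear layer, W(x + r) = Wx + Wr, so a TEE can blind the input x with a random mask r, let the accelerator run the plain linear operator, and then remove the precomputed unblinding term Wr. The abstract does not spell out the paper's actual encryption or meta-key prefetching scheme, so the following is only a minimal sketch of this additive-masking idea; the function names (tee_encrypt, nna_linear, tee_decrypt) and the per-inference mask r are illustrative assumptions, not the authors' API.

```python
# Minimal sketch (assumed scheme, not the paper's exact method): additive masking
# lets an untrusted accelerator evaluate a linear layer on a blinded input,
# because W @ (x + r) = W @ x + W @ r, and the TEE can subtract W @ r afterwards.
import numpy as np

rng = np.random.default_rng(0)

def tee_encrypt(x, r):
    """Inside the TEE: blind the input with a one-time additive mask r."""
    return x + r

def nna_linear(W, x_enc):
    """On the untrusted NNA: an ordinary matrix multiply on the blinded input."""
    return W @ x_enc

def tee_decrypt(y_enc, Wr):
    """Inside the TEE: remove the precomputed unblinding term W @ r."""
    return y_enc - Wr

# Toy dimensions for illustration.
W = rng.standard_normal((4, 8))   # linear-layer weights (visible to the NNA)
x = rng.standard_normal(8)        # private input (never leaves the TEE in the clear)
r = rng.standard_normal(8)        # per-inference mask (hypothetical "meta-key" material)
Wr = W @ r                        # unblinding term that can be precomputed ahead of time

y = tee_decrypt(nna_linear(W, tee_encrypt(x, r)), Wr)
assert np.allclose(y, W @ x)      # the NNA never sees x, yet the result is exact
```

In such a design, quantities like Wr are natural candidates for the paper's pipeline-based prefetching: if the TEE precomputes them for upcoming layers while the NNA works on the current one, the decryption step need not stall the accelerator.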
Keywords
deep learning, privacy preserving, neural network accelerator, cloud computing, trusted execution environment