Differentially Private Federated Learning With Importance Client Sampling

IEEE Transactions on Consumer Electronics (2023)

Abstract
As consumer electronics such as smartphones and wearables generate large volumes of distributed data daily, consumers need to process this private, isolated data safely and efficiently. Federated learning (FL) promises to meet this need thanks to its strong data security and applicability to large-scale scenarios. However, diverse clients inevitably produce non-independent and identically distributed (non-iid) data, which severely complicates performance analysis. Moreover, under non-iid data the participating clients are typically heterogeneous, which raises the client sampling problem. More importantly, although FL enhances privacy through data localization, highly sensitive data such as physiological signals from wearables demand stronger protection against third-party attacks. To jointly address data heterogeneity, client sampling, and privacy, we propose DPFLICS, a differential privacy (DP) enabled and importance-aware FL algorithm. Specifically, we use truncated concentrated DP to tightly track the end-to-end privacy loss. For better sampling, the server selects a subset of clients with probabilities derived from our importance client sampling scheme. To further improve performance, we also apply the adaptive YOGI optimizer on the server side, an adaptive gradient method that improves on the widely used ADAM optimizer. Finally, extensive experiments demonstrate the effectiveness of our method.
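The abstract only sketches the two server-side mechanisms, so the details below are illustrative assumptions rather than the paper's actual method: the importance score here (update norm) and all hyperparameter values are hypothetical choices, while the YOGI second-moment update follows the optimizer's standard published form. A minimal sketch of probability-weighted client selection and one server-side YOGI step might look like:

```python
import numpy as np

def importance_sample(scores, k, rng):
    """Sample k distinct client indices with probability proportional to a
    per-client importance score. NOTE: the exact score used by DPFLICS is
    not given in the abstract; update-norm weighting is a common
    illustrative choice, not the paper's definition."""
    p = np.asarray(scores, dtype=float)
    p = p / p.sum()
    return rng.choice(len(p), size=k, replace=False, p=p)

def yogi_server_step(x, m, v, grad, lr=0.01, b1=0.9, b2=0.99, eps=1e-3):
    """One server-side YOGI update on the averaged pseudo-gradient `grad`
    (the negated mean client model delta). YOGI differs from ADAM only in
    the second-moment update, which uses a sign term instead of an
    exponential moving average, making v change more slowly."""
    m = b1 * m + (1 - b1) * grad                          # first moment (as in ADAM)
    v = v - (1 - b2) * np.sign(v - grad**2) * grad**2     # YOGI second moment
    x = x - lr * m / (np.sqrt(v) + eps)                   # server model update
    return x, m, v
```

In a DP-FL round the sampled clients' clipped, noised updates would be averaged into `grad` before the YOGI step; the privacy accounting itself (truncated concentrated DP) is orthogonal to this optimizer logic.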
Keywords
Consumer electronics,federated learning,differential privacy,client sampling,adaptive optimization