Improving the Learning Performance of Client’s Local Distribution in Cyclic Federated Learning

Image Analysis and Stereology (2024)

Abstract
Cyclic federated learning based on distribution information sharing and knowledge distillation (CFL_DS_KD) aims to address the challenge of non-IID data distributions while reducing communication requirements. However, when client data are extremely heterogeneous and scarce, clients struggle to fully learn their local data distributions with GANs, which degrades the overall model performance. To overcome this limitation, we propose a transfer learning approach in which clients first pretrain their generators on a source domain and then fine-tune them on their local datasets. Our results on Alzheimer's disease classification demonstrate that this method effectively improves clients' distribution learning and enhances the overall model performance.
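
The abstract describes pretraining each client's GAN generator on a source domain and then fine-tuning it on the scarce local dataset. The following is a minimal sketch of that two-stage idea, not the authors' implementation: it assumes PyTorch, and the network sizes, dataset tensors, learning rates, and epoch counts are hypothetical placeholders.

```python
# Minimal sketch (not the paper's code): transfer learning for a client's GAN
# generator. All sizes, datasets, and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

LATENT_DIM, IMG_DIM = 64, 28 * 28  # hypothetical latent and image sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, IMG_DIM), nn.Tanh(),
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
        )
    def forward(self, x):
        return self.net(x)

def train_gan(gen, disc, loader, epochs, lr):
    """Standard GAN training loop, reused for both source-domain
    pretraining and local fine-tuning."""
    opt_g = torch.optim.Adam(gen.parameters(), lr=lr)
    opt_d = torch.optim.Adam(disc.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for (real,) in loader:
            n = real.size(0)
            fake = gen(torch.randn(n, LATENT_DIM))
            # Discriminator step: push real toward 1, generated toward 0.
            d_loss = bce(disc(real), torch.ones(n, 1)) + \
                     bce(disc(fake.detach()), torch.zeros(n, 1))
            opt_d.zero_grad(); d_loss.backward(); opt_d.step()
            # Generator step: try to make the discriminator output 1 on fakes.
            g_loss = bce(disc(fake), torch.ones(n, 1))
            opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Placeholder tensors stand in for the source-domain and local datasets.
source_data = TensorDataset(torch.randn(512, IMG_DIM))
local_data = TensorDataset(torch.randn(64, IMG_DIM))   # scarce client data

gen, disc = Generator(), Discriminator()
# 1) Pretrain on the larger source domain.
train_gan(gen, disc, DataLoader(source_data, batch_size=32, shuffle=True),
          epochs=5, lr=2e-4)
# 2) Fine-tune on the client's small local dataset with a lower learning rate.
train_gan(gen, disc, DataLoader(local_data, batch_size=16, shuffle=True),
          epochs=5, lr=5e-5)
```

In a cyclic federated setting, only the second stage would run on each client's private data; how the pretrained weights are distributed and how the fine-tuned generators feed into distribution sharing and knowledge distillation follows the paper, which this sketch does not reproduce.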
Keywords
federated learning,medical image processing,transfer learning