FedLinked: A client-wise distilled representation based semi-supervised collaborative multitask learning scheme

IEEE International Joint Conference on Neural Networks (IJCNN), 2022

Abstract
Multitask learning in distributed environments is an important problem in the field of federated learning. In many practical settings, human or machine experts solve different but highly correlated tasks, so their performance can be improved through collaboration, yet their private datasets and model parameters cannot be shared with other participants because of business and privacy constraints. Semi-supervised learning is also an active research area, motivated by the use of the usually freely available, large sets of unlabeled data for concept and representation learning. These unlabeled samples can be used for knowledge sharing among the participants during their collaboration. We propose a method, called FedLinked, which combines algorithms from these areas (federated, multitask, and semi-supervised learning) into a privacy-preserving solution targeting semi-supervised multitask cross-silo problems. FedLinked is a collaborative learning method implemented through the regularization of representation learning based on the cross-utility of the participating clients. We evaluate FedLinked on image classification problems and compare its performance to FedAvg-based solutions and non-cooperating clients in multitask scenarios. In addition, a detailed analysis of the proposed method is provided.
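The abstract describes the mechanism only at a high level: clients keep their private labeled data and models, and collaboration is realized as a regularizer that aligns the clients' learned representations on a shared pool of unlabeled samples. The sketch below is a minimal PyTorch-style illustration of that idea only; the names (ClientModel, local_update, server_round, peer_reprs), the MSE alignment term, and the simple averaging of peer representations are assumptions made for illustration, not the paper's actual algorithm.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ClientModel(nn.Module):
    """Small model split into a shared-style representation encoder and a task-specific head."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(16 * 4 * 4, 64), nn.ReLU(),
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        z = self.encoder(x)          # representation
        return self.head(z), z       # task logits, representation

def local_update(model, labeled_loader, unlabeled_x, peer_reprs, lam=0.1, lr=1e-3):
    """One local round: supervised loss on the client's private labels plus a
    representation-alignment penalty toward the peers' representations of the
    shared unlabeled samples (the alignment term is an illustrative assumption)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for x, y in labeled_loader:
        logits, _ = model(x)
        sup_loss = F.cross_entropy(logits, y)
        _, z_unlab = model(unlabeled_x)
        align_loss = F.mse_loss(z_unlab, peer_reprs)   # collaboration as regularization
        loss = sup_loss + lam * align_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        _, z = model(unlabeled_x)
    return z  # representations of the unlabeled pool, shared instead of data or weights

def server_round(clients, loaders, unlabeled_x, peer_reprs):
    """One collaboration round (illustrative): each client updates locally, then the
    server averages the returned representations of the common unlabeled pool and
    broadcasts the average as the next round's alignment target."""
    new_reprs = [local_update(m, dl, unlabeled_x, peer_reprs)
                 for m, dl in zip(clients, loaders)]
    return torch.stack(new_reprs).mean(dim=0)
```

Averaging the clients' representations is only one possible aggregation choice; the paper's client-wise distillation may weight or transform them differently, and only the representations of unlabeled samples, never raw data or model parameters, leave a client in this reading.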
Keywords
Federated learning, Multitask learning, Semi-supervised learning