Poster

Federated Learning from Only Unlabeled Data with Class-conditional-sharing Clients

Nan Lu · Zhao Wang · Xiaoxiao Li · Gang Niu · Qi Dou · Masashi Sugiyama

Keywords: unlabeled data

Poster at Spot E1 in Virtual World
Thu 28 Apr 2:30 a.m. – 4:30 a.m. PDT

Abstract:

Supervised federated learning (FL) enables multiple clients to share a trained model without sharing their labeled data. However, potential clients may be reluctant even to label their own data, which limits the applicability of FL in practice. In this paper, we show that unsupervised FL is possible, and that its model can still be a classifier for predicting class labels, provided that the class-prior probabilities are shifted while the class-conditional distributions are shared among the unlabeled data owned by the clients. We propose federation of unsupervised learning (FedUL), in which the unlabeled data are transformed into surrogate labeled data for each client, a modified model is trained by supervised FL, and the wanted model is recovered from the modified model. FedUL is a very general solution to unsupervised FL: it is compatible with many supervised FL methods, and the recovery of the wanted model can be theoretically guaranteed as if the data had been labeled. Experiments on benchmark and real-world datasets demonstrate the effectiveness of FedUL. Code is available at https://github.com/lunanbit/FedUL.
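The abstract does not spell out how the wanted model is recovered from the modified model; the details are in the paper and repository. As a purely hypothetical illustration of the general idea, the sketch below assumes a single client holding unlabeled sets whose per-set class priors are known, equal set sizes, and an invertible prior matrix; the names (`Pi`, `c_hat`, `eta`) and the linear-inversion recovery are our simplifications, not necessarily the paper's exact construction.

```python
import numpy as np

# Pi[m, k]: assumed-known class prior of class k in unlabeled set m.
# Class-conditional densities p(x | y=k) are shared across sets.
Pi = np.array([[0.8, 0.2],
               [0.3, 0.7]])

# Ground-truth class-conditional densities at one point x (hidden from
# the learner; used here only to simulate a well-trained modified model).
c = np.array([0.5, 2.0])          # p(x | y=k) for k = 0, 1

# Surrogate labeling: a sample's "label" is the index m of the set it
# came from. With equal set sizes, the modified model's target is the
# surrogate posterior q_m ∝ sum_k Pi[m, k] * c_k.
q = Pi @ c
q = q / q.sum()

# Recovery (simplified): invert Pi to get the class-conditionals up to
# a common scale, fold in a test-time class prior, and normalize.
pi_test = np.array([0.5, 0.5])
c_hat = np.linalg.inv(Pi) @ q     # proportional to c
eta = pi_test * c_hat
eta = eta / eta.sum()             # recovered class posterior p(y | x)

# Sanity check against the posterior computed from the ground truth.
eta_true = pi_test * c / (pi_test * c).sum()
print(np.allclose(eta, eta_true))  # True
```

The point of the toy check is only that, when the per-set priors differ (so `Pi` is invertible), predicting which *set* a sample came from carries enough information to pin down the class posterior without any true labels.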
