Federated Local Compact Representation Communication: Framework and Application
Abstract
The core of federated learning (FL) is to transfer knowledge of data diversity and distribution across client domains. Although adopted by most FL methods, model-sharing-based communication has limitations such as unstable optimization and a lack of interpretability. In this paper, we propose an innovative approach that exploits a highly abstract local compact representation (LCR) as the communication carrier, paving a new feasible path for FL. LCRs are not only more intuitive for multiclient joint training but also carry little privacy-sensitive information, particularly in computer vision tasks. The proposed LCR-communication (LCRC)-based FL framework aims to improve performance, mitigate negative transfer, and enhance optimization stability. First, building on domain adaptation theory, we provide theoretical proofs that representation communication from other domains can tighten the generalization error bound of the local domain. Second, inspired by metric learning, we propose a federated version of the triplet loss (FedTriplet) and a distribution-similarity reweighting aggregation scheme to fully exploit LCRs from other clients and realize the LCRC-based FL framework with better interpretability. Cross-client FedTriplet transfer redresses category boundaries in the local latent space and resists overfitting. A modified Wasserstein distance is employed in reweighting aggregation to overcome the negative transfer caused by non-i.i.d. data. Finally, extensive experiments on MNIST/EMNIST and a practical iris recognition application demonstrate that the proposed LCRC framework achieves higher accuracy than mainstream model-sharing-based FL methods. Visualization results also show significant improvements in the separability of the representation distributions.
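To make the two mechanisms named above concrete, the following is a minimal sketch in Python, assuming clients exchange batches of low-dimensional representation vectors (LCRs). The function names fed_triplet_loss and similarity_weights are illustrative, not the paper's own API, and the paper's modified Wasserstein distance is stood in for by SciPy's one-dimensional wasserstein_distance applied to projections of the representations; the full formulation is given in the body of the paper.

```python
# Hedged sketch of the abstract's two mechanisms: a federated triplet loss
# over shared LCRs, and distribution-similarity reweighting for aggregation.
# All names here are hypothetical stand-ins for the paper's formulation.
import numpy as np
import torch
import torch.nn.functional as F
from scipy.stats import wasserstein_distance


def fed_triplet_loss(anchor, positive, negative, margin=1.0):
    """Triplet objective where positives/negatives may be LCRs received
    from other clients rather than local samples."""
    d_pos = F.pairwise_distance(anchor, positive)  # same-class distance
    d_neg = F.pairwise_distance(anchor, negative)  # cross-class distance
    # Pull same-class representations together, push other classes apart.
    return F.relu(d_pos - d_neg + margin).mean()


def similarity_weights(local_proj, client_projs, eps=1e-8):
    """Aggregation weights from distribution similarity: a client whose
    representation distribution is closer to the local one (smaller
    Wasserstein distance) receives a larger weight, damping negative
    transfer from strongly non-i.i.d. clients."""
    dists = np.array([wasserstein_distance(local_proj, p)
                      for p in client_projs])
    sims = 1.0 / (dists + eps)   # invert: closer distribution -> heavier
    return sims / sims.sum()     # normalize to a convex combination
```

In this sketch, the normalized weights would multiply each client's contribution during aggregation, so that distributionally similar clients dominate the update while dissimilar ones are down-weighted rather than excluded.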