Yi-Ming Lin, Yuan Gao, Mao-Guo Gong, Si-Jia Zhang, Yuan-Qiao Zhang, Zhi-Yuan Li. Federated Learning on Multimodal Data: A Comprehensive Survey[J]. Machine Intelligence Research, 2023, 20(4): 539-553. DOI: 10.1007/s11633-022-1398-0

Federated Learning on Multimodal Data: A Comprehensive Survey

  • With growing awareness of data privacy, federated learning (FL) has attracted increasing attention in recent years as a major paradigm for privacy-preserving model training: it allows models to be built collaboratively without exchanging raw data. However, most current FL clients are unimodal. With the rise of edge computing, diverse sensors and wearable devices generate large amounts of data from different modalities, which has inspired research efforts in multimodal federated learning (MMFL). In this survey, we explore the area of MMFL to address the fundamental challenges of FL on multimodal data. First, we analyse the key motivations for MMFL. Second, we technically classify the currently proposed MMFL methods according to their modality distributions and modality annotations. Then, we discuss the datasets and application scenarios of MMFL. Finally, we highlight the limitations and open challenges of MMFL and provide insights and methods for future research.
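The collaborative-but-private training loop that the abstract describes is commonly instantiated as federated averaging (FedAvg): each client updates the model on its own local data, and a server averages the resulting parameters, weighted by local dataset size, so raw data never leaves the clients. Below is a minimal toy sketch of this idea (not the survey's method; the least-squares task, learning rate, and two-client setup are illustrative assumptions):

```python
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=5):
    """One client's local training: plain gradient steps on a least-squares loss."""
    w = weights.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - labels) / len(labels)
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Server-side aggregation: parameter average weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
global_w = np.zeros(2)

# Two clients with disjoint local datasets (standing in for different devices);
# only model parameters, never the data below, would cross the network.
clients = []
for n in (40, 60):
    X = rng.normal(size=(n, 2))
    y = X @ true_w
    clients.append((X, y))

for _ in range(20):  # communication rounds
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = fed_avg(updates, [len(y) for _, y in clients])

print(global_w)  # approaches true_w without any client sharing its data
```

In the multimodal setting the survey covers, clients may additionally hold different modalities (e.g., audio on one device, images on another), which complicates this simple parameter averaging and motivates the taxonomy by modality distribution and annotation.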
