Heterogeneous Federated Learning with Scalable Server Mixture-of-Experts
Published in Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), 2025
Proposed a novel Federated Mixture-of-Experts (Fed-MoE) framework to address the challenges of deploying large models in power-constrained environments. Designed an asymmetric federated learning (FL) mechanism in which compact client models are aggregated into a large server-side MoE model, enabling efficient learning from heterogeneous data.
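A minimal sketch of the asymmetric idea, assuming a PyTorch setup: compact models uploaded by clients are reused as experts inside a larger server-side mixture-of-experts, with a gating network routing each input across them. This is an illustrative assumption, not the paper's implementation; the class names (`CompactClientModel`, `ServerMoE`), dimensions, and soft-routing gate are hypothetical.

```python
# Illustrative sketch only: compact client models folded into a server-side MoE.
import copy
import torch
import torch.nn as nn

class CompactClientModel(nn.Module):
    """Small model assumed to fit on a power-constrained client (hypothetical)."""
    def __init__(self, dim_in=32, dim_hidden=64, dim_out=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, dim_hidden), nn.ReLU(),
            nn.Linear(dim_hidden, dim_out),
        )

    def forward(self, x):
        return self.net(x)

class ServerMoE(nn.Module):
    """Server model whose experts are seeded from the uploaded client models."""
    def __init__(self, client_models, dim_in=32):
        super().__init__()
        self.experts = nn.ModuleList([copy.deepcopy(m) for m in client_models])
        self.gate = nn.Linear(dim_in, len(self.experts))  # soft routing over experts

    def forward(self, x):
        weights = torch.softmax(self.gate(x), dim=-1)           # (B, E)
        outputs = torch.stack([e(x) for e in self.experts], 1)  # (B, E, C)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)     # (B, C)

if __name__ == "__main__":
    clients = [CompactClientModel() for _ in range(4)]  # stand-ins for trained client models
    server = ServerMoE(clients)
    logits = server(torch.randn(8, 32))
    print(logits.shape)  # torch.Size([8, 10])
```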
Recommended citation: Jingang Jiang*, Yanzhao Chen*, Xiangyang Liu, Haiqi Jiang, and Chenyou Fan. Heterogeneous federated learning with scalable server mixture-of-experts. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI), 2025. Co-first authors: Jingang Jiang and Yanzhao Chen.
Download Paper
