TY - GEN
T1 - Contribution-Driven Personalization for Model Heterogeneous Federated Learning
AU - Chen, Jifeng
AU - Zhang, Haibo
AU - Barnard, Amanda
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - To address the challenges of hardware heterogeneity in Federated Learning (FL), several model-heterogeneous FL schemes have been proposed based on traditional model-homogeneous approaches. Among the state-of-the-art (SOTA) model-heterogeneous FL approaches, Partial Training (PT) is considered one of the most promising, where submodels are extracted from the global model for local training. However, existing studies focus on either the submodel extraction scheme or the creation of personalized submodels for each client, which either lacks global model updating or introduces high computational complexity. This can result in poor adaptability, especially in edge computing environments with Non-IID data distributions. In this paper, we present CDPFL, Contribution-Driven Personalization for Model Heterogeneous Federated Learning, in which the contribution made by each local client to the global model is evaluated using the Shapley Value. Using this contribution information, a Gated Recurrent Unit (GRU) then determines the weight of each client in the next round of model aggregation. In this way, CDPFL controls the update of the global model based on the contribution information. To evaluate CDPFL, we compare it against SOTA PT-based methods. Experimental results show that our approach achieves an improvement of up to 10.17% in global model accuracy under high data heterogeneity and consistently outperforms all baselines in both high and low heterogeneity scenarios.
AB - To address the challenges of hardware heterogeneity in Federated Learning (FL), several model-heterogeneous FL schemes have been proposed based on traditional model-homogeneous approaches. Among the state-of-the-art (SOTA) model-heterogeneous FL approaches, Partial Training (PT) is considered one of the most promising, where submodels are extracted from the global model for local training. However, existing studies focus on either the submodel extraction scheme or the creation of personalized submodels for each client, which either lacks global model updating or introduces high computational complexity. This can result in poor adaptability, especially in edge computing environments with Non-IID data distributions. In this paper, we present CDPFL, Contribution-Driven Personalization for Model Heterogeneous Federated Learning, in which the contribution made by each local client to the global model is evaluated using the Shapley Value. Using this contribution information, a Gated Recurrent Unit (GRU) then determines the weight of each client in the next round of model aggregation. In this way, CDPFL controls the update of the global model based on the contribution information. To evaluate CDPFL, we compare it against SOTA PT-based methods. Experimental results show that our approach achieves an improvement of up to 10.17% in global model accuracy under high data heterogeneity and consistently outperforms all baselines in both high and low heterogeneity scenarios.
KW - Contribution Evaluation
KW - Federated Learning
KW - Shapley Value
UR - https://www.scopus.com/pages/publications/105023977088
U2 - 10.1109/IJCNN64981.2025.11227874
DO - 10.1109/IJCNN64981.2025.11227874
M3 - Conference Paper
AN - SCOPUS:105023977088
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - International Joint Conference on Neural Networks, IJCNN 2025 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 International Joint Conference on Neural Networks, IJCNN 2025
Y2 - 30 June 2025 through 5 July 2025
ER -