
Federated Dynamic Client Selection for Fairness Guarantee in Heterogeneous Edge Computing

  • Regular Paper
  • Published in: Journal of Computer Science and Technology

Abstract

Federated learning has emerged as a distributed learning paradigm in which each client trains locally and a parameter server aggregates the updates. System heterogeneity produces stragglers that cannot respond to the server in time and incur large communication costs. Although client grouping in federated learning can mitigate the straggler problem, the stochastic selection strategy used within groups neglects the impact of data distribution in each group. Moreover, current client grouping approaches give clients unequal opportunities to participate, leading to biased performance across clients. To guarantee fair client participation and mitigate biased local performance, we propose a federated dynamic client selection method based on data representativity (FedSDR). FedSDR clusters clients into groups according to their local computational efficiency. To estimate the significance of each client's dataset, we design a novel data representativity evaluation scheme based on the local data distribution. The two most representative clients in each group are then selected to optimize the global model. Finally, the DYNAMIC-SELECT algorithm updates the local computational efficiency and data representativity states and regroups clients after each periodic average aggregation. Evaluations on real datasets show that FedSDR improves client participation by 27.4%, 37.9%, and 23.3% compared with FedAvg, TiFL, and FedSS, respectively, thereby accounting for fairness in federated learning. In addition, FedSDR outperforms FedAvg, FedGS, and FedMS by 21.32%, 20.4%, and 6.90%, respectively, in terms of local test accuracy variance, balancing the performance bias of the global model across clients.
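
To make the selection flow concrete, the sketch below illustrates one round of a FedSDR-style grouping and selection in Python. It is a minimal, hypothetical rendering based only on the abstract: the grouping rule (sorting clients by measured computational efficiency), the representativity score (similarity of a client's label distribution to the global one, measured here by total-variation distance), and all function names are illustrative assumptions rather than the paper's exact definitions.

```python
# Hypothetical sketch of one DYNAMIC-SELECT-style round (not the paper's
# exact algorithm): group clients by computational efficiency, score each
# client's data representativity, and pick the two best per group.
import numpy as np

def group_by_efficiency(efficiencies, num_groups):
    """Split clients into groups after sorting by estimated local
    computational efficiency (e.g., samples processed per second)."""
    order = np.argsort(efficiencies)
    return np.array_split(order, num_groups)

def representativity(local_label_dist, global_label_dist):
    """Score how well a client's label distribution matches the global
    one; higher is better. Negative total-variation distance is used
    here as a stand-in for the paper's representativity measure."""
    return -0.5 * np.abs(local_label_dist - global_label_dist).sum()

def dynamic_select(efficiencies, label_dists, num_groups, per_group=2):
    """One selection pass: group clients, then keep the `per_group`
    most representative clients from each group for this round."""
    global_dist = np.mean(label_dists, axis=0)
    selected = []
    for group in group_by_efficiency(efficiencies, num_groups):
        scores = [representativity(label_dists[c], global_dist) for c in group]
        selected.extend(group[np.argsort(scores)[-per_group:]].tolist())
    return selected

# Example: 10 clients, 5 label classes, 3 efficiency groups.
rng = np.random.default_rng(0)
eff = rng.uniform(0.1, 1.0, size=10)        # measured per-client speeds
dists = rng.dirichlet(np.ones(5), size=10)  # local label distributions
print(dynamic_select(eff, dists, num_groups=3))
```

In a full training loop, the server would run an aggregation round with the selected clients and then re-measure efficiency and representativity before regrouping, which is the role the abstract assigns to DYNAMIC-SELECT.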

References

  1. McMahan B, Moore E, Ramage D, Hampson S, Arcas B A Y. Communication-efficient learning of deep networks from decentralized data. In Proc. the 20th International Conference on Artificial Intelligence and Statistics, Apr. 2017, pp.1273–1282.

  2. Xu J, Wang H Q. Client selection and bandwidth allocation in wireless federated learning networks: A long-term perspective. IEEE Trans. Wireless Communications, 2021, 20(2): 1188–1200. DOI: https://doi.org/10.1109/TWC.2020.3031503.

  3. Wei K, Li J, Ding M, Ma C, Yang H H, Farokhi F, Jin S, Quek T Q S, Poor H V. Federated learning with differential privacy: Algorithms and performance analysis. IEEE Trans. Information Forensics and Security, 2020, 15: 3454–3469. DOI: https://doi.org/10.1109/TIFS.2020.2988575.

  4. Chu X W, Jiang H B, Li B, Wang D, Wang W. Editorial: Advances in mobile, edge and cloud computing. Mobile Networks and Applications, 2022, 27(1): 219–221. DOI: https://doi.org/10.1007/s11036-020-01654-9.

  5. Zhang F D, Kuang K, Liu Y X, Chen L, Wu C, Wu F, Lu J X, Shao Y F, Xiao J. Unified group fairness on federated learning. arXiv: 2111.04986, 2021. https://arxiv.org/abs/2111.04986, Jan. 2024.

  6. Hu C, Lu R, Wang D. FEVA: A federated video analytics architecture for networked smart cameras. IEEE Network, 2021, 35(6): 163–170. DOI: https://doi.org/10.1109/MNET.001.2100261.

  7. Fraboni Y, Vidal R, Kameni L, Lorenzi M. Clustered sampling: Low-variance and improved representativity for clients selection in federated learning. In Proc. the 38th International Conference on Machine Learning, Jul. 2021, pp.3407–3416.

  8. Li T, Sahu A K, Zaheer M, Sanjabi M, Talwalkar A, Smith V. Federated optimization in heterogeneous networks. In Proc. Machine Learning and Systems, Mar. 2020, pp.429–450.

  9. Bonawitz K A, Eichner H, Grieskamp W, Huba D, Ingerman A, Ivanov V, Kiddon C, Konecný J, Mazzocchi S, McMahan B, Van Overveldt T, Petrou D, Ramage D, Roselander J. Towards federated learning at scale: System design. In Proc. Machine Learning and Systems, Mar. 31–Apr. 2, 2019, pp.374–388.

  10. Gao H C, Thai M T, Wu J. When decentralized optimization meets federated learning. IEEE Network, 2023, pp.1–7. DOI: https://doi.org/10.1109/MNET.132.2200530.

  11. Hong J Y, Zhu Z D, Yu S Y, Wang Z Y, Dodge H H, Zhou J Y. Federated adversarial debiasing for fair and transferable representations. In Proc. the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, Aug. 2021, pp.617–627. DOI: 10.1145/3447548.3467281.

  12. Livni R, Shalev-Shwartz S, Shamir O. On the computational efficiency of training neural networks. In Proc. the 27th International Conference on Neural Information Processing Systems, Dec. 2014, pp.855–863.

  13. Hao X H, Ren W, Xiong R T, Zheng X H, Zhu T Q, Xiong N N. Fair and autonomous sharing of federate learning models in mobile Internet of Things. arXiv: 2007.10650, 2020. https://arxiv.org/abs/2007.10650, Jan. 2024.

  14. Horváth S, Laskaridis S, Almeida M, Leondiadis I, Venieris S I, Lane N D. FjORD: Fair and accurate federated learning under heterogeneous targets with ordered dropout. In Proc. the 34th Neural Information Processing Systems, Dec. 2021, pp.12876–12889.

  15. Xin F, Zhang J H, Luo J Z, Dong F. Federated learning client selection mechanism under system and data heterogeneity. In Proc. the 25th IEEE International Conference on Computer Supported Cooperative Work in Design (CSCWD), May 2022, pp.1239–1244. DOI: 10.1109/CSCWD54268.2022.9776061.

  16. Zhao F P, Huang Y, Sai A M V V, Wu Y B. A cluster-based solution to achieve fairness in federated learning. In Proc. the 2020 IEEE International Conference on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom), Dec. 2020, pp.875–882. DOI: https://doi.org/10.1109/ISPABDCloud-SocialCom-SustainCom51426.2020.00135.

  17. Zhao Z Y, Joshi G. A dynamic reweighting strategy for fair federated learning. In Proc. the 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 2022, pp.8772–8776. DOI: 10.1109/ICASSP43922.2022.9746300.

  18. Amiri S, Belloum A, Nalisnick E, Klous S, Gommans L. On the impact of non-IID data on the performance and fairness of differentially private federated learning. In Proc. the 52nd Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W), Jun. 2022, pp.52–58. DOI: 10.1109/DSNW54100.2022.00018.

  19. Zhang W B, Weiss J C. Fair decision-making under uncertainty. In Proc. the 2021 IEEE International Conference on Data Mining (ICDM), Dec. 2021, pp.886–895. DOI: 10.1109/ICDM51629.2021.00100.

  20. Chai Z, Ali A, Zawad S, Truex S, Anwar A, Baracaldo N, Zhou Y, Ludwig H, Yan F, Cheng Y. TiFL: A tier-based federated learning system. In Proc. the 29th International Symposium on High-Performance Parallel and Distributed Computing, Jun. 2020, pp.125–136. DOI: 10.1145/3369583.3392686.

  21. Ek S, Lalanda P, Portet F. Federated learning within pervasive heterogeneous environments. In Proc. the 2022 IEEE International Conference on Pervasive Computing and Communications Workshops and Other Affiliated Events (PerCom Workshops), Mar. 2022, pp.134–135. DOI: 10.1109/PerComWorkshops53856.2022.9767288.

  22. Zhou Y P, Fu Y, Luo Z X, Hu M, Wu D, Sheng Q Z, Yu S. The role of communication time in the convergence of federated edge learning. IEEE Trans. Vehicular Technology, 2022, 71(3): 3241–3254. DOI: https://doi.org/10.1109/TVT.2022.3144099.

  23. Karimireddy S P, Kale S, Mohri M, Reddi S J, Stich S U, Suresh A T. SCAFFOLD: Stochastic controlled averaging for federated learning. In Proc. the 37th International Conference on Machine Learning, Jul. 2020, pp.5132–5143.

  24. Rong Y J, Liu Y A. Staged text clustering algorithm based on K-means and hierarchical agglomeration clustering. In Proc. the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), Jun. 2020, pp.124–127. DOI: 10.1109/ICAICA50127.2020.9182394.

  25. Nadiger C, Kumar A, Abdelhak S. Federated reinforcement learning for fast personalization. In Proc. the 2nd IEEE International Conference on Artificial Intelligence and Knowledge Engineering (AIKE), Jun. 2019, pp.123–127. DOI: 10.1109/AIKE.2019.00031.

  26. Du X Q, He Y L, Huang J Z. Random sample partition-based clustering ensemble algorithm for big data. In Proc. the 2021 IEEE International Conference on Big Data (Big Data), Dec. 2021, pp.5885–5887. DOI: 10.1109/BigData52589.2021.9671297.

  27. Ji S Q, Xing R W. Clustering ensemble of massive data based on trusted region. In Proc. the 3rd International Conference on Machine Learning, Big Data and Business Intelligence (MLBDBI), Dec. 2021, pp.337–340. DOI: 10.1109/MLBDBI54094.2021.00070.

  28. Rizk E, Vlaski S, Sayed A H. Optimal importance sampling for federated learning. In Proc. the 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Jun. 2021, pp.3095–3099. DOI: 10.1109/ICASSP39728.2021.9413655.

  29. Gong B Y, Xing T Z, Liu Z D, Xi W, Chen X J. Adaptive client clustering for efficient federated learning over Non-IID and imbalanced data. IEEE Trans. Big Data, 2022. DOI: https://doi.org/10.1109/TBDATA.2022.3167994.

  30. Baghersalimi S, Teijeiro T, Atienza D, Aminifar A. Personalized real-time federated learning for epileptic seizure detection. IEEE Journal of Biomedical and Health Informatics, 2022, 26(2): 898–909. DOI: https://doi.org/10.1109/JBHI.2021.3096127.

  31. Mohammed I, Tabatabai S, Al-Fuqaha A, El Bouanani F, Qadir J, Qolomany B, Guizani M. Budgeted online selection of candidate IoT clients to participate in federated learning. IEEE Internet of Things Journal, 2021, 8(7): 5938–5952. DOI: https://doi.org/10.1109/JIOT.2020.3036157.

  32. Correa J, Cristi A, Feuilloley L, Oosterwijk T, Tsigonias-Dimitriadis A. The secretary problem with independent sampling. In Proc. the 32nd Annual ACM-SIAM Symposium on Discrete Algorithms, Jan. 2021, pp.2047–2058. DOI: 10.1137/1.9781611976465.122.

  33. Nishio T, Yonetani R. Client selection for federated learning with heterogeneous resources in mobile edge. In Proc. the 2019 IEEE International Conference on Communications (ICC), May 2019. DOI: https://doi.org/10.1109/ICC.2019.8761315.

  34. Yoshida N, Nishio T, Morikura M, Yamamoto K. MAB-based client selection for federated learning with uncertain resources in mobile networks. In Proc. the 2020 IEEE Globecom Workshops (GC Wkshps), Dec. 2020. DOI: https://doi.org/10.1109/GCWkshps50303.2020.9367421.

  35. Yin T, Li L X, Lin W S, Ma D H, Han Z. Grouped Federated Learning: A decentralized learning framework with low latency for heterogeneous devices. In Proc. the 2022 IEEE International Conference on Communications Workshops (ICC Workshops), May 2022, pp.55–60. DOI: 10.1109/ICCWorkshops53468.2022.9814558.

  36. Ma J H, Sun X H, Xia W C, Wang X J, Chen X, Zhu H B. Client Selection based on label quantity information for federated learning. In Proc. the 32nd IEEE Annual International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Sept. 2021. DOI: 10.1109/PIMRC50174.2021.9569487.

  37. Xu X H, Duan S J, Zhang J R, Luo Y Z, Zhang D Y. Optimizing federated learning on device heterogeneity with a sampling strategy. In Proc. the 29th IEEE/ACM International Symposium on Quality of Service (IWQOS), Jun. 2021. DOI: 10.1109/IWQOS52092.2021.9521361.

  38. Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, Killeen T, Lin Z M, Gimelshein N, Antiga L, Desmaison A, Köpf A, Yang E, DeVito Z, Raison M, Tejani A, Chilamkurthy S, Steiner B, Fang L, Bai J J, Chintala S. PyTorch: An imperative style, high-performance deep learning library. In Proc. the 33rd International Conference on Neural Information Processing Systems, Dec. 2019, Article No. 721.

  39. Giulini M, Menichetti R, Shell M S, Potestio R. An information-theory-based approach for optimal model reduction of biomolecules. Journal of Chemical Theory and Computation, 2020, 16(11): 6795–6813. DOI: https://doi.org/10.1021/acs.jctc.0c00676.

Author information

Corresponding author

Correspondence to Jun Wu.

Supplementary Information

ESM 1 (PDF 158 kb)

About this article

Cite this article

Mao, YC., Shen, LJ., Wu, J. et al. Federated Dynamic Client Selection for Fairness Guarantee in Heterogeneous Edge Computing. J. Comput. Sci. Technol. 39, 139–158 (2024). https://doi.org/10.1007/s11390-023-2972-9

