Monitoring Concept Drift in Continuous Federated Learning Platforms

  • Conference paper
  • Advances in Intelligent Data Analysis XXII (IDA 2024)

Abstract

Continuous federated learning (CFL), a recently emerging learning paradigm that facilitates collaborative yet privacy-preserving machine learning (ML), bears the potential to shape the future of distributed ML. Despite this potential, CFL is, like continuous ML, prone to concept drift (a change in data properties over time). Consequently, CFL can greatly benefit from drift detection in order to react adequately to emerging drifts. Although various drift detection approaches exist, existing research neither applies them to CFL with dynamic client participation nor analyzes in detail the respective advantages of different approaches, such as error-based and data-based drift detection. To this end, we apply both kinds of drift detection to a CFL platform that allows new clients to join even after training has started, and we measure the negative impact of concept drift on model performance. Moreover, we uncover distinct differences between error-based and data-based drift detection. In particular, we find the former to be more suitable for detecting the point in time at which the joint model stops benefiting from concept drift, whereas the latter allows for a more precise detection of the first occurrence of concept drift.
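
To illustrate the two detector families contrasted in the abstract, the sketch below runs an error-based and a data-based drift detector on a synthetic stream. It is a minimal illustration, not the authors' implementation: the class names, window sizes, shift magnitudes, and thresholds are assumptions chosen for the example. The error-based detector monitors the model's per-round prediction error, while the data-based detector monitors shifts in the feature statistics of incoming data.

import numpy as np

class ErrorBasedDetector:
    """Flags drift once the recent mean prediction error clearly exceeds a reference error."""

    def __init__(self, ref_window=50, recent_window=50, tolerance=1.5):
        self.ref_window = ref_window
        self.recent_window = recent_window
        self.tolerance = tolerance
        self.errors = []
        self.reference = None

    def update(self, error):
        self.errors.append(float(error))
        if self.reference is None:
            # the first ref_window rounds define the drift-free error baseline
            if len(self.errors) >= self.ref_window:
                self.reference = float(np.mean(self.errors))
            return False
        recent = float(np.mean(self.errors[-self.recent_window:]))
        return recent > self.tolerance * self.reference


class DataBasedDetector:
    """Flags drift once recent feature means deviate strongly from a reference window."""

    def __init__(self, ref_window=50, recent_window=10, z_threshold=2.0):
        self.ref_window = ref_window
        self.recent_window = recent_window
        self.z_threshold = z_threshold
        self.points = []
        self.mu = None
        self.sigma = None

    def update(self, x):
        self.points.append(np.asarray(x, dtype=float))
        if self.mu is None:
            # the first ref_window points define the drift-free feature distribution
            if len(self.points) >= self.ref_window:
                ref = np.stack(self.points)
                self.mu = ref.mean(axis=0)
                self.sigma = ref.std(axis=0) + 1e-8
            return False
        recent = np.stack(self.points[-self.recent_window:]).mean(axis=0)
        z = np.abs(recent - self.mu) / self.sigma
        return bool(np.any(z > self.z_threshold))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    err_det, data_det = ErrorBasedDetector(), DataBasedDetector()
    fired = {"error": None, "data": None}
    for t in range(300):
        drifted = t >= 150                                       # synthetic concept drift sets in at round 150
        x = rng.normal(3.0 if drifted else 0.0, 1.0, size=4)     # client features for this round
        error = abs(rng.normal(0.8 if drifted else 0.2, 0.05))   # per-round model error
        if fired["data"] is None and data_det.update(x):
            fired["data"] = t
        if fired["error"] is None and err_det.update(error):
            fired["error"] = t
    print(f"data-based detector fired at round {fired['data']}, "
          f"error-based detector fired at round {fired['error']}")

Both detectors fire shortly after the synthetic drift at round 150 in this toy stream; the abstract's comparison concerns how these two signal types behave in a CFL platform where clients may join after training has started.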

Acknowledgements

This research was partially funded by the German Federal Ministry of Health as part of the KINBIOTICS project. The research of Philipp Cimiano is partially funded by the Ministry of Culture and Science of North Rhine-Westphalia under grant no. NW21-059A SAIL.

Author information

Corresponding author

Correspondence to Christoph Düsing.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Düsing, C., Cimiano, P. (2024). Monitoring Concept Drift in Continuous Federated Learning Platforms. In: Miliou, I., Piatkowski, N., Papapetrou, P. (eds) Advances in Intelligent Data Analysis XXII. IDA 2024. Lecture Notes in Computer Science, vol 14642. Springer, Cham. https://doi.org/10.1007/978-3-031-58553-1_7

  • DOI: https://doi.org/10.1007/978-3-031-58553-1_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-58555-5

  • Online ISBN: 978-3-031-58553-1

  • eBook Packages: Computer Science, Computer Science (R0)
