Abstract
Federated learning (FL) mitigates the problem of isolated data islands and improves machine learning models by training on data held by multiple parties without sharing it. However, traditional synchronous FL training suffers from stragglers: edge devices with insufficient computing power delay every round, which in turn degrades training accuracy and increases communication overhead. Asynchronous federated learning largely removes this bottleneck, but traditional privacy-preserving methods for FL do not carry over to the asynchronous setting, so existing asynchronous training schemes cannot guarantee data privacy. To address this challenge, this paper proposes WASecAgg, a weighted secure asynchronous federated learning scheme that preserves training accuracy and data privacy while maintaining the training efficiency of federated learning. Experiments show that WASecAgg performs better in terms of training efficiency, training accuracy and privacy preservation in federated learning.
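To make the two ingredients of the abstract concrete, the sketch below combines a staleness-based weighting of asynchronous client updates with Bonawitz-style pairwise additive masking, so the server only ever sees masked updates whose masks cancel in the sum. This is a minimal illustration under assumed choices (the polynomial staleness discount, the mask construction, and all function names are illustrative, not the actual WASecAgg protocol, which the paper defines in full).

```python
import numpy as np

def staleness_weight(staleness, a=0.5):
    # Polynomial staleness discount: older updates count less.
    # (One common choice; WASecAgg's own weighting may differ.)
    return (1.0 + staleness) ** (-a)

def pairwise_masks(n_clients, dim, seed=0):
    # Each pair (i, j) shares a pseudorandom mask; client i adds it
    # and client j subtracts it, so all masks cancel in the sum.
    # A real protocol derives these from Diffie-Hellman shared keys.
    rng = np.random.default_rng(seed)
    masks = np.zeros((n_clients, dim))
    for i in range(n_clients):
        for j in range(i + 1, n_clients):
            m = rng.normal(size=dim)
            masks[i] += m
            masks[j] -= m
    return masks

def secure_weighted_aggregate(updates, stalenesses):
    # updates: per-client model-update vectors received asynchronously.
    n, dim = len(updates), len(updates[0])
    w = np.array([staleness_weight(s) for s in stalenesses])
    w /= w.sum()  # normalize the staleness weights
    masks = pairwise_masks(n, dim)
    # Each client submits a masked, weighted update; the server
    # sums them and recovers only the aggregate, not any individual.
    masked = [w[i] * updates[i] + masks[i] for i in range(n)]
    return np.sum(masked, axis=0)
```

Because the masks sum to zero by construction, the server's result equals the plain staleness-weighted average, while no single masked update reveals a client's raw gradient.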
Acknowledgement
This work is supported by the National Natural Science Foundation of China (No. 62072369, 62072371) and the Graduate Innovation Foundation of Xi'an University of Posts and Telecommunications (No. CXJJZL2021025).
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Cao, D., Zhang, Y., Liu, W., Wei, X. (2022). Secure Asynchronous Federated Learning for Edge Computing Devices. In: Chen, X., Huang, X., Kutyłowski, M. (eds) Security and Privacy in Social Networks and Big Data. SocialSec 2022. Communications in Computer and Information Science, vol 1663. Springer, Singapore. https://doi.org/10.1007/978-981-19-7242-3_9
DOI: https://doi.org/10.1007/978-981-19-7242-3_9
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-7241-6
Online ISBN: 978-981-19-7242-3