A Review of Privacy-Preserving Federated Learning for the Internet-of-Things

Chapter in Federated Learning Systems

Part of the book series: Studies in Computational Intelligence (SCI, volume 965)

Abstract

The Internet-of-Things (IoT) generates vast quantities of data, much of which is attributable to human activities and behavior. Collecting personal data and executing machine learning tasks on it in a central location presents a significant privacy risk to individuals, as well as challenges in communicating the data to the cloud (e.g., where the data is particularly large or updated very frequently). Analytics based on machine learning, and in particular deep learning, benefit greatly from large amounts of data when developing high-performance predictive models. This work reviews federated learning (FL) as an approach for performing machine learning on distributed data, with the aim of protecting the privacy of user-generated data. We highlight pertinent challenges in an IoT context, such as reducing the communication costs associated with data transmission, learning from data under heterogeneous conditions, and applying additional privacy protections to FL. Throughout this review, we identify the strengths and weaknesses of the different methods applied to FL, and finally we outline future directions for privacy-preserving FL research, with a particular focus on IoT applications.
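
The abstract's central idea, training a shared model across distributed devices without pooling their raw data, can be illustrated with a small federated averaging sketch. The Python snippet below is a minimal, hypothetical example rather than the chapter's implementation: each simulated client runs a few local gradient steps on its own data, and a server combines the returned weights with a data-size-weighted average. All names, the toy linear model, and the simulated clients are assumptions made purely for illustration.

```python
# Minimal sketch of a federated-averaging-style training loop over simulated
# clients. The toy linear model and all names are illustrative assumptions,
# not code from the chapter; only raw model weights leave each client.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Run a few epochs of local gradient descent on one client's data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """One communication round: broadcast, local training, weighted averaging."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(global_w, X, y))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Simulated, non-identically distributed client datasets (different input shifts).
true_w = np.array([2.0, -1.0])
clients = []
for shift in (-1.0, 0.0, 1.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    y = X @ true_w + rng.normal(0, 0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(20):
    global_w = federated_round(global_w, clients)
print("learned weights:", global_w)   # approaches true_w without pooling raw data
```

In a real IoT deployment the per-round payload is the model update itself, which is why the communication-cost reductions and privacy protections highlighted in the abstract apply directly to this exchange.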



Acknowledgements

This work is partly supported by the SEND project (grant ref. 32R16P00706) funded by ERDF and BEIS.

Author information

Corresponding author

Correspondence to Christopher Briggs.



Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Briggs, C., Fan, Z., Andras, P. (2021). A Review of Privacy-Preserving Federated Learning for the Internet-of-Things. In: Rehman, M.H.u., Gaber, M.M. (eds) Federated Learning Systems. Studies in Computational Intelligence, vol 965. Springer, Cham. https://doi.org/10.1007/978-3-030-70604-3_2
