Cybersecurity certification of Artificial Intelligence: a missed opportunity to coordinate between the Artificial Intelligence Act and the Cybersecurity Act

Published in International Cybersecurity Law Review

A Correction to this article was published on 20 April 2022


Abstract

In April 2021, the Commission published a draft proposal for a regulation on artificial intelligence (AI) systems aimed at striking a balance between the market need for a competitive and dynamic ecosystem and the need to minimise risks to the safety and fundamental rights of users and citizens. Among the set of obligations that apply to high-risk AI technologies, the AI Act includes a specific provision addressing the security and robustness of AI systems. This provision overlaps with existing legislation addressing cybersecurity, namely the certification process defined in Regulation 2019/881 on the European Union Agency for Cybersecurity and on information and communication technology cybersecurity certification. Although the AI Act hints at a possible path towards mutual recognition of certifications, a deeper analysis of the provisions and a comparison between the underlying features of the certification mechanisms show that the different approaches adopted in the two acts may undermine the goal of certification mechanisms as trust-enhancing and transparency instruments. As a result, this paper argues that the AIA proposal missed the opportunity to link and coordinate with the cybersecurity framework in a more structured way.

Fig. 1



Notes

  1. See the first reactions to the AIA Proposal by academics and civil society: Burri and von Bothmer (2021); Glauner (2021); Greenleaf (2021); Ebers (2021); BEUC (2021); EDRi (2021); ECNL (2021a).

  2. European Commission (2018).

  3. European Commission (2020a).

  4. European Commission (2020b).

  5. European Commission (2020c).

  6. European Commission (2020d).

  7. Note that the distinction is presented in the Explanatory memorandum to AIA, p. 12. See also Veale and Zuiderveen Borgesius (2021, p. 3).

  8. See Art. 5 AIA.

  9. The definition of high-risk technologies includes a non-exhaustive list provided in Annex III of the AIA. The annex lists inter alia biometric identification and categorisation of natural persons, and AI used in law enforcement activities, in migration, asylum and border control management, and in the administration of justice and democratic processes. Moreover, the AIA leaves room for future technological developments, providing that the list may be extended to any AI that may “pose a risk of harm to health and safety, or a risk of adverse impact on fundamental rights” (Art. 7(1) lit. b).

  10. See Art. 9 AIA.

  11. See Art. 10 AIA.

  12. See Art. 11 and 12 AIA.

  13. See Art. 14 AIA.

  14. See Art. 15 AIA.

  15. See Art. 43 AIA.

  16. See ECNL (2021b); EDRi (2021) and Article 19 (2021); Access now (2021).

  17. Note that Art. 29(6) AIA requires a data protection impact assessment, although it does not cover the larger number of fundamental rights that could be affected by AI decisions. See Access now (2021, p. 22); ECNL (2021b).

  18. Regulation (EU) 2019/881 on ENISA and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act).

  19. The paper does not claim to provide a systematic analysis of AIA certification frameworks, which means that some procedural or other aspects are not included in the analysis. In particular, analysis of the market control mechanisms in Arts. 61–68 AIA is not addressed in the paper.

  20. For an analysis of the different definitions of certification, see Lachaud (2019).

  21. See Directive 2009/48/EC on the safety of toys. The directive provides a set of standards addressing physical and mechanical properties, flammability, chemical properties and electrical properties in order for toys to receive the CE certification mark.

  22. See the case of Forest Stewardship certification, which provides a voluntary process for verifying responsible forest practices. The criteria cover not only the environmental impact of logging activities and the maintenance of the ecological functions and integrity of forests, but also the recognition of and respect for indigenous peoples’ rights, respect for workers’ rights in compliance with the International Labour Organisation (ILO) conventions, and the equitable use and sharing of benefits derived from forests.

  23. See the voluntary due diligence requirements provided in Regulation (EU) 2017/821 laying down supply chain due diligence obligations for Union importers of tin, tantalum and tungsten, their ores and gold originating from conflict-affected and high-risk areas. The regulation imposes an obligation on EU importers of tin, tantalum, tungsten and gold to verify whether goods purchased from third countries contributed to forced labour or other illicit activities.

  24. Third party assessment is performed by a party different to the organisation that seeks certification (first party) and different to the entity requiring the organisation to be certified (second party). For a detailed description of different types of certification, see Daskalova and Heldeweg (2019).

  25. See ISO/IEC 17000:2004 subclause 5.5. Note that ISO identifies different methods of conformity assessment such as testing, inspection (usually onsite assessments of the conformity of product samples or their production processes), sampling and audit (regarding the conformity of management systems).

  26. Note that the attestation of conformity can have a time limitation, which usually ranges from 5 to 10 years, in order to take into account subsequent sector-specific developments. In some cases, the certification can be renewed if conformity has been maintained for an equal duration.

  27. See the first formulation of this approach to the market in Viscusi (1978).

  28. See Regulation (EC) No 765/2008 setting out the requirements for accreditation and market surveillance relating to the marketing of products and repealing Regulation (EEC) No. 339/93. Note that Art. 49 AIA provides that the CE marking can be affixed visibly, legibly and indelibly for high-risk AI systems that comply with the requirements provided in the proposed AIA.

  29. See Regulation (EU) 2018/848 on organic production and labelling of organic products and repealing Council Regulation (EC) No 834/2007.

  30. See Art. 42 of Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). On the GDPR certification scheme, see Lachaud (2018); Rodrigues et al. (2016); Hornung and Bauer (2019).

  31. See Regulation (EU) 2019/881 on ENISA and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013.

  32. See also European Commission (2013).

  33. A well-known case was WannaCry, a ransomware cyberattack that affected computers running the Microsoft Windows operating system in May 2017. The attack infected some 200,000 computers across 150 countries, with an estimated economic impact of around $4 billion. For more, see https://en.wikipedia.org/wiki/WannaCry_ransomware_attack, accessed 11 October 2021.

  34. Recital 69 CSA.

  35. Weber and Studer (2016).

  36. Kamara (2020).

  37. A non-exhaustive list of requirements is included in Art. 54 CSA.

  38. See the example of the draft version of the Certification Scheme for Cloud Services, ENISA (2020a).

  39. According to Art. 22 CSA, the SCCG is composed of members selected from recognised experts representing relevant stakeholders (e.g. standardisation bodies, producers, providers, consumer associations, etc.). See the current members at https://digital-strategy.ec.europa.eu/en/library/stakeholder-cybersecurity-certification-group, accessed 11 October 2021.

  40. See the SCCG’s opinion on the Draft Union Rolling Work Programme adopted 19-02-2021, available at https://digital-strategy.ec.europa.eu/en/library/stakeholder-cybersecurity-certification-group, accessed 11 October 2021.

  41. According to Rec. 59 CSA, the members of the ad hoc working groups are selected according to the highest standards of expertise, aiming to ensure an appropriate balance according to the specific issues in question between the public administrations of the Member States, Union institutions, bodies, offices and agencies, and the private sector, including industry, users and academic experts in network and information security.

    For instance, the Ad Hoc Working Group set up for the preparation of a candidate EU cybersecurity certification scheme on cloud services was composed of 20 industry representatives (e.g. cloud service providers, cloud service customers, conformity assessment bodies) and 12 representatives from accreditation bodies and EU Member States. See ENISA (2020a, p. 4).

  42. See Art. 52 (6) and (7) CSA.

  43. See ENISA (2020a, pp. 19–20).

  44. Note that Art. 53(1) CSA also allows a conformity self-assessment to be carried out by the manufacturer or provider itself. However, this option is available only for the basic assurance level.

  45. See Art. 58 (8) (b) CSA.

  46. See Art. 58 (8) (f) CSA.

  47. In addition, the requirement for a peer review mechanism which subjects all national cybersecurity authorities to assessments by their ‘peers’, i.e. the competent authorities of other Member States, is a further safeguard against forum shopping.

  48. See Art. 30 AIA.

  49. See Art. 44 AIA. Note that the certificates may be valid for a maximum of five years and can be renewed based on a re-assessment in accordance with the conformity assessment procedures applicable.

  50. See Art. 61 AIA.

  51. See the Explanatory Memorandum, p. 8.

  52. The High-level expert group on artificial intelligence is a task force appointed by the European Commission to provide advice on its artificial intelligence strategy. For more, see https://digital-strategy.ec.europa.eu/en/policies/expert-group-ai, accessed 11 October 2021.

  53. The AI Alliance is a multi-stakeholder forum launched in June 2018. For more, see https://ec.europa.eu/digital-single-market/en/european-ai-alliance, accessed 11 October 2021.

  54. See the detailed list of stakeholder events and activities in Annex 2 of the Regulation Impact Assessment.

  55. Rec. 61 AIA points to the importance of standardisation and the application of Regulation (EU) No 1025/2012 on European standardisation.

  56. Note that in March 2021, CEN and CENELEC established a new Joint Technical Committee 21 on ‘Artificial Intelligence’, which will be in charge of developing standards for AI. See https://www.cencenelec.eu/news/brief_news/Pages/TN-2021-013.aspx, accessed 11 October 2021.

  57. Art. 41(1) AIA clarifies that the Commission will adopt the common specification in accordance with the examination procedure ex Art. 5 of Regulation 182/2011.

  58. See Art. 41(2) AIA.

  59. See Art. 58 AIA. Note that the composition of the Board includes only representatives of the national regulatory authorities, with the possibility of involving external experts and observers only on request. Therefore, the European AI board also does not adopt a more inclusive approach to potential stakeholders.

  60. See Art. 43(1) AIA.

  61. As previously mentioned, Art. 40 AIA acknowledges that compliance with harmonised standards should be presumed to be in conformity with the regulation’s requirements.

  62. Note that point 4.5 of Annex IV AIA provides that on a reasoned request the notified authority shall also be granted access to the source code of the AI system.

  63. Moreover, AI can play a major role in cybersecurity not only as a subject, i.e. as a technology that should be made responsive, robust and resilient to cyberthreats, but also as a tool able to ensure the responsiveness, robustness and resilience of other technologies. See Taddeo, McCutcheon and Floridi (2019).

  64. See also rec. 51 AIA, affirming “Cyberattacks against AI systems can leverage AI specific assets, such as training data sets (e.g. data poisoning) or trained models (e.g. adversarial attacks), or exploit vulnerabilities in the AI system’s digital assets or the underlying ICT infrastructure. To ensure a level of cybersecurity appropriate to the risks, suitable measures should therefore be taken by the providers of high-risk AI systems, also taking into account as appropriate the underlying ICT infrastructure”.

  65. Note that ENISA (2020b), on the other hand, acknowledges the interdependencies between AI and cybersecurity, distinguishing three main dimensions: cybersecurity for AI; AI to support cybersecurity; and malicious use of AI.

  66. See ENISA (2020b, p. 27).

  67. Kamara (2020).

References

Cited literature

  • Access now (2021) Access now’s submission to the European Commission’s adoption consultation on the artificial intelligence act. https://www.accessnow.org/cms/assets/uploads/2021/08/Submission-to-the-European-Commissions-Consultation-on-the-Artificial-Intelligence-Act.pdf. Accessed 11 Oct 2021

  • Article 19 (2021) EU: New proposal on artificial intelligence must protect human rights. https://www.article19.org/resources/eu-artificial-intelligence-and-human-rights/. Accessed 11 Oct 2021

  • BEUC (2021) Regulating AI to protect consumers—Position paper on the AI act. https://www.beuc.eu/publications/beuc-x-2021-088_regulating_ai_to_protect_the_consumer.pdf. Accessed 11 Oct 2021

  • Burri T, von Bothmer F (2021) The new EU legislation on artificial intelligence: a primer. https://ssrn.com/abstract=3831424. Accessed 11 Oct 2021

  • Daskalova V, Heldeweg M (2019) Challenges for responsible certification in institutional context: the case of competition law enforcement in markets with certification. In: Rott P (ed) Certification—Trust, accountability, liability. Springer, Cham, pp 23–71

  • Ebers M (2021) Standardizing AI—The case of the European Commission’s proposal for an artificial intelligence act. In: The Cambridge handbook of artificial intelligence: global perspectives on law and ethics. https://ssrn.com/abstract=3900378. Accessed 11 Oct 2021

  • ECNL (2021a) ECNL position statement on the EU AI Act. https://ecnl.org/news/ecnl-position-statement-eu-ai-act. Accessed 11 Oct 2021

  • ECNL (2021b) Evaluating the risk of AI systems to human rights from a tier-based approach. https://ecnl.org/news/evaluating-risk-ai-systems-human-rights-tier-based-approach. Accessed 11 Oct 2021

  • EDRi (2021) European Commission adoption consultation: Artificial intelligence act. https://edri.org/our-work/edri-submits-response-to-the-european-commission-ai-adoption-consultation/. Accessed 11 Oct 2021

  • ENISA (2020a) European cybersecurity certification scheme for cloud services (candidate scheme), published in December 2020 and subject to public consultation. https://www.enisa.europa.eu/publications/eucs-cloud-service-scheme. Accessed 11 Oct 2021

  • ENISA (2020b) AI cybersecurity challenges—Threat landscape for artificial intelligence. https://www.enisa.europa.eu/publications/artificial-intelligence-cybersecurity-challenges. Accessed 11 Oct 2021

  • European Commission (2013) Joint communication to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, cybersecurity strategy of the European Union: an open, safe and secure cyberspace. COM Join/2013/01/final

  • European Commission (2018) Communication on the strategy for artificial intelligence in Europe. Brussels. COM 2018:237

  • European Commission (2020a) White paper on AI. Brussels. COM 2020:65

  • European Commission (2020b) Proposal for a Regulation of the European Parliament and of the Council on European data governance (Data Governance Act). Brussels. COM/2020/767 final

  • European Commission (2020c) Proposal for a regulation of the European Parliament and of the council on a single market For digital services (digital services act) and amending directive 2000/31/EC. Brussels. COM/2020/825 final

  • European Commission (2020d) Proposal for a Regulation of the European Parliament and of the council on contestable and fair markets in the digital sector (digital markets act). Brussels. COM 2020:842 (final)

  • Glauner P (2022) An assessment of the AI regulation proposed by the European Commission. In: Ehsani S et al (eds) The future circle of healthcare: AI, 3D printing, longevity, ethics, and uncertainty mitigation. Springer, Cham (Forthcoming)

  • Greenleaf G (2021) The ‘Brussels effect’ of the EU’s ‘AI Act’ on data privacy outside Europe. Privacy Laws & Business International Report 171:3–7

  • Hornung G, Bauer S (2019) Privacy through certification?: The new certification scheme of the general data protection regulation. In: Rott P (ed) Certification—trust, accountability, liability. Studies in European economic law and regulation, vol 16. Springer, Cham, pp 109–131

  • Kamara I (2020) Misaligned Union laws? A comparative analysis of certification in the cybersecurity Act and the general data protection regulation, TILT Law & Technology Working Paper No. 002/2020. https://ssrn.com/abstract=3732846. Accessed 11 Oct 2021

  • Lachaud E (2019) What could be the contribution of certification to data protection regulation? Dissertation, Tilburg University

  • Lachaud E (2018) The general data protection regulation and the rise of certification as a regulatory instrument. Comput Law Secur Rev 34(2):244–256

  • Rodrigues R, Barnard-Wills D, De Hert P, Papakonstantinou V (2016) The future of privacy certification in Europe: an exploration of options under article 42 of the GDPR. Int Rev Law Comput Technol 30(3):248–270

  • Taddeo M, McCutcheon T, Floridi L (2019) Trusting artificial intelligence in cybersecurity is a double-edged sword. Nat Mach Intell 1:557–560

  • Veale M, Zuiderveen Borgesius F (2021) Demystifying the draft EU artificial intelligence act. Computer Law Review International 22(4). https://ssrn.com/abstract=3896852. Accessed 11 Oct 2021

  • Viscusi K (1978) A note on “Lemons” markets with quality certification. Bell J Econ 9:277

  • Weber R, Studer E (2016) Cybersecurity in the Internet of things: legal aspects. Comput Law Secur Rev 32(5):715–728

European legislation

  • Directive 2009/48/EC of the European Parliament and of the Council of 18 June 2009 on the safety of toys, OJ L 170, 30.06.2009, p. 1–37

  • Regulation (EC) No 765/2008 of the European Parliament and of the Council of 9 July 2008 setting out the requirements for accreditation and market surveillance relating to the marketing of products and repealing Regulation (EEC) No 339/93, OJ L 218, 13.08.2008, p. 30–47

  • Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L 119, 04.05.2016, p. 1–88

  • Regulation (EU) 2017/821 of the European Parliament and of the Council of 17 May 2017 laying down supply chain due diligence obligations for Union importers of tin, tantalum and tungsten, their ores, and gold originating from conflict-affected and high-risk areas, OJ L 130, 19.05.2017, p. 1–20

  • Regulation (EU) 2018/848 of the European Parliament and of the Council of 30 May 2018 on organic production and labelling of organic products and repealing Council Regulation (EC) No 834/2007

  • Regulation (EU) 2019/881 of the European Parliament and of the Council of 17 April 2019 on ENISA (the European Union Agency for Cybersecurity) and on information and communications technology cybersecurity certification and repealing Regulation (EU) No 526/2013 (Cybersecurity Act), OJ L 151, 07.06.2019, p. 1

  • Regulation (EU) No 1025/2012 of the European Parliament and of the Council of 25 October 2012 on European standardisation, OJ L 316, 14.11.2012, p. 12

Acknowledgements

This research was supported by the ERDF project “CyberSecurity, CyberCrime and Critical Information Infrastructures Center of Excellence” (No. CZ.02.1.01/0.0/0.0/16_019/0000822) in the framework of the visiting fellowship at Masaryk University. The author would like to thank Dianora Poletti, Francesca Fanucci and Radim Polcak for their comments and suggestions on earlier drafts of the paper. The usual disclaimer applies.

Author information

Corresponding author

Correspondence to Federica Casarosa.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Casarosa, F. Cybersecurity certification of Artificial Intelligence: a missed opportunity to coordinate between the Artificial Intelligence Act and the Cybersecurity Act. Int. Cybersecur. Law Rev. 3, 115–130 (2022). https://doi.org/10.1365/s43439-021-00043-6
