
Quis Custodiet IP-sos Algorithmos?

Competition-Related Data Handling Obligations and Algo Auditing Under the DMA

Chapter in: Kreation Innovation Märkte - Creation Innovation Markets

Abstract

Compliance obligations imposed by the Digital Markets Act (DMA) and by general competition law frequently require the employment of algorithmic software, especially where they address the handling of data. Such algorithmic software must itself be audited for compliance. Auditing algorithms will therefore become a major part of future law enforcement, not only under the DMA and general competition law but also in many other areas, such as AI risk regulation or the responsibilities of digital service providers. Through comparative benchmarking across legislative areas and systematic interpretation, this contribution seeks to identify key features of an appropriate algo auditing regime for competition-related obligations and discusses how they could be established, in particular, under the DMA. Such features include the lifecycle stages at which algorithms should be audited, the appropriate legal bases for the audits, methodologies and agents, as well as compliance assessment parameters and the possibility of safe harbors. The contribution also briefly highlights that IP and data protection rules may require modification to allow for appropriate algo auditing.

Prof. Dr. Peter Georg Picht, LL.M. (Yale), Chair for Economic, Competition and Intellectual Property Law, Chairman of the Center for Intellectual Property and Competition Law (CIPCO), University of Zurich.

Notes

  1.

    See, for instance, Klein (2021), pp. 539 et seq.; Abada/Lambin (2022), pp. 13 et seq.; Calvano et al. (2020), pp. 3268 et seq.; Harrington (2022), p. 6889; Assad et al. (2020), pp. 15 et seq.

  2.

    See, for instance, Ezrachi/Stucke (2016); Picht/Loderer (2019), p. 391; Bernhardt/Dewenter (2020), p. 312; Thomas (2021), p. 293.

  3.

    Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277/1, 27 October 2022.

  4.

    Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, SEC(2021) 167 final.

  5.

    On the requirements for Gatekeeper status, see Art. 3 DMA.

  6.

    Basedow (2021), p. 225.

  7.

    Cf. Opinion of Advocate General Rantos, C-252/21, 20.9.2022, Meta (formerly Facebook)/Bundeskartellamt, ECLI:EU:C:2022:704.

  8.

    GC, T-612/17, 10.11.2021, Google and Alphabet / Commission (Google Shopping), ECLI:EU:T:2021:763.

  9.

    See, for instance, Graef (2018), p. 541.

  10.

    Cheng/Nowag (2021), pp. 9–10.

  11.

    On the integration of ChatGPT into the Microsoft ecosystem, see Watson Blog (2023) Microsoft Office auf Steroiden – so wird die Büro-Software mit KI aufgemotzt. https://www.watson.ch/digital/k%C3%BCnstliche%20intelligenz/548350294-chatgpt-so-ruestet-microsoft-seine-office-apps-mit-ki-funktionen-auf. Accessed 10 June 2023.

  12.

    On the use of algos for implementing data-related conduct obligations, see also Recital (28) DSA Delegated Audit Regulation.

  13.

    See, for instance, Art. 9 draft AI Act on risk management systems.

  14.

    See, for instance, Art. 23 et seq. DSA.

  15.

    Art. 22 GDPR; Lepri et al. (2018), p. 611.

  16.

    The references to Art. 20 DMA proceedings and to violations of Art. 5–7, 13 DMA in Art. 8(2) DMA show that the “implementing acts” are case-specific rather than intended to lay out a general, abstract grid of compliance measures.

  17.

    The Commission’s issuance of a delegated act is subject to the requirements in Art. 49 DMA, including consultation with experts designated by the Member States and the absence of objections from either the European Parliament or the Council.

  18.

    Digital Regulation Cooperation Forum, Auditing algorithms: the existing landscape, role of regulators and future outlook, 23 September 2022 (hereinafter: DRCF (2022) Auditing Algorithms), 33.

  19.

    Recital (5) DSA Delegated Audit Regulation.

  20.

    Cf. also the DSA Delegated Audit Regulation, already taking this approach for the DSA.

  21.

    On commitments as an important source of audit matters, see also Recital (16), Art. 2(4), Art. 17(1)(b) DSA Delegated Audit Regulation.

  22.

    On the following, see DRCF (2022) Auditing Algorithms, 17 et seq.; Lomas (2023).

  23.

    Cf. Sec. 5.3 Annex VII to the Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, SEC(2021) 167 final.

  24.

    Commission delegated regulation laying down rules on the performance of audits for very large online platforms and very large online search engines, Ares(2023)3171302 – 05/05/2023.

  25.

    Cf., for instance, Recitals (13) and (25) DSA Delegated Audit Regulation, arguably considering (in a nutshell) all three main techniques; Recitals (29) and (30) DSA Delegated Audit Regulation, with a focus on governance and empirical audits; and Art. 10(5)(c) DSA Delegated Audit Regulation on empirical audits.

  26.

    Cf., for the DSA context, the DSA Delegated Audit Regulation and its pertinent statements (see above, fn. 24).

  27.

    On the assessment of data samples as an appropriate method, but also on the importance of composing a sample that correctly reflects the algo performance, see Recital (33), Art. 12 DSA Delegated Audit Regulation.

  28.

    On the use of such techniques by competition authorities, see for instance CMA Digital Comparison Tools (DCT), Mystery Shopping Research, Technical Report 2017.

  29.

    On legal uncertainties regarding scraping, see Leistner/Zurth (2021), § 49 UrhG Rn. 137 et seq. For the DSA, the Commission also considered it helpful to explicitly mention the permissibility of sandboxing, cf. Recital (30) DSA Delegated Audit Regulation.

  30.

    Cf., for instance, Art. 5(2) DSA Delegated Audit Regulation.

  31.

    See Art. 5(4) TEU, Art. 52(1) CFR; CJEU C-134/15, ECLI:EU:C:2016:498, para. 33 – Lidl/Freistaat Sachsen; Streinz (2018) GR-Charta 52 Rn. 16.

  32.

    According to Art. 51 of the EU Charter of Fundamental Rights, the principle of proportionality in the restriction of fundamental rights, as set out in Art. 52 of the Charter, applies to all “institutions, bodies, offices and agencies of the Union”. In addition to the entities created by or based on the Treaties, private-law entities controlled by the EU are also subject to fundamental rights obligations; Holoubek/Oswald (2019), GRC-Kommentar 51 para. 9. The same must apply to private actors to which the EU or the EU Member States have delegated sovereign powers; Jarass (2021), GRCh 51 para. 15, 19, 37; Stern/Hamacher (2016), GRCh para. 103; Oesch (2019), para. 746; cf. also Breitenmoser/Weyeneth (2017), para. 1198 et seqq. A direct or indirect binding effect on private individuals, by contrast, is much less established in EU law, see CJEU, 6.11.2018, Cases C-569/16 and C-570/16 (Bauer), para. 87 et seq.; Schlachter (2019), p. 53; Seyderhelm (2021), pp. 155 et seq.; Stern/Hamacher (2016), GRCh para. 103; Jarass (2021), GRCh 51 para. 37 et seqq. This may jeopardize auditees’ legitimate interests where auditors, even though they perform tasks envisaged by the DMA, are not considered to exercise state powers.

  33.

    DRCF (2022) Auditing Algorithms, 15.

  34.

    European Parliamentary Research Service, Auditing the quality of datasets used in algorithmic decision-making systems, July 2022, 35.

  35.

    Cf., for instance, the IEEE standard P7003 on Algorithmic Bias Considerations, https://standards.ieee.org/ieee/7003/6980/; on this standard and, more generally, the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, see Koene/Dowthwaite/Seth (2018), pp. 38–41.

  36.

    Meaning that the audit is carried out by an ECN member particularly well positioned to perform it.

  37.

    Opinion of Advocate General Rantos, C-252/21, 20.9.2022, Meta (formerly Facebook) / Bundeskartellamt, ECLI:EU:C:2022:704, especially paras. 27 et seq. On the opinion, see Picht/Akeret (2023), GRUR (forthcoming).

  38.

    The general algo audit discourse has also identified the coordination issues that can result from the incoherent involvement of multiple agents; see, for instance, DRCF (2022) Auditing Algorithms, 42 et seq. on issues resulting from a multitude of parallel regulatory sandboxes.

  39.

    Cf. Recital (3) DSA Delegated Audit Regulation.

  40.

    The need for specific expertise regarding different aspects of the audited matter may require the involvement of several specialized auditors, see Recital (3) DSA Delegated Audit Regulation.

  41.

    Cf. Recital (9), Art. 4 DSA Delegated Audit Regulation.

  42.

    DRCF (2022) Auditing Algorithms, 27, referring to stakeholder responses the authorities received.

  43.

    See Recital (8) DSA Delegated Audit Regulation.

  44.

    In particular, Art. 37(3) DSA permits auditors to provide their auditees with services unrelated to the audited matter. They may even have provided, or may later provide, services related to the audited matter, as long as a 12-month gap separates such services from the audit. EU financial auditing rules, for instance, entirely prohibit the provision of certain important services during the auditing period, even where the services do not relate to the audited matter, see Art. 5(1)(a) Regulation (EU) No 537/2014 on specific requirements regarding statutory audit of public-interest entities, OJ L 158/77, 27.5.2014.

  45.

    Cf., for instance, Facebook’s Fairness Flow, https://ai.facebook.com/blog/how-were-using-fairness-flow-to-help-build-ai-that-works-better-for-everyone; IBM’s AI 360 Toolkit, https://developer.ibm.com/articles/instilling-trust-in-ai; Google’s Model Reporting Cards, https://ai.googleblog.com/2020/07/introducing-model-card-toolkit-for.html. All sources accessed 10 June 2023.

  46.

    Cf. also Art. 5(2) draft Data Act, which prohibits Gatekeepers from acting as third-party recipients of data under that Act.

  47.

    European Parliamentary Research Service, Auditing the quality of datasets used in algorithmic decision-making systems, July 2022, 35.

  48.

    Karbaum/Schulz (2022), p. 107; Basedow (2021), p. 217; Richter/Gömann (2023), p. 208.

  49.

    E.g. Karbaum/Schulz (2022), p. 109; Richter/Gömann (2023), pp. 209 et seq.; for a critical view, see Zimmer/Göhsl (2021), p. 52.

  50.

    Generally on these in the context of algo audits, see DRCF (2022) Auditing Algorithms, 29 et seq.

  51.

    DRCF (2022) Auditing Algorithms, 28.

  52.

    Recital (2) DSA Delegated Audit Regulation.

  53.

    Kim (2017), pp. 191 et seq. with further references; Explanatory Memorandum to the DSA Delegated Audit Regulation, p. 3.

  54.

    See Art. 2(13), Art. 8(1)(a) DSA Delegated Audit Regulation.

  55.

    In the DSA Delegated Audit Regulation terminology, such certification is expressed through audit conclusions and audit opinions, see Art. 8 DSA Delegated Audit Regulation.

  56.

    Cf. also the “materiality threshold” introduced in Art. 2(12), Art. 10(2)(a) DSA Delegated Audit Regulation, which helps prevent any deviation from optimal compliance from resulting in a negative audit outcome.

  57.

    On these requirements for algo design, see for instance ico/The Alan Turing Institute, Explaining decisions made with AI, 17 et seq.; Lomas (2023).

  58.

    DRCF (2022) Auditing Algorithms, 27.

  59.

    On a potential audit blocking effect of trade secrets-based defenses by the algo controller, see also DRCF (2022) Auditing Algorithms, 29.

  60.

    Cf. Recital (13), Art. 5(2) DSA Delegated Audit Regulation.

References

  • Abada I, Lambin X (2022) Artificial intelligence: can seemingly collusive outcomes be avoided? https://doi.org/10.2139/ssrn.3559308

  • Assad S, Clark R, Ershov D, Xu L (2020) Algorithmic pricing and competition: empirical evidence from the German retail gasoline market. CESifo working paper no. 8521:15–73. https://www.cesifo.org/DocDL/cesifo1_wp8521.pdf. Accessed 10 June 2023

  • Basedow J (2021) Das Rad neu erfunden: Zum Vorschlag für einen Digital Markets Act. Zeitschrift für Europäisches Privatrecht:217–226

  • Bernhardt L, Dewenter R (2020) Collusion by code or algorithmic collusion? When pricing algorithms take over. Eur Comp J 16:312–342. https://doi.org/10.1080/17441056.2020.1733344

  • Breitenmoser S, Weyeneth R (2017) Europarecht. Unter Einbezug des Verhältnisses Schweiz–EU, 4. Aufl. Dike, Zurich

  • Calvano E, Calzolari G, Denicolò V, Pastorello S (2020) Artificial intelligence, algorithmic pricing, and collusion. Am Econ Rev 110(10):3267–3297. https://doi.org/10.1257/aer.20190623

  • Cheng TK, Nowag J (2021) Algorithmic predation and exclusion. LundLawCompWP 1/2022, University of Hong Kong faculty of law research paper no. 2022/05. https://doi.org/10.2139/ssrn.4003309

  • Ezrachi A, Stucke M (2016) Virtual competition. J Eur Comp Law Practice 7(9):585–586. https://doi.org/10.1093/jeclap/lpw083

  • Graef I (2018) Algorithms and fairness: what role for competition law in targeting price discrimination towards end consumers. Columbia J Eur Law 24(3):541–559. https://doi.org/10.2139/ssrn.3090360

  • Harrington JE (2022) The effect of outsourcing pricing algorithms on market competition. Manag Sci 68(9):6889–6906. https://doi.org/10.1287/mnsc.2021.4241

  • Holoubek M, Oswald M (2019) GRC 51. In: Holoubek M, Lienbacher G (Hrsg) GRC-Kommentar Charta der Grundrechte der Europäischen Union, 2. Aufl. Manz, Vienna

  • Jarass HD (2021) Charta der Grundrechte der Europäischen Union: GRCh, 4. Aufl. C.H. Beck, Munich

  • Karbaum C, Schulz M (2022) “Antitrust Litigation 2.0” – Private Enforcement beim DMA?. Neue Zeitschrift für Kartellrecht:107–112

  • Kim P (2017) Auditing algorithms for discrimination. 166 U. Pa. L. Rev. Online 189, Washington University in St. Louis legal studies research paper no. 17-12-03:189–203

  • Klein T (2021) Autonomous algorithmic collusion: Q-learning under sequential pricing. RAND J Econ 52(3):538–558. https://doi.org/10.1111/1756-2171.12383

  • Koene A, Dowthwaite L, Seth S (2018) IEEE P7003™ standard for algorithmic bias considerations: work in progress paper. FairWare ’18: proceedings of the international workshop on software fairness:38–41. https://doi.org/10.1145/3194770.3194773

  • Leistner M, Zurth P (2021) In: Loewenheim U (Hrsg) Handbuch des Urheberrechts, 3. Aufl. C.H. Beck, Munich. § 49 Rn. 137 et seq

  • Lepri B, Oliver N, Letouzé E, Pentland A, Vinck P (2018) Fair, transparent, and accountable algorithmic decision-making processes. Philos Technol 31:611–627. https://doi.org/10.1007/s13347-017-0279-x

  • Lomas N (2023) Europe spins up AI research hub to apply accountability rules on Big Tech, 18 April 2023. https://techcrunch.com/2023/04/18/ecat/. Accessed 10 June 2023

  • Oesch M (2019) Europarecht, vol. I, 2. Aufl. Stämpfli, Berne

  • Picht PG, Akeret C (2023) Back to stage one? – AG Rantos’ opinion in the Meta (Facebook) case. Gewerblicher Rechtsschutz und Urheberrecht 2023 (forthcoming). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4414591

  • Picht PG, Loderer G (2019) Framing algorithms: competition law and (other) regulatory tools. World Comp 42(3):391–418. https://doi.org/10.2139/ssrn.3275198

  • Richter A-C, Gömann M (2023) Private enforcement des DMA – Ein Ausblick am Beispiel Amazons. Neue Zeitschrift für Kartellrecht:208–213

  • Schlachter M (2019) Drittwirkung von Grundrechten der EU-Grundrechtecharta. Zeitschrift für europäisches Sozial- und Arbeitsrecht:53–58. https://doi.org/10.37307/j.1868-7938.2019.02.04

  • Seyderhelm M (2021) Grundrechtsbindung Privater. Nomos, Baden-Baden

  • Stern K, Hamacher A (2016) In: Stern K, Sachs M (Hrsg) Europäische Grundrechte-Charta: GRCh. C.H. Beck, Munich

  • Streinz R (2018) GRC 52. In: Streinz R, Michl W (Hrsg) EUV/AEUV. Vertrag über die Europäische Union, Vertrag über die Arbeitsweise der Europäischen Union, Charta der Grundrechte der Europäischen Union, Beck’sche Kurzkommentare, Bd 57, 3. Aufl. C.H. Beck, Munich

  • Thomas S (2021) Auslegung des Kartellverbots bei Kollusion durch Algorithmen. In: Zimmer D (ed) Regulierung für Algorithmen und Künstliche Intelligenz. Nomos, Baden-Baden, pp 293–310. https://doi.org/10.5771/9783748927990

  • Zimmer D, Göhsl J-F (2021) Vom New Competition Tool zum Digital Markets Act: Die geplante EU-Regulierung für digitale Gatekeeper. Zeitschrift für Wettbewerbsrecht: 29 et seq

Copyright information

© 2024 The Author(s), under exclusive license to Springer-Verlag GmbH, DE, part of Springer Nature

About this chapter

Cite this chapter

Picht, P.G. (2024). Quis Custodiet IP-sos Algorithmos?. In: Thouvenin, F., Peukert, A., Jaeger, T., Geiger, C. (eds) Kreation Innovation Märkte - Creation Innovation Markets. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-68599-0_64

  • DOI: https://doi.org/10.1007/978-3-662-68599-0_64

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-68598-3

  • Online ISBN: 978-3-662-68599-0
