
Artificial Intelligence and the Financial Markets: Business as Usual?

A chapter in: Regulating Artificial Intelligence

Abstract

AI and financial markets go well together. The promise of speedy calculations, massive data processing and accurate predictions is too tempting to pass up for an industry in which almost all actors operate exclusively according to a profit-maximising logic. Hence, the strong mathematical prerequisites of financial decision-making give rise to the question: Why do financial markets require a human element anyway? The question is largely rhetorical, given the limited complexity of most current AI tools. However, AI tools have been used in finance since the early 1990s, and the push to overcome faulty computing and other shortcomings has been palpable ever since. Digitalization has amplified efforts and possibilities. Institutions with business models based on AI are entering the market by the hundreds; banks and insurers are either spinning off their AI expertise to foster its growth or paying billions to acquire it. There is no way around AI, at least in certain parts of the financial markets. This article outlines the developments concerning the application of AI in the financial markets and discusses the difficulties pertaining to its sudden rise. It illustrates the diverse fields of application (Sect. 1) and delineates the approaches that major financial regulators are taking towards AI (Sect. 2). In a next step, governance through and of AI is discussed (Sect. 3). The article concludes with the main problems that result from a reluctant approach towards AI (Sect. 4).


Notes

1. BaFin (2018), p. 19.
2. WEF (2018).
3. FSB (2017a), p. 1.
4. Edwards, pp. 97 et seq.: the most common being advice algorithms that compute not only market data but also the personal information of users in order to give guidance to consumers when they engage in business with a firm.
5. Lightbourne (2017), pp. 652 et seq.
6. KPMG (2016), p. 3.
7. Robo-advisers are not necessarily chatbots; on their utilization see Hennemann, para 12.
8. Chiu (2016), p. 88.
9. Bradley (2018), p. 74.
10. Chiu (2016), p. 89.
11. Lightbourne (2017), pp. 663 et seq.
12. On the potential influence of flawed data aggregation see Tischbirek, para 7.
13. Bruckner (2018), p. 13.
14. Odinet (2018), p. 785. On the implications of such algorithmic decisions for a person's autonomy see Ernst, paras 4–5.
15. Odinet (2018), pp. 783, 800 et seq.
16. Odinet (2018), pp. 829 et seq.
17. Portfolio management, however, is still a central function of insurers.
18. WEF (2018), p. 111.
19. E.g. Balasubramanian et al. (2018).
20. Ling (2014), p. 568.
21. Yadav (2015), pp. 1618 et seq.
22. Narang (2013), pp. 8 et seq.
23. Ling (2014), p. 572.
24. BaFin (2018), p. 11.
25. Seddon and Currie (2017), p. 300.
26. Osipovich (2017).
27. Seddon and Currie (2017), p. 305.
28. Kaastra and Boyd (1996), p. 234.
29. Trippi and DeSieno (1992), pp. 27 et seq.
30. Arévalo et al. (2016), p. 424.
31. Schmidhuber (2015), p. 85.
32. FSB (2017a), p. 18.
33. FSB (2017a), p. 19.
34. FSB (2017b), p. 3.
35. BCBS (2011), pp. 12 et seq. For a short summary of its legal status see Schemmel (2016), pp. 460 et seq.
36. For a discussion of the relation between risk-weighted bank capital and the minimum leverage ratio, which is not risk-weighted and was introduced by Basel III as a backstop to risk-based capital requirements, see Gambacorta and Karmakar (2016), pp. 3 et seq.
37. Angelini et al. (2008) and Danielsson et al. (2017).
38. FSB (2017a), p. 16.
39. Anagnostopoulos (2018).
40. 31 CFR 103.121 (USA); Article 8 Directive 2005/60/EC (European Union), sometimes referred to as 'customer due diligence'.
41. Craig (2018).
42. Aziz and Dowling (2018), p. 10. On other possible applications see Neufang (2017).
43. Milne (2018).
44. Nordea (2018). On the competition implications of data power see Hennemann, paras 20 et seq.
45. Zetsche et al. (2018), p. 410.
46. For further discussion of the following see Chiu (2016), pp. 71 et seq.
47. Chiu (2016), pp. 83 et seq.
48. BaFin (2018), pp. 65 et seq.
49. Lin (2015a), p. 655.
50. Lin (2016), pp. 168 et seq.
51. FSB (2017a).
52. On the following see FSB (2017a), pp. 24 et seq.
53. Cf. paras 6 et seq. and 2 et seq.
54. Cf. paras 10 et seq.
55. FSB (2017a), p. 1.
56. On so-called 'explainable AI' see Wischmeyer, paras 27 et seq., and Rademacher, para 33.
57. FSB (2017a), p. 2.
58. Joint Committee (2018).
59. Joint Committee (2018), p. 22.
60. European Commission (2018).
61. EBA (2018).
62. Directive 2014/65/EU; Regulation No. 600/2014.
63. On the following see Čuk and Waeyenberge (2018).
64. Article 17 Directive 2014/65/EU.
65. Article 26 para 3 Regulation No. 600/2014.
66. Article 48 Directive 2014/65/EU.
67. Article 17 para 2 subpara 5 Directive 2014/65/EU.
68. Article 1 lit. c Executive Order 13772 on Core Principles for Regulating the United States Financial System.
69. US Treasury (2018).
70. US Treasury (2018), pp. 56 et seq.
71. US Treasury (2018), p. 59.
72. FINRA (2018), pp. 6 et seq.
73. 17 CFR 240.15b9-1. On the political background see Bain (2018).
74. On the most recent proposal see Morelli (2017), pp. 220 et seq. For a summary of the events leading to the current system see Poirier (2012).
75. BaFin (2018).
76. Cf. para 21.
77. This requirement was established to transpose the almost identically worded Article 17(2) 1 Directive 2014/65/EU. The registration duty exists to enable supervisory authorities to make use of their auditing powers under Article 17(2) 2 Directive 2014/65/EU; § 4(1) WpHG.
78. Even though the European Securities and Markets Authority (ESMA) has already published guidelines further specifying some of the requirements: Guideline 2 ESMA/2012/122 (EN). On guidelines and their (quasi-)legal effect see Schemmel (2016), pp. 459 et seq.
79. This borders on a principle-based approach, see Schemmel (2016), pp. 487 et seq.
80. BaFin (2018), p. 3.
81. Bundesregierung (2018). For a comparison see House of Lords (2017).
82. On the following see FCA (2018a).
83. FCA (2017a), pp. 4 et seq. Arner et al. (2017), p. 371: a better term might therefore be 'clinical trial'.
84. On the concept of an umbrella sandbox see Zetsche et al. (2017), pp. 85 et seq.
85. FCA (2017a), pp. 6 et seq.
86. E.g. Australia, Singapore, Switzerland, Hong Kong, Thailand, Abu Dhabi and Malaysia. On this, with further references, see Arner et al. (2017), p. 371.
87. Thomas (2018); rather hesitant Peirce (2018).
88. FCA (2018b).
89. On challenges of digitalisation see Lin (2017), pp. 1253 et seq.
90. On certain specifics of AI-based agency supervision see Hermstrüwer, paras 20 et seq.
91. Broeders and Prenio (2018), p. 10.
92. Bauguess (2017).
93. Bauguess (2017); for an account of constitutional frameworks and major issues concerning AI and law enforcement in general see Rademacher, paras 13 et seq.
94. FCA (2017b).
95. FINRA (2018), pp. 8 et seq. (machine-readable rulebook).
96. On the following see Baxter (2016), pp. 589 et seq.; Guihot et al. (2017), pp. 436 et seq. For a discussion of innovative regulation with regard to legal tech see Buchholtz, paras 32 et seq.
97. Baxter (2016), p. 603.
98. On financial service regulation as a path-breaker see Fenwick et al. (2017).
99. A central AI agency therefore seems impractical; such an agency is suggested by Scherer (2016), pp. 395 et seq.
100. For a discussion of liability attribution under competition law see Hennemann, paras 31 et seq.
101. On the enforcement policies of the CFTC see Scopino (2015), pp. 279 et seq.
102. On agency through explanation see also Wischmeyer, paras 25 et seq., and Rademacher, para 33.
103. See also Wall (2018). Cf. Hermstrüwer, paras 65–69, for proposals on how AI might be used to improve administrative procedures in dealing with regulatory complexity.
104. On additional challenges see Packin (2018), pp. 211 et seq.
105. On the normative constraints of information regulation see Wischmeyer, paras 16 et seq.
106. See also Wischmeyer, para 22, in favor of a tailor-made approach.
107. Lin (2015b), pp. 508 et seq.
108. See para 18.
109. On the question as to whether consumers should also be informed about non-personalized results see Ernst, para 48.
110. See para 18, esp. note 53.
111. See the detailed account in Tischbirek, paras 3 et seq.
112. On the General Data Protection Regulation and AI see Marsch, paras 7–9.
113. On this see Guihot et al. (2017), pp. 431 et seq. On the responsibility of public actors and possible types of regulation see Hoffmann-Riem, paras 21 et seq., 58 et seq.
114. Lee et al. (2018).
115. On the perspective of existing idealized economic models see Parkes and Wellman (2015) (discussing the 'homo economicus').
116. Mutatis mutandis, i.e. discussing the promises of 'perfect law enforcement' via AI, see Rademacher, paras 39–42.
117. On the importance of system protection see Hoffmann-Riem, paras 29 et seq.
118. See the report of the German Parliament, Deutscher Bundestag (2009), p. 91 and passim.


Author information

Correspondence to Jakob Schemmel.



Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Schemmel, J. (2020). Artificial Intelligence and the Financial Markets: Business as Usual?. In: Wischmeyer, T., Rademacher, T. (eds) Regulating Artificial Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-32361-5_11


  • DOI: https://doi.org/10.1007/978-3-030-32361-5_11

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32360-8

  • Online ISBN: 978-3-030-32361-5

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
