Introduction

As the role of artificial intelligence (AI) expands in society, its associated risks and benefits are increasing in tandem, leading to growing interest in AI regulation. Data protection laws, along with other relevant legislation, have established a framework of rights and duties focused on the processing of personal data. However, because AI processes vast amounts of data in batches, it requires special rules, distinct from general data protection laws, in proportion to its increased risks (Wachter and Mittelstadt, 2019, pp. 497–498).

The General Data Protection Regulation (GDPR) of the European Union (EU) serves as a prominent international standard for data protection laws, and Article 22 of the regulation addresses the use of automated decision-making, including AI. Nonetheless, the interpretation and application of the provision remain unsettled. EU member states have not adopted a uniform interpretation of Article 22, and the Court of Justice of the European Union (CJEU) has yet to issue a direct ruling on it (Bygrave, 2020, p. 529). However, the Article 29 Working Party (WP29), the predecessor of the European Data Protection Board (EDPB), published guidelines on Article 22 (WP29, 2017, pp. 19–20), as mandated under Article 70(1)(f) of the GDPR. Although the WP29's guidelines are not legally binding, they carry substantial de facto influence and authority.

The Personal Information Protection Act (PIPA) of the Republic of Korea (Korea) was influenced by Article 22 of the GDPR, leading to the establishment, on March 14, 2023, of Article 37-2 with similar content. The details of the Article are presented in Table 1.

Table 1 Article 37-2 of the PIPA.

Similar to the GDPR, the PIPA serves as a comprehensive data protection law in Korea. However, several modifications have been introduced to the PIPA that allow for interpretations different from those of the GDPR. These modifications reflect the distinct nature of the PIPA: it is a national law governing a single country, whereas the GDPR governs all EU member states.

To ensure consistency in interpretation and application, the EU has provided guidelines through the Recitals of the GDPR and several official documents of the WP29 and EDPB. The Recitals serve as references for the judiciary’s interpretation of its provisions (Case 215/88 Casa Fleischhandels (1989) European Court of Justice ECR 2789, paragraph 31), whereas the documents of the WP29 and EDPB are widely recognised as highly authoritative sources.

However, in Korea, laws enacted by the National Assembly and its subordinate legislation serve as the primary legal foundation for interpretation and application. While certain details of the laws may be delegated through subordinate legislation, comprehensive delegations that undermine the hierarchy of the legal system are prohibited (Constitutional Court of Korea, 91Hun-Ka4, Jul 8, 1991). Although Korean administrative agencies have issued guidelines specifying the meaning of numerous laws, their influence on the judiciary remains uncertain (Footnote 1).

In the same vein, Articles 44-2, 44-3, and 44-4 of the Enforcement Decree present the details regarding Article 37-2 of the PIPA. In Korean laws, enforcement decrees often play a key role; therefore, when interpreting the PIPA’s provisions, it is essential to consider its enforcement decree and examine whether it holds sufficient validity. The details of the Enforcement Decree are presented in Table 2.

Table 2 Provisions of the Enforcement Decree of the PIPA related to Article 37-2.

However, in determining the meaning of Article 37-2 of the PIPA, the influence of the GDPR, which serves as an international standard for data protection laws, cannot be disregarded (Park and Kim, 2022, p. 367; Lee, 2024b, p. 29). The GDPR provides for substantial fines for the unlawful processing of personal data affecting citizens of EU member states and authorises cross-border transfers of personal data subject to EU adequacy decisions. As a result, establishing a GDPR-compliant data protection regime offers practical benefits even for non-EU states such as Korea (Park, 2023b, pp. 182–184; European Commission, 2021, pp. 1–2).

In the following sections, we examine the differences between Article 22 of the GDPR and Article 37-2 of the PIPA in three respects—format, target, and content. Through this analysis, we assess whether the PIPA provides a level of data protection equivalent to that of the GDPR and propose potential legislative and interpretive remedies to address the associated challenges. Before delving into this, we present the overall structure and revision history of the PIPA as background.

Summary of the PIPA amendments

General introduction to the PIPA

Personal information protection legislation in Korea has evolved significantly over time. Initially, only the processing of personal information by public institutions was regulated, by the Act on the Protection of Personal Information of Public Institutions enacted in 1994. Subsequently, additional laws were enacted to safeguard personal information in areas such as credit, information and communication, and other private domains. However, after a series of personal information leaks by private companies, there were calls to expand the existing personal information protection legislation (Jung, 2011, p. 408). Therefore, the PIPA, a comprehensive law for personal information protection in Korea, was enacted in 2011.

The enactment of the PIPA is closely related to the emergence of the concept of the right to informational self-determination in Korea (Kwon, 2016, pp. 674–677). The Constitution of Korea explicitly stipulates the protection of privacy (Article 17) but does not expressly provide for a right to control personal information. However, in 2005, the Constitutional Court of Korea officially recognised the right to informational self-determination as a constitutional right, grounded in the general right to personality, privacy, and other fundamental rights. According to the Court, the right to informational self-determination grants data subjects the authority to control their personal information, including deciding when, to whom, and to what extent it is disclosed and used (Constitutional Court of Korea, 99Hun-Ma513, May 26, 2005). This decision expanded the scope of rights related to personal information in Korea, moving beyond the passive concept of negative liberty to embrace the active concept of positive liberty.

The ideology of the right to informational self-determination is reflected in the purpose provision of the PIPA. Article 1 of the original version of the PIPA aimed to ‘protect the rights and interests of all citizens and further realise the dignity and value of each individual by protecting personal privacy, etc. from collection, leakage, misuse and abuse of individual information’, focusing mainly on privacy. By contrast, Article 1 of the current PIPA states its purpose as protecting ‘the freedom and rights of individuals’, which refers to the right to informational self-determination, empowering data subjects to directly control the collection and use of their personal information.

The PIPA establishes a comprehensive framework to uphold the right to informational self-determination based on the ‘notice-and-consent’ paradigm (Park, 2023a, p. 310). Notice-and-consent is a globally recognised legal principle aimed at substantively guaranteeing the right to informational self-determination (Cate, 2010, pp. 1768–1769; Cate and Mayer-Schönberger, 2013, p. 67; Mali, 2021, pp. 142–143). Under this principle, data subjects must be adequately provided with relevant information and must be able to exercise control over their personal information through ‘informed consent’. To this end, Articles 4(1) and (2) grant data subjects the right to be informed of the processing and the right to determine whether or not to consent to it and the scope of that consent, respectively. In addition, Articles 15 to 22 set out measures to ensure that informed consent operates effectively at each stage of personal information processing, including data collection, use, and provision. Although these provisions have been amended several times since the PIPA's enactment to give them substantive support, the Act's core objectives and framework have been maintained.

The PIPA underwent a major revision in 2020, incorporating elements of the GDPR, which had come into effect in 2018, and aligning it more closely with international standards (In, 2018, pp. 1–3; Lee, 2020, pp. 440–444; Kim et al., 2021, pp. 50–53). Concerning the legislative structure, the provisions on personal information protection, previously fragmented across the so-called ‘three data-related laws’—the PIPA, the Act on Promotion of Information and Communications Network Utilisation and Information Protection, and the Credit Information Use and Protection Act (CIA)—were consolidated into the PIPA. Regarding the scope of application, the PIPA established clearer criteria for assessing the combinability of so-called ‘personally identifiable information’ through the ‘likelihood that the other information can be procured’ (Article 2(1)(b)). As for content, the amendment introduced special provisions for the processing of pseudonymised information for statistical, scientific research, and archiving purposes in the public interest, similar to Article 89 of the GDPR. The 2020 revision thereby provided a legal basis for the active utilisation of big data in activities such as AI research and development.

Meanwhile, through the 2020 revision, a provision equivalent to Article 22 of the GDPR on automated decision-making was added as Article 36-2 of the CIA (Lee, 2020, p. 448; Park, 2021, pp. 45–47). This provision aims to secure active rights in response to ‘automated evaluation’ in the finance sector by establishing the rights to request an explanation, to submit information deemed advantageous, and to request correction, deletion, or re-calculation. It is noteworthy that Article 36-2(1) of the CIA legislated the right to request an explanation before the PIPA did, and specified in the statute itself the content that must be included in the explanation. Although Article 31-2 of the Enforcement Decree of the CIA elaborates on these elements, it is Article 36-2(1) of the CIA that specifies which details the Enforcement Decree is to address.

Latest amendment to the PIPA

The most recent amendment to the PIPA was introduced to address deficiencies in the operation of the 2020 revision (Ministry of Government Legislation of Korea, 2023). This amendment consolidated several bills proposed since the 2020 revision, integrating and resolving a broad spectrum of issues. Alongside Article 37-2, other major amendments include: (1) establishing a basis for assessing and recommending improvements in the level of personal information protection (Article 11-2); (2) setting operational standards for mobile visual data processing devices such as drones and autonomous vehicles (Article 25-2); (3) providing a legal basis for the right to request personal data transfers to promote the data economy (Article 35-2); and (4) implementing economic sanction-focused measures (Article 64-2).

Article 37-2 of the PIPA, discussed in this paper, was originally proposed in the government's bill (Footnote 2), which became the main axis of the adopted amendment. According to government briefings (Ministry of Culture, Sports, and Tourism of Korea, 2023) and press releases (Personal Information Protection Commission of Korea, 2023b), this provision was introduced to strengthen the active rights of citizens, enabling them to exercise their rights effectively and establishing a foundation for trust in personal information processing among citizens, businesses, and institutions.

There are other provisions with similar objectives in the amendment, such as: (1) additional exceptions to the consent principle (Articles 15(1)(4), (5), and (7)) and (2) the establishment of the personal information dispute resolution system (Articles 43, 45, and 45-2). However, none of these provisions relate to automated decisions; hence, they are not discussed in this paper. Apart from Article 37-2, the only provision that mentions automated decisions is the newly added Article 4(6). It enumerates the rights of data subjects ‘to refuse to accept a decision made through a fully automated processing of personal information or to request an explanation thereof’, which overlaps with Article 37-2.

Thus, this paper focuses exclusively on Article 37-2 of the PIPA as most recently amended. As there has been extensive discussion on the interpretation of Article 22 of the GDPR, this paper will not delve into the meaning of the PIPA provisions containing virtually identical phrasing (Footnote 3). Instead, this study addresses the issues and implications arising from the differences in wording and structure between the GDPR and the PIPA regarding automated decisions.

Comparison of the GDPR and PIPA

Format: Granting the right to object or imposing a prohibition

Differences between the GDPR and PIPA

Regarding the regulatory format, Article 22 of the GDPR and Article 37-2 of the PIPA appear to follow a similar structure. On their face, both grant data subjects ‘the right’ to object to automated decision-making. However, according to the WP29, Article 22(1) should be construed as a ‘general prohibition’ on decision-making based solely on automated processing, regardless of its wording (WP29, 2017, p. 19). Although some commentators interpret Article 22(1) literally, as granting data subjects a right to object to automated decision-making (Bygrave, 2020, pp. 531–532; Bygrave, 2021, pp. 96–98), the WP29's interpretation is widely recognised as authoritative.

The WP29 presents two major grounds for considering Article 22(1) a prohibition (WP29, 2017, pp. 34–35). First, it would be unreasonable for data subjects both to consent to and to object to the same processing, which creates a contradiction with the withdrawal of consent under Article 22(2)(c). The WP29 considers that interpreting Article 22(1) as imposing a prohibition preserves the significance of the withdrawal of consent under Article 22(2)(c). Second, if data subjects could simply object to automated decision-making, Article 22(3) could be circumvented, as such an objection generally entails human intervention. If exercising the right to object to automated decision-making and introducing human intervention can be broadly regarded as equivalent, there would be no reason to create exceptions through Article 22(2)(a) and (c).

Article 37-2 of the PIPA takes a different approach to the right to object to automated decision-making. Under the PIPA, decisions made by processing personal information through completely automated systems are permitted in principle, and data subjects may exercise their right to object if such decisions significantly affect their rights or duties. Article 37-2(1) specifies that completely automated systems encompass those involving AI applications, and Article 37-2(3) mandates that, if the right to object is exercised, personal information controllers must implement necessary measures. Although no authoritative judicial or administrative interpretations have been issued yet, in line with the formal structure of the provisions, Article 37-2 can be interpreted as granting a right rather than imposing a prohibition (Park and Kim, 2022, pp. 371–373; Lee, 2024b, p. 37).

Assessment of the PIPA regulation

To evaluate the validity of the format of Article 37-2 of the PIPA, which establishes a right rather than a prohibition, it is essential to apply the same reasoning that the WP29 uses to interpret Article 22(1) of the GDPR as a prohibition. The proviso to Article 37-2(1) of the PIPA stipulates that the right to object to a decision made by a completely automated system does not arise when the decision is made per Articles 15(1)(1), (2), or (4). These articles correspond to the three exceptions outlined under Article 22(2) of the GDPR, namely: (c) if the decision is based on the data subject’s explicit consent; (b) if the decision is authorised by the EU or member state law; or (a) if the decision is necessary for entering into, or performance of, a contract, respectively. The difference is that while the GDPR ties the requirement for exceptions to the decision, the PIPA ties it to the data collection on which the decision is based.

Given these differences, the PIPA avoids the two critical arguments of the WP29. First, unlike under the GDPR, consent under Article 15(1)(1) of the PIPA is for the personal information controller to ‘collect personal information and use it within the scope of the purpose of collection’, rather than for making subsequent decisions. Therefore, it is not contradictory for a data subject who has already consented to the collection and use of personal information within the scope of that purpose to object to a subsequent automated decision.

Second, Article 37-2(3) of the PIPA stipulates that ‘the personal information controller shall not apply the automated decision unless there is a compelling reason not to do so, or shall take necessary measures, such as re-processing through human involvement and providing explanations’, if a data subject has exercised their rights under Articles 37-2(1) or (2). The PIPA thus offers greater flexibility in implementing non-human involvement: instead of the GDPR's ‘suitable’ measures, it mandates ‘necessary’ measures, rendering the second criticism raised by the WP29 inapplicable. Both points support the argument that Article 37-2 should be interpreted, in line with its textual structure, as granting a right rather than imposing a prohibition.

As an exception, Article 37-2(1) of the PIPA does not apply where data subjects have given consent, potentially leaving a regulatory gap in safeguarding fundamental rights. There have been persistent criticisms of the notice-and-consent paradigm upon which the PIPA is founded, citing the difficulty of exercising self-determination over future information owing to a lack of information (Barocas and Nissenbaum, 2009, pp. 4–6; Sloan and Warner, 2013, pp. 15–21; Cate and Mayer-Schönberger, 2013, pp. 68–71). Furthermore, concerns have been raised in Korea that these challenges may reduce the PIPA's notice-and-consent paradigm to a formality (Kwon, 2016, pp. 703–710). According to recent statistics released by the Korean government, a broad spectrum of information is collected based on the prior consent of data subjects, and 62.2% of the public do not review the notices in detail, for reasons such as difficulty in understanding them (Personal Information Protection Commission of Korea, 2023a, pp. 49, 220–221). Therefore, if such formal consent becomes prevalent, the application of Article 37-2 will be limited at the point of collection, regardless of its regulatory format.

However, format differences can pose problems with automated decisions where consent cannot be obtained. For instance, in Korea, the collection of personal information through web scraping or crawling is permitted with some exceptions (Supreme Court of Korea, Decision of 17 August 2016, 2014Da235080). In the case of messenger conversations, personal information can be collected with the consent of only one party in the conversation (Personal Information Protection Commission of Korea, Deliberation and Decision No. 2021-007-072, 28 April 2021). If automated decisions are made based on personal information collected in this manner, it is difficult to obtain direct consent from data subjects even at the time of the decision. In this case, variations in the legal structure of a right and a prohibition may be a significant factor affecting the protection of data subjects.

There are also challenges to the substantive exercise of this right, particularly in the context of AI. The rapid and large-scale nature of AI makes it difficult for data subjects to object to automated decisions in a timely manner. In extreme cases, where data subjects are unaware of the existence of automated decision-making or the details of such decisions, the effective exercise of the right to object can hardly be expected (Lee, 2024a, p. 317). Furthermore, the current structure of the PIPA places the burden of proof for exercising the right to object on the data subject rather than on the personal information controller (Park and Kim, 2022, p. 376; Lee, 2024b, p. 32). The resources required for personal information controllers to ensure the effective exercise of the right to object may also burden society, and requiring controllers operating automated systems to employ human resources for this task would contradict the essence of automation. Taking these factors into account, it may be advantageous to establish a system of principled prohibition, akin to the GDPR, rather than granting data subjects the right to object.

In general, prohibition-based approaches risk imposing significant costs on personal information controllers and stifling innovation. However, Article 37-2(1) of the PIPA is significantly limited in scope by its strict requirements. The automated decisions covered are limited to (1) those where the level of automation is complete, (2) those that significantly affect a data subject's rights or duties, and (3) those made based on personal information collected without the consent of the data subject. In addition, Article 37-2(3) provides an exception under which the controller need not comply if there is a compelling reason not to do so. Therefore, the burden on the personal information controller may not be significant even under a prohibition-based approach. In conclusion, weighing the legal interests of data subjects and personal information controllers in the limited situations where the regulation applies, a prohibition-based approach may be a reasonable alternative in Korea.

Target: Completely automated system or solely based on automated processing

Differences between the GDPR and PIPA

While the PIPA and the GDPR broadly regulate similar targets, there are differences in their specifics. Article 37-2(1) of the PIPA introduces a new concept, the ‘completely automated system’, similar to the target of Article 22(1) of the GDPR, which regulates ‘decisions based solely on automated processing’, including profiling (Footnote 4). The provisions differ in that (1) their objects of regulation are a ‘system’ and a ‘decision’, respectively, and (2) ‘completely’ and ‘solely’ are used as modifiers of automation. This distinction matters when determining whether the regulation applies to decision-making processes carried out through complex procedures.

The concept of an ‘automated system’ was initially introduced in the General Act on Public Administration (GAPA) of Korea and then incorporated into the PIPA. In the legislative process, Article 20 of the GAPA was inspired by Article 35a of the German Federal Administrative Procedure Act (Verwaltungsverfahrensgesetz, VwVfG; Ministry of Government Legislation of Korea, 2021a, p. 12; 2021b, p. 80). The details of Article 35a of the VwVfG and Article 20 of the GAPA are presented in Table 3.

Table 3 Comparing the provisions of the VwVfG and the GAPA.

However, Article 35a of the VwVfG does not use the term ‘system’; rather, it refers to the ‘fully automated issuing of an administrative act’ (Vollständig automatisierter Erlass eines Verwaltungsaktes). An administrative act is a specific type of decision, distinct from the system that produces it. In contrast, Article 20 of the GAPA was drafted to address the system when the provision was introduced, and Article 37-2 of the PIPA was enacted in a similar manner to the GAPA. Hence, Article 22 of the GDPR targets individual decisions, while Article 20 of the GAPA and Article 37-2 of the PIPA target the systemic level.

The other difference concerns determining the extent of automation in processing, which has been a subject of contention in the interpretation of Article 22 of the GDPR. What does Article 22 of the GDPR mean by ‘based solely on automated processing’, and at what point is human intervention considered absent? If human intervention is merely a superficial procedure, does it fall within the purview of the provision? The WP29 characterises such circumvention as ‘fabricating human involvement’ and asserts that it cannot evade Article 22. According to the WP29, to avoid being classified as a decision based solely on automated processing, human involvement must entail ‘meaningful oversight’ and be ‘carried out by someone who has the authority and competence to change the decision’ (WP29, 2017, p. 21).

The PIPA does not explicitly set forth the same constraints as the GDPR. Instead, the Personal Information Protection Commission (PIPC) of Korea, the government agency responsible for personal information protection policy, has adopted the WP29's interpretation, taking the position that decisions made by completely automated systems under the PIPA are to be judged by whether there was ‘substantial and meaningful human intervention’ (Personal Information Protection Commission of Korea, 2024, p. 24). Thus, with respect to human intervention in the automation requirement, there is little practical difference between the GDPR and the PIPA.

However, the two differ in the conditions required to trigger these provisions. Article 37-2(1) of the PIPA requires that the decision ‘has a significant effect on his or her right or duty’ as a condition for the right to object. In contrast, Article 22(1) of the GDPR requires that the decision produce legal effects concerning the data subject or similarly significantly affect him or her. Considering that the scope of Article 37-2(1) is limited to cases that significantly affect legal rights or duties, there is a critical view that its protective scope is narrower (Lee, 2024b, pp. 37–38).

The PIPC has recently published an interpretation through official documents regarding this controversy (Personal Information Protection Commission of Korea, 2024, p. 27). According to the PIPC, the criteria to determine whether an automated decision ‘has a significant effect on his or her right or duty’ are: (1) whether it relates to the protection of a person’s life, bodily safety, and fundamental rights, (2) whether the rights of a data subject are deprived or the exercise of those rights becomes impossible, (3) whether a data subject incurs duties that are difficult to accept, (4) whether there are ongoing restrictions on the rights or duties of a data subject, and (5) whether there is a possibility to recover to the state before the relevant impact occurred or to avoid the impact. Hence, compared to the GDPR, the PIPA strictly limits the scope of decisions over which the right to object can be exercised.

Assessment of the PIPA regulation

Determining whether a system is completely automated and has a significant legal effect may seem simple. However, AI-based decision-making involves complex, multi-stage processes, and the overall decision-making system can be viewed from various perspectives, making it difficult to establish a single normative standard. The PIPA's choice of target can therefore present practical challenges for evaluating complex AI systems comprising multi-stage decision-making processes.

It is particularly difficult to determine the presence of substantial and meaningful human intervention in a ‘triaging’ system, in which complex decisions based on automated processing are made across multiple profiling stages. A representative example is an anomaly detection system that identifies high-risk groups from vast datasets and notifies human operators. One study highlights the complex nature of such triaging systems, where primary triaging is executed through automated upstream processing and human intervention is limited to specific downstream branches; it notes that, in such cases, a particular decision could be assessed as being based solely on automated processing under the GDPR (Binns and Veale, 2021, pp. 322–323). However, the legal assessment of a multi-stage profiling system with a similar structure might differ under the PIPA.

Today, several automated systems for classifying data subjects to allocate limited resources take the form of triaging systems. A prime example is the automated hiring process. In Korea, 62% of large companies with 1,000 or more employees use AI recruitment systems, and 20% rely solely on AI during the document screening stage (Human Resources Development Service of Korea, 2023, pp. 15–29). AI recruitment systems are actively used in the early stages of recruitment, such as first-round interviews, to classify numerous applicants. Hence, most AI recruitment systems currently in practice take the form of triaging systems, involving human intervention at some stage, as illustrated in the sketch below.
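To make the triaging structure concrete, the following minimal Python sketch models a hypothetical two-stage recruitment pipeline of the kind described above. All names, scores, and the threshold are illustrative assumptions rather than features of any actual recruitment system.

```python
from dataclasses import dataclass, field

SCREEN_THRESHOLD = 0.7  # assumed cut-off, for illustration only

@dataclass
class Applicant:
    applicant_id: str
    ai_screen_score: float              # produced upstream by an AI screener
    decision_trail: list = field(default_factory=list)

def ai_screening(applicant: Applicant) -> bool:
    """Stage 1: fully automated; rejected applicants never reach a human."""
    passed = applicant.ai_screen_score >= SCREEN_THRESHOLD
    applicant.decision_trail.append(("ai_screening", "pass" if passed else "reject"))
    return passed

def human_interview(applicant: Applicant, hire: bool) -> bool:
    """Stage 2: a human with authority to change the outcome decides."""
    applicant.decision_trail.append(("human_interview", "hire" if hire else "reject"))
    return hire

# An applicant rejected at stage 1 receives a decision without any human
# intervention, even though the system as a whole includes humans downstream.
a = Applicant("applicant-001", ai_screen_score=0.42)
if ai_screening(a):
    human_interview(a, hire=True)
print(a.decision_trail)  # [('ai_screening', 'reject')]
```

Viewed at the level of the individual decision, the rejected applicant's outcome is solely automated; viewed at the level of the system, human involvement exists downstream. This is precisely the ambiguity created by the ‘decision versus system’ wording.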

The challenge lies in the fact that, unless AI independently makes decisions at every stage of the process, such a system is unlikely to be deemed a completely automated system under the PIPA (Park, 2021, p. 46). If the criterion of a completely automated system in Article 37-2 were applied to these instances, most triaging systems would fall outside the scope of the provision because they incorporate human involvement in certain parts. Indeed, the current administrative interpretation of Article 20 of the GAPA, which parallels Article 37-2 of the PIPA, does not classify a partially automated system as a ‘completely automated’ one (Ministry of Government Legislation of Korea, 2021b, p. 80). This is a regulatory gap unique to the PIPA, not shared by the GDPR.

To address the issues inherent in the wording of the PIPA, the PIPC has issued guidance assessing whether human intervention has occurred in the context of individual decisions rather than from a systemic perspective (Personal Information Protection Commission of Korea, 2024, pp. 24–25). For example, the guidance states that if an applicant goes only through the AI interview stage of the recruitment process and is rejected, the decision is assessed as having been made by a completely automated system, regardless of whether there was human intervention in downstream decisions.

This conclusion follows from the provisions of the GDPR, but not from those of the PIPA, which specifies an automated system, rather than a decision, as its target. Under the PIPC's guidance, for an AI interview decision to be evaluated as a decision by a completely automated system under the PIPA, the AI interview stage must be treated as an automated system separate from the other parts of the recruitment process. However, AI interview decisions are only intermediate steps within the recruitment process, making them difficult to evaluate as an independent system. If each decision-making step in a multi-stage profiling system were considered an independent ‘subsystem’, it would lead to the unrealistic conclusion that legal intervention can be recognised for every intermediate decision.

Matters are further complicated when considering whether a decision has significant legal implications. Under the PIPC's guidance, the intermediate decisions of the AI interview would be isolated to determine whether they affect any rights or duties. Some may assess that such a decision has a significant effect on legal rights or duties; others may conclude that, although significant, its effect is merely de facto rather than legal. If an intermediate decision in a particular AI interview is assessed to have a significant de facto effect in its specific context, the GDPR may offer protection, whereas the PIPA would not.

To sum up, the regulatory gap arising from the difference in wording between Article 22 of the GDPR and Article 37-2 of the PIPA is unreasonable, as the two laws demand no different degree of fundamental rights protection. Furthermore, to the best of our knowledge, there is no basis to believe that the gap reflects legislative intent. In this context, a commentary published by a national agency in Korea acknowledges that there is room for improvement in Article 20 of the GAPA (Ministry of Government Legislation of Korea, 2021a, p. 211).

Therefore, given the prevalence of automated systems, it is necessary to introduce revised legislation or an interpretation of the target of Article 37-2 that aligns with the PIPA's purpose. Amending or interpreting the PIPA in this manner has the additional advantage of aligning the PIPA with its Enforcement Decree. The Enforcement Decree, without referring to human involvement, requires personal information controllers to (1) disclose the procedures for processing the main personal information in advance (Article 44-4(1)(3)) and (2) provide an explanation, including the procedures in which automated decisions are made, upon request from data subjects (Article 44-3(2)(4)). For an automated triaging system involving partial human intervention, fundamental rights can be protected by establishing a dual system in which data subjects identify human involvement through the latter (the ex post explanation) and raise an objection through the former (the ex ante disclosure). With these modifications, the regulatory gap in the PIPA concerning triaging systems can be addressed without escalating regulatory costs.

Content: Establishing the right to explanation

Differences between the GDPR and PIPA

The recently revised PIPA introduces provisions for the right to object (Article 37-2(1)) and the right to explanation (Article 37-2(2)). Throughout the legislative process, no objections were raised regarding the granting of the right to object under Article 37-2(1), except for concerns about potential conflicts with Article 20 of the GAPA. Thus, Article 37-2(1) was finally enacted with the phrase ‘excluding automatic disposition made by an administrative authority pursuant to Article 20 of the General Act on Public Administration’ in parentheses (Footnote 5).

However, there is still considerable controversy over the meaning of Article 37-2(2), from the legislative process to the present. One of the main challenges is that the GDPR, on which the PIPA is modelled, does not stipulate a ‘right to explanation’ for automated decisions in its articles. Article 22 of the GDPR does not specify such a right; only Recital 71 identifies ‘the right to obtain an explanation of the decision’ as one of the suitable safeguards for decisions based solely on automated processing. In response, endeavours have been made to establish a right to explanation at a ‘local’ and ‘ex post’ level, targeting individual decisions, based on the aforementioned and other provisions of the GDPR, such as the notification duties (Articles 13(2)(f) and 14(2)(g)) and the right of access (Article 15(1)(h)) (Goodman and Flaxman, 2017, p. 55; Selbst and Powles, 2017, pp. 235–237).

However, as this argument aligns with neither the GDPR's framework nor the legislator's intention, a compelling counterargument holds that only a right to be informed, rather than a right to explanation, can be derived from these provisions (Wachter et al., 2017). According to this counterargument, the explanation contemplated by the GDPR is limited to an ‘ex ante’ one, preceding a specific decision. Furthermore, as no decision exists at that juncture, the explanation must necessarily be a ‘global’ one, encompassing the functionality of the entire system rather than a specific decision.

Unlike the GDPR, the PIPA explicitly enshrines ‘the right to explanation’ in its provisions. However, Article 37-2(2) of the PIPA does not specify the contents of the explanation and delegates the matter to a presidential decree as per Article 37-2(5). Accordingly, the Enforcement Decree of the PIPA outlines the details to be included in the explanation, as stated in Article 44-3(2) and each subparagraph of the provision.

First, the explanation must be provided concisely, meaningfully, and in an easily understandable manner for the data subject (Personal Information Protection Commission of Korea, 2024, p. 26). This aligns with the requirements outlined in the GDPR, namely providing (1) information in an ‘easily visible, intelligible and clearly legible manner’ under Articles 12(7) and 15(1)(h) and (2) ‘meaningful information about the logic involved’ under Articles 13(2)(f) and 14(2)(g). Judging by this element alone, the right to explanation in Article 37-2(2) of the PIPA appears to have the characteristics of an ex ante, global explanation.

Second, the elements of explanation required by the Enforcement Decree of the PIPA are (1) the result of the relevant automated decision, (2) the types of major personal information used for the decision, (3) the major criteria for automated decisions, and (4) the procedures by which automated decisions are made. These correspond to local information available only after a specific automated decision is made. In contrast, the information required by Articles 13(2)(f), 14(2)(g), and 15(1)(h) of the GDPR is ex ante and global, and can be provided at the point of collecting personal data. Hence, the right to explanation in Article 37-2(2) of the PIPA should be understood as having an ex post, local nature, which is reinforced by the presence of a distinct provision for an ex ante, global duty of information disclosure in Article 37-2(4).

Third, Article 37-2(2) of the PIPA does not require the decision to ‘have a significant effect on his or her right or duty’, as stipulated in Article 37-2(1), which broadens the scope of protection by relaxing the requirements compared to Article 22 of the GDPR (Lee, 2024b, p. 46). If the wording of the clause is strictly applied, even trivial applications, such as facial photo correction using AI, may be considered an automated decision based on personal information processing, invoking the right to explanation. Given the comprehensive nature of the concept of ‘automated decision’, significant regulatory costs may arise in modern societies where AI is widely used. To address this issue, the Enforcement Decree of the PIPA stipulates that for automated decisions that do not have a significant effect on data subjects’ rights or duties, explanations can be replaced with information by fulfilling the disclosure duty (Articles 44-3(2) and 44-4(1)).
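Read together, Articles 44-3(2) and 44-4(1) of the Enforcement Decree thus define a simple dispatch rule. The following Python fragment is a minimal sketch of our reading of that rule, not an official codification; the function name and inputs are hypothetical.

```python
# A minimal sketch (our reading, not official guidance) of the dispatch rule
# in Articles 44-3(2) and 44-4(1) of the Enforcement Decree of the PIPA.
def respond_to_explanation_request(has_significant_effect: bool) -> str:
    if has_significant_effect:
        # Article 44-3(2): a concise, meaningful, easily understandable
        # explanation of the result, major data types, criteria, and procedure.
        return "provide individual explanation"
    # Otherwise, the duty may be met by referring the data subject to the
    # information already disclosed about the automated system (Article 44-4(1)).
    return "refer to general disclosure"

print(respond_to_explanation_request(True))   # provide individual explanation
print(respond_to_explanation_request(False))  # refer to general disclosure
```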

Assessment of the PIPA regulation

For the right to explanation to be effective, the information to be provided by personal information controllers, as outlined in Article 44-3(2) of the Enforcement Decree of the PIPA, must be clearly defined. Among these elements, the most difficult to legislate are the types of major personal information and the major criteria for automated decisions. The result is relatively straightforward, as it can be conveyed directly from the automated decision, and the procedures can be described briefly, as they are common across various decisions. However, the types of major personal information and the major criteria are generally more complicated, because automated systems, including AI, may be opaque as to which attributes they treat as ‘major’.

This makes it instructive to examine Article 36-2(1) of the CIA, which introduced provisions similar to the PIPA's right to explanation earlier. The CIA defines the elements that must be explained as the results, the major criteria, and an outline of the underlying information of the automated evaluation, together with other similar matters prescribed by Presidential Decree. In addition, specific examples of what must be included in the explanation are set out in Article 31-2 of the Enforcement Decree of the CIA. The first three elements (results, major criteria, and outline of the underlying information) specified in the CIA overlap with those in the Enforcement Decree of the PIPA. Hence, the established practice of seeking explanations for current credit ratings and downgrades, methods for improving credit ratings, and variations in ratings between Credit Bureaus (National Assembly Research Service of Korea, 2022, p. 3) could significantly influence the interpretation of the right to explanation in the PIPA.

However, from 2018 to the first half of 2022, the right to explanation was exercised only 36,224 times against Credit Bureaus, less than 1% of the 3,809,069 individual credit score inquiries made during the same period (National Assembly Research Service of Korea, 2022, pp. 3–4). This is surprisingly low considering the significant effect that personal credit ratings can have on credit data subjects. For most automated decisions, which are more trivial and routine than personal credit ratings, the right to explanation is likely to be exercised even less frequently. Hence, in terms of regulatory costs, relatively simple standards that reserve a portion of normative judgement for the application stage are appropriate (Kaplow, 1992, pp. 571–577).

Nevertheless, it is crucial not to overlook the fact that the CIA applies only to personal credit ratings. The basis of personal credit ratings, represented by FICO scores, has evolved from simple traditional tools to complex systems utilising big data. However, their essence remains that of classification models evaluating an individual's creditworthiness or loan repayment ability (Hurley and Adebayo, 2016, pp. 162–165). Given that credit evaluation is a straightforward classification task, the automated systems used for this purpose are comparatively clear and transparent. In the context of credit ratings, even when AI is applied, model-agnostic explanations can often provide sufficient information, and there is significant potential for using inherently interpretable models (Demajo et al., 2020, pp. 186–188), as the sketch below illustrates.
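To illustrate why the CIA's explanation duties are tractable in this setting, the following sketch shows how the required elements (result, major criteria, outline of underlying information) can be read directly off an inherently interpretable scoring model. The logistic model, feature names, and weights are invented for illustration and do not represent any actual Credit Bureau model.

```python
import math

WEIGHTS = {                      # assumed, illustrative coefficients
    "payment_history": 2.0,
    "debt_ratio": -1.5,
    "account_age_years": 0.4,
}
BIAS = -0.3

def score(features: dict) -> float:
    """Probability-of-repayment score from a simple logistic model."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1 / (1 + math.exp(-z))

def explain(features: dict) -> dict:
    """Build the three overlapping CIA/PIPA explanation elements."""
    contributions = {k: WEIGHTS[k] * v for k, v in features.items()}
    ranked = sorted(contributions, key=lambda k: abs(contributions[k]), reverse=True)
    return {
        "result": round(score(features), 3),       # result of the decision
        "major_criteria": ranked[:2],              # largest contributions
        "underlying_information": list(features),  # outline of data used
    }

print(explain({"payment_history": 0.9, "debt_ratio": 0.6, "account_age_years": 5.0}))
```

Because every coefficient is visible, identifying the ‘major’ criteria is a matter of sorting contributions; no additional model queries are needed.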

In contrast, Article 37-2 of the PIPA is a general provision not limited to specific fields, and the deep learning-based AI now used across a wide range of fields is mostly built on black-box models whose internal workings cannot be easily understood (Adadi and Berrada, 2018, p. 52141). Such AI systems typically consist of numerous layers and parameters interacting through complex mathematical operations to reach a decision. Unlike a simple automated system, a complex, black-box AI may be unable to provide clear explanations of the factors contributing to a specific decision. For systems such as large language models, which employ AI models with hundreds of billions of parameters, extracting the ‘major’ information may require significant additional computation. Given these technical characteristics, regulating the content of an explanation to require relatively complicated ex post, local information could pose challenges in practical implementation, as the following sketch indicates.
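When the model is opaque, even a local, ex post explanation must be reconstructed from the outside. The fragment below sketches a naive perturbation-based (leave-one-feature-out) attribution, loosely in the spirit of model-agnostic methods such as LIME or SHAP; the stand-in black_box function and the baseline value are assumptions for illustration only.

```python
def black_box(features: dict) -> float:
    """Stand-in for an opaque model that can only be queried."""
    return (0.6 * features["a"]
            + 0.3 * features["a"] * features["b"]
            - 0.2 * features["c"])

def local_explanation(model, features: dict, baseline: float = 0.0) -> dict:
    """Attribute the output change when each feature is zeroed out in turn."""
    base_output = model(features)
    importance = {}
    for name in features:                # one extra model call per feature
        perturbed = dict(features, **{name: baseline})
        importance[name] = base_output - model(perturbed)
    return importance

x = {"a": 1.0, "b": 0.5, "c": 2.0}
print(local_explanation(black_box, x))  # approx. {'a': 0.75, 'b': 0.15, 'c': -0.4}
```

For a model with hundreds of billions of parameters, each such query is expensive, and principled attribution methods require far more than one call per feature, which is the practical burden noted above.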

For these standards to serve as appropriate guidance, it is essential that the objectives of each provision be clear to data subjects. The PIPA's provisions and its subordinate legislation do not sufficiently specify such objectives, leading to confusion. One of the primary purposes of the PIPA is to implement the right to informational self-determination; however, merely requiring explanations for automated decisions does not, by itself, enable data subjects to exercise full control. For data subjects to exercise their right to informational self-determination vis-à-vis automated decisions, they must be able to take subsequent measures based on the explanations provided: adjusting their circumstances to achieve the desired outcomes, directly raising objections with personal information controllers, exercising the right to object under Article 37-2(1), and/or pursuing other legal actions. By exercising the right to explanation, data subjects should be able to independently assess the validity of automated decisions and decide on appropriate measures if they consider the decisions invalid.

Furthermore, Article 37-2(2) of the PIPA broadens the scope of explanation by not limiting it to decisions that significantly affect a data subject; however, it does not specify the follow-up measures available to the data subject. This creates uncertainty about the purpose of the provision, as it offers no guidance on what makes an explanation sufficient. To address this uncertainty, Article 44-3(3) of the Enforcement Decree recognises the right to contest as a specific instance of ‘(explanation,) etc.’ in Article 37-2(2) of the PIPA. The right to explanation must therefore ensure that a data subject has enough information to exercise the right to contest. At the same time, however, the Enforcement Decree allows the explanation to be substituted with disclosed information under Articles 44-4(1)(2) and (3) where the decision does not significantly affect the data subject's rights or duties (Article 44-3(2)). Considering that the information disclosed under the duty in Article 44-4(1) is unlikely to suffice for exercising the right to contest, the exception in Article 44-3(2) of the Enforcement Decree is inappropriate.

To resolve this confusion, it should be clarified that the right to explanation in the PIPA serves to provide the information needed for subsequent measures, such as objecting to or contesting the automated decision. In this context, it is helpful to examine the disclosure requirements of the GDPR and the WP29. The information subject to disclosure under Articles 13(2)(f) and 14(2)(g) of the GDPR, corresponding to Article 37-2(4) of the PIPA, can be categorised as: (1) the existence of automated decision-making, (2) meaningful information about the logic involved, and (3) the significance and envisaged consequences of such processing. The WP29 supplements these two articles by specifying that the information to be provided in advance includes the main characteristics considered in reaching the decision, the source of this information and its relevance, the significance and envisaged consequences of the processing, and tips on how to improve the result (WP29, 2017, pp. 25–26).

In the future, as the right to explanation comes to be exercised more actively in specific sectors, the government may be able to provide guidelines based on accumulated precedents, facilitating the enactment of regulations tailored to those sectors. Such sector-specific regulations could include technical provisions that reflect the characteristics of the automated systems used in each field. For instance, in recent years there have been numerous efforts to address AI's opacity through the development of explainable AI (XAI), including projects led by the US Defense Advanced Research Projects Agency (Adadi and Berrada, 2018, pp. 52139–52142; Gunning et al., 2021, pp. 1–2). However, owing to the normative nature of law, which relies primarily on legal rather than technical language, there are various limitations in incorporating specific technologies into legal regulations (Xiang, 2021, p. 656). Therefore, further research is needed to bridge the gap between XAI and legal regulations.

Concluding remarks

In summary, the current PIPA and its subordinate legislation offer, in certain respects, less protection for fundamental rights than the GDPR. First, regarding format, by granting a right to object rather than establishing a general prohibition on automated decisions, the PIPA is limited in protecting individuals who cannot effectively exercise their rights for various reasons. Second, regarding the target, by assessing complete automation at the level of the overall system, the PIPA creates a regulatory vacuum for multi-stage profiling systems. Third, regarding content, several technical and practical limitations remain in specifying the content of the right to explanation. One might object that these concerns are excessive and that addressing them would hinder innovation in AI technology and industry; however, the risk that declining trustworthiness of AI will reduce its social acceptance cannot be overlooked. Thus, keeping these considerations in mind when updating the Presidential Decree and guidelines, the PIPA should harmonise the protection of fundamental rights with the use of personal information.

This study argues that the PIPA should provide fundamental rights protection comparable to the GDPR. The PIPA's limitations, arising from amendments intended to expand the usability of personal information, can be partially remedied through the widely recognised right to explanation. However, because the Presidential Decree narrows the scope of the right to explanation, it cannot currently perform this role effectively. Depending on the individual issue, enacting legislation, updating presidential decrees and guidelines, or developing interpretive doctrine through case law may be the best solution.

However, a solution need not be found within the scope of the PIPA alone. Just as the CIA plays a unique role in credit ratings, special legal provisions optimised for particular areas of automated decision-making could be introduced. Addressing the problem through AI legislation could be one such solution. The EU AI Act complements the GDPR on automated decisions and profiling; for example, it categorises any AI system used by or on behalf of law enforcement authorities to profile natural persons as a high-risk AI system, imposing more stringent regulations than the GDPR. In addition, various proposals have been put forth to effectively safeguard fundamental rights regarding automated decisions based on the relationship between the two laws (Joo, 2023). In Korea, several bills aimed at regulating AI have been proposed (Ahn, 2023; Hwang, 2023); however, it may take time to reach a concrete agreement comparable to the EU's. Given that the PIPA explicitly targets systems utilising AI in its legal framework, coordination with AI legislation is expected to be essential going forward.