Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling)

Alessandra Silveira (Editor)

1) What is new about this process? Article 22 GDPR is finally being considered before the European Court of Justice (ECJ) – and on 16 March 2023, the Advocate General’s Opinion in Case C-634/21 [SCHUFA Holding and Others (Scoring)][1] was published. Article 22 GDPR (apparently) provides a general prohibition of individual decisions based “solely” on automated processing – including profiling – but its provisions raise many doubts in legal doctrine.[2] Furthermore, Article 22 GDPR is limited to automated decisions that i) produce effects in the legal sphere of the data subject or ii) significantly affect him/her in a similar manner. The content of the latter provision is not entirely clear but, as the Data Protection Working Party (WP29) suggested, “similar effect” can be interpreted as significantly affecting the circumstances, behaviour or choices of data subjects – for example, decisions affecting a person’s financial situation, including their eligibility for credit.[3] To this extent, the effectiveness of Article 22 GDPR may remain very limited until EU case law clarifies i) what a decision taken solely on the basis of automated processing is, and ii) to what extent such a decision produces legal effects or significantly affects the data subject in a similar manner.

2) Why is this case law so relevant? Profiling is a form of automated processing often used to make predictions about individuals – and it may, or may not, lead to automated decisions within the meaning of Article 22(1) GDPR. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw inferences or predictions from it – whether about their ability to perform a task, their interests, or their presumed behaviour, etc. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of concepts, characteristics, or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others. The ECJ has the opportunity to assess the existence of legal remedies to challenge operations which result in automated inferences that are not reasonably justified. As set out below, the approach adopted by the Advocate General has weaknesses – and if the ECJ adopts the conditions suggested by the Advocate General, many reasonable interpretative doubts about Article 22 GDPR will persist.

3) What questions does Article 22 GDPR raise? Does this Article provide for a right or, rather, a general prohibition whose application does not require the party concerned to actively invoke a right? What is a decision based “solely” on automated processing (which apparently excludes “largely” or “partially” automated decisions, but not “exclusively” automated ones)? Will the provisions of Article 22 GDPR only apply where there is no relevant human intervention in the decision-making process? If a human being examines and weighs other factors when making the final decision, will it not be made “solely” on the basis of the automated processing [and, in that situation, will the prohibition in Article 22(1) GDPR not apply]?

4) Why is the distinction between a right and a general prohibition relevant here? As a general prohibition, it would apply regardless of whether or not the data subject takes an action relating to the processing of their data. The interpretation of Article 22(1) GDPR as a general prohibition, rather than a right that can be invoked by the data subject, means that individuals are automatically protected from the possible effects of such processing.

5) What is the importance of relevant human intervention in the decision-making process? Article 22(1) GDPR (apparently) implies a general prohibition; however, there are exceptions to it. According to Article 22(2) GDPR, the prohibition does not apply if the decision i) is necessary for the conclusion or performance of a contract between the data subject and a controller; ii) is authorized by EU or Member State law to which the controller is subject; or iii) is based on the data subject’s explicit consent. In any case, in situations where the general prohibition does not apply, the data controller must take appropriate measures to safeguard the rights and interests of the data subject [Article 22(3) GDPR], in particular the right to i) obtain human intervention on the part of the controller, ii) express his/her point of view, and iii) challenge the decision. According to Recital 71 GDPR – which serves as an interpretative benchmark for the provisions of the GDPR – the suitable safeguards should include the right to obtain “an explanation of the decision reached after such assessment and to challenge the decision.” To this extent, any review of the automated inference must be carried out by someone with the appropriate authority and competence to understand and change the result. The aim of the legislature is (and can only be) to prevent decision-making from taking place without individual assessment and evaluation by a human being.[4]

6) What prompted this request for a preliminary ruling? Case C-634/21 concerns proceedings between a citizen (the applicant OQ, the data subject) and the Land Hessen (Germany), represented by the Hesse Commissioner for Data Protection and Freedom of Information (HBDI), regarding the protection of personal data; SCHUFA Holding AG (SCHUFA) – a private German credit information agency that provides its contractual partners with information on the creditworthiness of consumers using mathematical-statistical methods – is a joined party to the proceedings. SCHUFA provided a financial entity with a credit score for the applicant OQ, which served as the basis for refusing to grant the credit for which she had applied. The applicant requested that SCHUFA provide her with information regarding the data stored and, moreover, erase what she considered to be incorrect entries; she argued that SCHUFA was obliged to provide information about the logic involved, as well as the significance and consequences of the processing. However, SCHUFA merely informed her of the relevant score and, in general terms, of the principles underlying the calculation method, without informing her of the specific data included in that calculation or of the weight attributed to them in that context, arguing that the calculation method was a trade secret.

7) What doubts led the national court to refer questions to the ECJ? According to the referring court, it is ultimately the credit score established by credit information agencies that actually decides whether and how a financial entity/bank enters into a contract with the data subject. The referring court assumes that the establishment of a score by a credit information agency does not merely serve to pave the way for that bank’s decision, but constitutes an independent “decision” within the meaning of Article 22(1) GDPR. Thus, the referring court finds that Article 22(1) GDPR, by providing that a data subject has the right “not to be subject to a decision based solely on automated processing, including profiling”, establishes a causal link and a chronologically fixed sequence between the automated processing (including profiling) and the decision based on it.[5]

8) What will the ECJ have to clarify, from the Advocate General’s perspective? Advocate General Priit Pikamäe considers that the ECJ is called upon to rule i) on the restrictions that the GDPR imposes on the economic activity of credit information agencies (in particular in data management) and ii) on the effect to be given to trade secrets. Similarly, the ECJ will have to clarify the scope of the regulatory powers that certain provisions of the GDPR bestow on the national legislature, namely the exception to the prohibition in Article 22(2)(b) GDPR – according to which the prohibition does not apply if the decision is authorized by EU or Member State law to which the controller is subject. This is relevant because, if Article 22(1) GDPR were interpreted as meaning that the establishment of a credit score is an independent decision within the meaning of that provision, that activity would be subject to the prohibition of automated individual decisions – and, consequently, would require a legal basis under Member State law within the meaning of Article 22(2)(b) GDPR.

9) What does the Advocate General essentially suggest? The Advocate General concluded that the automated establishment of a probability value concerning the data subject’s ability to service a loan (profiling) already constitutes a decision based solely on automated processing. But it will only be so where that value (in this case, a credit score), established by means of personal data concerning the data subject (here, the applicant OQ), is transmitted by the controller (here, the company SCHUFA) to a third-party controller (here, a financial entity) and the latter, in accordance with consistent practice, draws strongly on that value for its decision (on the establishment, implementation or termination of a contractual relationship with the data subject). The expressions “consistent practice” and “draws strongly” signal some of the Advocate General’s difficulties in interpreting Article 22 GDPR which, if reproduced in the forthcoming ECJ decision, could largely undermine the impact of this case law.

10) Why is that? Because profiling always includes inferences and predictions about the individual, regardless of whether automated individual decisions based on profiling follow. The protection of individuals from automated inferences or predictions is still under development in the EU – and the “conditions” suggested by the Advocate General might undermine that evolution. According to the WP29, the GDPR introduces provisions intended to ensure that profiling and automated decisions (whether or not they include profiling) do not produce an unjustified impact on individuals’ rights. In this sense, for the WP29, the GDPR applies not only to data collection but also to the application of profiling to individuals. It follows that profiling involves three distinct phases: i) data collection; ii) automated analysis to identify correlations; and iii) applying the correlations to an individual to identify present or future behavioural characteristics (see the illustrative sketch below). Where automated individual decisions based on profiling do occur, these are also subject to the GDPR – whether exclusively automated or not. That is, for the WP29, profiling is not limited to the mere categorization of the individual, but also includes inferences and predictions about the individual. In any case, according to the WP29, automated decisions can be based on any kind of data, be it i) data provided directly by the individual (such as answers to a questionnaire concerning name, postal address, e-mail address, etc.); ii) observed data about the individual (such as location data collected through an application accessed by the individual); or iii) data obtained or inferred (such as a profile of the individual created for the purposes of a credit score).[6]
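
To make the WP29’s three phases concrete, the following is a minimal, purely illustrative sketch in Python (using scikit-learn). The feature names, the toy data and the model are all hypothetical assumptions for illustration; nothing here is drawn from the SCHUFA case or from any real scoring method.

```python
# A purely illustrative sketch of the WP29's three phases of profiling.
# All data, feature names and labels are invented.
from sklearn.linear_model import LogisticRegression

# Phase (i): data collection -- provided, observed or previously
# inferred attributes of past individuals, plus a known outcome.
X = [
    [32, 2100, 1],   # [age, monthly_income_eur, prior_defaults]
    [45, 3800, 0],
    [23, 1500, 2],
    [51, 4200, 0],
    [37, 2600, 1],
    [29, 1900, 2],
]
y = [0, 1, 0, 1, 1, 0]  # 1 = repaid past credit, 0 = defaulted (toy labels)

# Phase (ii): automated analysis -- the model identifies correlations
# between the recorded attributes and the outcome.
model = LogisticRegression().fit(X, y)

# Phase (iii): application to an individual -- the learned correlations
# are used to infer a characteristic (here, a probability of repayment)
# about a person the system has never seen before.
applicant = [[28, 2000, 1]]
score = model.predict_proba(applicant)[0][1]
print(f"Inferred probability of repayment: {score:.2f}")
```

The point of the sketch is that the inference in phase (iii) is itself newly created personal data about the applicant, produced without the applicant providing it.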

11) Why is this relevant for the protection of fundamental rights? Data must be processed fairly for specified purposes [Article 8(2) CFREU]. However, contrary to these principles, profiling tends to involve the use of personal data collected for other purposes. Profiling thus risks being abusive and leading to discrimination, preventing individuals from accessing employment, credit, or insurance opportunities, or offering them financial products with excessive risks or costs. The WP29 gives an example of a situation potentially in breach of the GDPR: a data broker sells consumer profiles to financial entities without the consumers’ consent or knowledge of the underlying data. The profiles define consumers through categories (for example, “small town resident with ethnic difficulties”, “rough start in life: young single parents”, etc.) or assign them scores based on the consumers’ financial vulnerability. The financial entity then offers these consumers high-cost loans and other financially risky products that make it impossible for them to access the resources they need. This is where the perplexity lies: would it be possible to adequately challenge such an automated operation on the basis of the GDPR?

12) How sound is the Advocate General’s Opinion? It seems to recognise that a restrictive interpretation of Article 22 GDPR would generate a gap in legal protection, because it would release SCHUFA from the obligation to provide the information required under Article 15(1)(h) GDPR – given that, allegedly, that company would not have adopted its own “automated decision” within the meaning of the GDPR. On the other hand, the financial entity which “formally adopted the decision” on the basis of the automated score would not be able to provide that information, because it does not have it. Thus, making the credit information agency liable by virtue of the generation of the score – and not by virtue of its subsequent use – would be the most effective way of ensuring the protection of the data subject’s fundamental rights.

13) What obligations would SCHUFA have towards the data subject? Given that SCHUFA refused to disclose to the applicant information relating to the calculation method, on the ground that it constitutes a trade secret, the Advocate General considered it appropriate to clarify the scope of the right of access referred to in Article 15(1)(h) GDPR, in particular as regards the existence of automated decisions – which entails the obligation to provide meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. This includes sufficiently detailed explanations about the method used to calculate the score and the reasons for a given result – i.e., the factors taken into account in the decision-making process and their respective weight at an aggregate level (a sketch of what such aggregate-level information might look like follows below) – which enables the data subject to challenge any decision within the meaning of Article 22 GDPR. Recital 63 GDPR states that the right of access should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property – in particular the copyright protecting the software. However, the Advocate General warns that controllers cannot invoke the protection of their trade secrets as a pretext to deny access or refuse to provide information to the data subject. Additionally, legal doctrine has argued that the high levels of precision of data mining and machine learning techniques, for example, have nothing to do with the software, because it is the raw data, and not the software, that drives the operation.[7]
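
As a purely hypothetical illustration of what “factors and their respective weight at an aggregate level” could look like for a simple linear scoring model, the following Python sketch reuses invented toy data of the kind shown earlier. Nothing here reflects SCHUFA’s actual method, which is not public; it only shows that such a disclosure is technically possible without handing over the raw data or the full calculation.

```python
# Hypothetical illustration: surfacing the factors behind a linear
# scoring model and their relative weight at an aggregate level.
# Feature names and data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

feature_names = ["age", "monthly_income_eur", "prior_defaults"]
X = np.array([[32, 2100, 1], [45, 3800, 0], [23, 1500, 2],
              [51, 4200, 0], [37, 2600, 1], [29, 1900, 2]], dtype=float)
y = np.array([0, 1, 0, 1, 1, 0])  # 1 = repaid, 0 = defaulted (toy labels)

# Standardising the features makes the coefficients comparable,
# so they can honestly be presented as relative weights.
X_std = StandardScaler().fit_transform(X)
model = LogisticRegression().fit(X_std, y)

# Aggregate-level disclosure: each factor, the direction of its
# influence, and its relative strength -- without revealing the
# training data or the complete calculation method.
for name, coef in zip(feature_names, model.coef_[0]):
    direction = "raises" if coef > 0 else "lowers"
    print(f"{name}: {direction} the score (relative weight {abs(coef):.2f})")
```

For a non-linear or self-learning model the equivalent disclosure would be harder to produce, which is precisely why the trade-secret objection and the explainability obligation collide in this case.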

14) Where does the major legal difficulty in this case lie? It lies in the weaknesses regarding the protection of inferred data. Despite the praiseworthy exegetical effort carried out by the WP29 – according to which the GDPR applies to all profiling and automated individual decisions, whether they are based on provided data, observed data, or inferred data – the effective application of the GDPR to inferred data faces several obstacles. This has to do with the fact that the GDPR was designed for data provided directly by the data subject – and not for data inferred by digital technologies such as AI systems. This is the difficulty behind the Advocate General’s Opinion. The definition of “profiling” in the GDPR [Article 4(4)] refers to “any form of automated processing” – and not “solely” automated processing, as embodied in Article 22 GDPR. It follows that the issue of profiling is not exhausted in Article 22 GDPR, as this provision only concerns exclusively automated decisions, i.e., decisions adopted through technological means without relevant human intervention. Thus, there may be decisions based on profiling that are not exclusively automated. Moreover, automated decisions may be taken with or without profiling, and profiling may occur without automated decisions. So there is profiling which does not lead to an exclusively automated decision disputable under Article 22 GDPR – and which deserves equal protection. However, the legislature clearly intended not to regulate the admissibility of profiling under data protection law independently via Article 22(1) GDPR, but only to address it in conjunction with another element, so to speak, in so far as it forms part of an automated decision. This follows from the very wording of the provision which, for the purposes of the prohibition laid down therein, largely focuses on decisions based on profiling – or on another form of automated data processing – but not on profiling itself.[8] Nevertheless, according to Recital 72 GDPR, profiling is subject to the rules of that Regulation governing the processing of personal data, such as the legal grounds for processing or the data protection principles.

15) Why did the legislator leave gaps to be filled by the judge? As computer engineers have explained, in the exploration and mining of large data sets via data mining and machine learning, any decision that does not require human control to extract the outputs inferred by a learning agent is considered exclusively automated. This is why the GDPR requires that the effects on the legal sphere of the data subject be more than trivial; otherwise, the learning algorithms would be left without the raw material they need to evolve – and technological development would be compromised. To this extent, some computer engineers raise serious doubts about the feasibility of the GDPR’s provisions on this matter, because the fuzzy logic that underlies AI systems would not allow the average person to understand the inference process. The processing operations within AI make use of analytical models whose approximate predictions externalize fuzzy arguments that accept different degrees of truth (almost, maybe, somewhat), and not just the distinction between truth and falsehood (a toy illustration follows below).[9]
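
The following toy Python example illustrates the contrast the cited doctrine draws between crisp (true/false) logic and fuzzy degrees of truth. The proposition and the income thresholds are invented purely for illustration.

```python
# Minimal illustration of fuzzy ("degrees of truth") reasoning,
# as opposed to crisp true/false logic. Thresholds are invented.

def creditworthy_crisp(income: float) -> bool:
    """Classical logic: the proposition is simply true or false."""
    return income >= 2500.0

def creditworthy_fuzzy(income: float) -> float:
    """Fuzzy logic: 'the applicant is creditworthy' holds to a degree
    between 0.0 (entirely false) and 1.0 (entirely true)."""
    if income <= 1500.0:
        return 0.0
    if income >= 3500.0:
        return 1.0
    # Linear ramp between the two thresholds: partial truth.
    return (income - 1500.0) / 2000.0

for income in (1400.0, 2400.0, 3600.0):
    print(income, creditworthy_crisp(income),
          round(creditworthy_fuzzy(income), 2))
```

A verdict of "0.45 creditworthy" has no obvious everyday meaning, which is the engineers’ point: intermediate degrees of truth are hard for the average person to interpret, let alone to contest.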

16) What digital challenges does the ECJ face with this case law? From a global perspective, the problem in this case law lies in the opacity of inferences or predictions resulting from data analysis, particularly by AI systems – inferences whose application to everyday situations determines how each of us, as personal data subjects, is perceived and evaluated by others. In this sense, the ECJ’s interpretation of the provisions of Article 22 GDPR would still fall far short of solving the issue of inferences produced by AI systems. The ECJ is certainly aware of this, which is why – according to those who attended the Oral Hearing in Luxembourg on 26 January 2023 – Judge von Danwitz asked whether SCHUFA uses self-learning algorithms to establish the credit score, and whether SCHUFA considers that it is legally allowed to use such algorithms. In its response, SCHUFA stated that it does not currently use self-learning algorithms but may do so in the future.[10] The cornerstone as regards inferred data lies in the intelligibility/explainability/reasonableness of the operation by which a digital system has produced a given inference. Ultimately, it lies in justification – without which it is impossible to effectively challenge the outcome of the processing of personal data, be it profiling or the automated decision resulting from it. This may depend on a reconsideration of data mining and profiling technologies, in order to make them more intelligible to the subject of the inferred data. Because in the legal sphere, it is not enough to inform: it is necessary to explain; and it is not enough to explain: it is necessary to justify.

17) What role can the ECJ be expected to play in upholding fundamental rights? Although the Articles of the GDPR do not expressly refer to an obligation of explainability, this is what a systematic interpretation of that Regulation requires, taking into account in particular Recital 71 GDPR (which refers to the right “to obtain an explanation of the decision reached after such assessment”), Article 5(1)(a) GDPR (lawfulness, fairness and transparency in relation to the data subject) and Article 12(1) GDPR (providing the data subject with information in a concise, transparent, intelligible and easily accessible form). Most importantly, this is the interpretation of secondary law that is compatible with primary EU law, insofar as Article 8(2) CFREU states that “everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.” For the interpretation of this provision, it is important to emphasize that the meaning of “collect” is, precisely, “to gather into a collection what is scattered, to infer, to deduce” – and, regarding the data thus collected, the legal basis highlighted by the CFREU relates to the fairness, purpose, consent, and lawfulness of the processing, in addition to the data subject’s access to personal data in order to obtain their rectification. To rectify means to make straight, align, correct, amend – and, in a broader sense, to respond to a less accurate assertion in order to restore the truth of the facts.

An extended version of this text entitled “Automated individual decision-making and profiling [on case C-634/21 – SCHUFA (Scoring)]” will be published in the forthcoming UNIO – EU Law Journal, vol. 8, No. 2 (2023).


[1] Advocate General’s Opinion in Case C-634/21, SCHUFA Holding and Others (Scoring), ECLI:EU:C:2023:220, 16 March 2023.

[2] On this subject see Alessandra Silveira, “Profiling and cybersecurity: a perspective from fundamental rights’ protection in the EU”, in Legal developments on cybersecurity and related fields, ed. Francisco Andrade/Joana Covelo Abreu/Pedro Freitas (Springer International Publishing, forthcoming); Sandra Wachter and Brent Mittelstadt, “A right to reasonable inferences: re-thinking data protection law in the age of big data and AI”, Columbia Business Law Review, No. 2 (2019); Tiago Sérgio Cabral, “AI and the Right to Explanation: Three Legal Bases under the GDPR”, in Data Protection and Privacy, Volume 13: Data Protection and Artificial Intelligence, ed. Paul De Hert et al. (Hart Publishing: 2021).

[3] Data Protection Working Party (WP29), Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, 2017. The WP29 (so called because it was established under the then Article 29 of Directive 95/46 that preceded the GDPR) ceased to function on May 25, 2018, with the start of the implementation of the GDPR. It was functionally succeeded by the current European Data Protection Board – whose main task is to ensure the effective and consistent application of the GDPR and other Union data protection legislation.

[4] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 26.

[5] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 23.

[6] Data Protection Working Party (WP29), Guidelines on Automated individual decision-making and Profiling…, cit.

[7] See César Analide and Diogo Morgado Rebelo, “Inteligência artificial na era data-driven, a lógica fuzzy das aproximações soft computing e a proibição de sujeição a decisões tomadas exclusivamente com base na exploração e prospeção de dados pessoais” [Artificial intelligence in the data-driven era, the fuzzy logic of soft computing approximations and the prohibition on being subject to decisions based exclusively on the exploration and mining of personal data], Fórum de proteção de dados, Comissão Nacional de Proteção de Dados, No. 6 (2019), Lisbon.

[8] See Request for a preliminary ruling of 1 October 2021, Case C-634/21, paragraph 20.

[9] See César Analide and Diogo Morgado Rebelo, “Inteligência artificial na era data-driven…”, cit.

[10] See Andreas Häuselmann, “The ECJ’s first landmark case on automated decision-making – a report from the Oral Hearing before the First Chamber”, European Law Blog, Blogpost 11/2023, 20 February 2023, available at: https://europeanlawblog.eu/2023/02/20/the-ecjs-first-landmark-case-on-automated-decision-making-a-report-from-the-oral-hearing-before-the-first-chamber/.

