Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling)

Alessandra Silveira (Editor)

1) What is new about this process? Article 22 GDPR is finally being considered before the European Court of Justice (ECJ) – and on 16 March 2023, the Advocate General’s Opinion in Case C-634/21 [SCHUFA Holding and Others (Scoring)][1] was published. Article 22 GDPR (apparently) provides a general prohibition of individual decisions based “solely” on automated processing – including profiling – but its provisions raise many doubts in legal doctrine.[2] Furthermore, Article 22 GDPR is limited to automated decisions that i) produce effects in the legal sphere of the data subject or that ii) significantly affect him/her in a similar manner. The content of the latter provision is not entirely clear, but as suggested by the Article 29 Data Protection Working Party (WP29), a “similar effect” can be interpreted as one significantly affecting the circumstances, behaviour or choices of data subjects – for example, decisions affecting a person’s financial situation, including their eligibility for credit.[3] To this extent, the effectiveness of Article 22 GDPR may be very limited until EU case law clarifies i) what a decision taken solely on the basis of automated processing is, and ii) to what extent such a decision produces legal effects or significantly affects the data subject in a similar manner.

2) Why is this case law so relevant? Profiling is a form of automated processing often used to make predictions about individuals – and it may, or may not, lead to automated decisions within the meaning of Article 22(1) GDPR. It involves collecting information about a person and assessing their characteristics or patterns of behaviour in order to place them in a particular category or group and to draw inferences or predictions from that placement – whether about their ability to perform a task, their interests or their presumed behaviour. To this extent, such automated inferences demand protection as inferred personal data, since they also make it possible to identify someone by association of concepts, characteristics or contents. The crux of the matter is that people are increasingly losing control over such automated inferences and over how they are perceived and evaluated by others. The ECJ has the opportunity to assess the existence of legal remedies to challenge operations which result in automated inferences that are not reasonably justified. As set out below, the approach adopted by the Advocate General has weaknesses – and if the ECJ adopts the conditions suggested by the Advocate General, many reasonable interpretative doubts about Article 22 GDPR will persist.

3) What questions does Article 22 GDPR raise? Does this Article provide for a right or, rather, a general prohibition whose application does not require the party concerned to actively invoke a right? What is a decision based “solely” on automated processing (which apparently excludes “largely” or “partially” automated decisions, but not “exclusively” automated ones)? Will the provisions of Article 22 GDPR only apply where there is no relevant human intervention in the decision-making process? If a human being examines and weighs other factors when making the final decision, will it not be made “solely” on the basis of the automated processing [and, in this situation, will the prohibition in Article 22(1) GDPR not apply]?

Continue reading “Finally, the ECJ is interpreting Article 22 GDPR (on individual decisions based solely on automated processing, including profiling)”

The GDPR may no longer be a paper tiger

Tiago Sérgio Cabral (Managing Editor)

1. It is a known fact that the General Data Protection Regulation (GDPR) has suffered from an enforcement problem. The theoretical administrative fines of up to €20 000 000 or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher, appear impressive on paper but largely failed to materialize in the first few years of application of the “new” data protection framework.
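As a rough back-of-the-envelope illustration (not legal advice), the “whichever is higher” rule of Article 83(5) GDPR can be sketched in a few lines of Python; the turnover figure below is purely hypothetical:

```python
def gdpr_fine_ceiling(annual_turnover_eur: float) -> float:
    """Maximum fine under Article 83(5) GDPR for an undertaking:
    the higher of €20 million and 4% of total worldwide annual
    turnover of the preceding financial year."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a hypothetical undertaking with €1 billion in annual turnover,
# the 4% branch applies: the ceiling is €40 million.
print(gdpr_fine_ceiling(1_000_000_000))  # 40000000.0
```

For smaller undertakings the fixed €20 million cap is the higher branch, which is why the rule bites hardest on large multinationals.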

2. Fines under the GDPR finally passed the €1 billion threshold in 2021, a sevenfold increase from 2020. In fact, fines under the GDPR have been growing steadily since 2018. Of course, one should not forget that a significant percentage of the total amount of fines levied in 2021 is accounted for by the €746 million fine levied by the Luxembourg Data Protection Supervisory Authority (DPA) against Amazon and the €225 million fine levied by the Irish DPA against WhatsApp. In addition, the total amount of the fines still pales in comparison with other areas, such as competition law.

Continue reading “The GDPR may no longer be a paper tiger”

Evaluating the legal admissibility of data transfers from the EU to the USA

Alessandra Silveira (Editor) and João Marques (Lawyer, former member of Portuguese Data Protection Supervisory Authority)

1. The feud between Maximillian Schrems and the Irish Data Protection Supervisory Authority (Data Protection Commission – DPC), with Facebook always lingering in the background, has been instrumental in framing the legality of data flows from the European Union (EU) to the United States of America (USA), but also to any third country that shares the same shortcoming – the absence of “a level of protection essentially equivalent to that guaranteed within the European Union (…), read in the light of the Charter of Fundamental Rights of the European Union” [in the words of the Court of Justice of the European Union (CJEU)].[1]

2. The sole action of one man has brought down two different and sequential “transfer tools”, created in tandem by the European Commission (EC) and the United States’ Government. In Case C-362/14 the CJEU declared the Safe Harbour decision (Commission Decision 2000/520/EC of 26 July 2000) invalid, as the Court found that the USA’s legislation did not offer a level of protection essentially equivalent to that of the EU. The Court also reminded all Data Protection Supervisory Authorities that their work is never done, and that the task and responsibility of constantly monitoring whether any given third country complies, and remains compliant, with the requirement to offer such equivalence rest upon their shoulders.

Continue reading “Evaluating the legal admissibility of data transfers from the EU to the USA”

Editorial of December 2021

By Alessandra Silveira (Editor)

AI systems and automated inferences – on the protection of inferred personal data

On 23 November 2021 the European Commission published the results of the consultation on a set of digital rights and principles to promote and uphold EU values in the digital space, which ran between 12 May and 6 September 2021.[1] This public consultation on digital principles is a key deliverable of the preparatory work for the upcoming “Declaration on digital rights and principles for the Digital Decade”, which the European Commission will announce by the end of 2021. The consultation invited all interested people to share their views on the formulation of digital principles in nine areas: i) universal access to internet services; ii) universal digital education and skills for people to take an active part in society and in democratic processes; iii) accessible and human-centric digital public services and administration; iv) access to digital health services; v) an open, secure and trusted online environment; vi) protecting and empowering children and young people in the online space; vii) a European digital identity; viii) access to digital devices, systems and services that respect the climate and environment; ix) ethical principles for human-centric algorithms.

Continue reading “Editorial of December 2021”

Editorial of June 2021

By Tiago Sérgio Cabral (Managing Editor)

Data Governance and the AI Regulation: Interplay between the GDPR and the proposal for an AI Act

It is hardly surprising that the European Commission’s recent proposal for a Regulation on a European Approach for Artificial Intelligence (hereinafter the “proposal for an AI Act”) is heavily inspired by the GDPR – from taking note of the GDPR’s success in establishing worldwide standards to learning from its shortcomings, for example by suppressing the one-stop-shop mechanism (arguably responsible for some of its enforcement woes).[1]

The proposal for an AI Act should not be considered a GDPR for AI for one singular reason: there is already a GDPR for AI, and it is called the GDPR. The scope and aims of the proposal are different, but there is certainly a high degree of influence, and the interplay between the two Regulations, if the AI Act is approved, will certainly be interesting. In this editorial we will address one particular aspect where the interplay between the GDPR and the AI Act could be particularly relevant: data governance and data set management.

Before going specifically into this subject, it is important to note that the AI Act’s proposed fines have a higher ceiling than the GDPR’s: up to €30 000 000 or, if the offender is a company, up to 6% of its total worldwide annual turnover for the preceding financial year (Article 71(3) of the proposal for an AI Act). We should note, nonetheless, that this specific amount is applicable only to a restricted number of infringements, namely:

Continue reading “Editorial of June 2021”

Artificial intelligence: 2020 A-level grades in the UK as an example of the challenges and risks

by Piedade Costa de Oliveira (Former official of the European Commission - Legal Service)
Disclaimer: The opinions expressed are purely personal and are the exclusive responsibility of the author. They do not reflect any position of the European Commission.

The use of algorithms for automated decision-making, commonly referred to as Artificial Intelligence (AI), is becoming a reality in many fields of activity both in the private and public sectors.

It is common ground that AI raises considerable challenges not only for the area in which it is deployed but also for society as a whole. As pointed out by the European Commission in its White Paper on AI[i], AI entails a number of potential risks, such as opaque decision-making, gender-based bias or other kinds of discrimination, or intrusion on privacy.

In order to mitigate such risks, Article 22 of the GDPR confers on data subjects the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them[ii].

Continue reading “Artificial intelligence: 2020 A-level grades in the UK as an example of the challenges and risks”

The “mandatory” contact-tracing App “StayAway COVID” – a matter of European Union Law

by Alessandra Silveira, Joana Covelo de Abreu (Editors) and Tiago Sérgio Cabral (Managing Editor)

1. During the previous week there has been plenty of controversy regarding a proposal by the Portuguese Government to make the installation of the App “StayAway COVID” (“App”) – a mobile contact-tracing application designed to fight the pandemic – mandatory for large sections of the population. While the Government appears to have backed down from this idea (for now), the issue of European Union Law (“EU Law”) has been surprisingly absent from most of the debate around a measure of this nature, even though it should be front and centre, preceding even the issue of constitutionality.

As we will show in this text, it is difficult to argue against the conclusion that this subject should be considered a matter of EU Law – and, consequently, that this is a question of fundamental rights protected by the European Union (“EU”). In the EU’s legal framework, privacy and personal data protection are fundamental rights enshrined in Article 16 of the Treaty on the Functioning of the EU and Articles 7 and 8 of the Charter of Fundamental Rights of the EU (CFREU). Since this is a matter regulated at EU level, the EU’s standard of fundamental rights protection is applicable before, and even above, national constitutional standards of protection[i]. So, this is not just a Portuguese constitutional problem that can be solved in the light of the Portuguese Constitution – it is an issue of relevance to all European citizens which needs to be resolved in the light of the EU’s (jus)fundamental standards (see Article 51 CFREU).[ii] It is important to be aware that the Court of Justice of the EU (“ECJ”) has, in the past, struck down constitutional provisions of Member States to ensure the adequate protection of the fundamental rights to privacy and personal data protection[iii]. This is because not all Member States have the same level of (jus)fundamental protection.

2. Under the current legal framework in the EU, mandating the use of any contact-tracing application for the general public (or for large sections of the general public, such as the entire population within the labour market, academia, schools and public administration) would always face serious challenges.

Continue reading “The “mandatory” contact-tracing App “StayAway COVID” – a matter of European Union Law”

Editorial of April 2020


by Alessandra Silveira, Editor


Health-related personal data – regarding COVID-19 and digital surveillance

Article 9 of Regulation (EU) 2016/679 – the General Data Protection Regulation (hereinafter, “GDPR”) – prohibits the processing of special categories of personal data, amongst them (and the ones relevant for the subject of this essay): genetic data; biometric data for the purpose of uniquely identifying a natural person; and data concerning health. However, this prohibition shall not apply if processing is necessary for the purposes of medical diagnosis; the provision of health care or treatment; the management of health care systems; or pursuant to a contract with a health professional, in accordance with point (h) of Article 9(2) GDPR and under the further conditions established in Article 9(3). In particular, the general prohibition shall not apply if the “processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices”, under point (i) of Article 9(2).
Continue reading “Editorial of April 2020”

Editorial of December 2019


by João Marques, member of the Portuguese Data Protection National Commission


Portuguese DPA won’t apply the country’s GDPR law

In spite of its nature[i], the GDPR leaves some room for manoeuvre to the Member States. This European legal instrument has even been called a hybrid[ii] between a directive and a regulation, precisely because there is a significant number of issues on which national legislation can in fact diverge from the general solutions the GDPR brings to the table. Although such leeway is not to be mistaken for a “carte blanche” given to the Member States, there is nevertheless a relevant part to be played by national legislators.

From the definition of a minimum legal age for children’s consent to be considered valid for their personal data to be processed (in relation to information society services), which can vary between 13 and 16 years of age, to the waiver on fines being applied to the public sector (Article 83(7)), there is a vast array of subjects left for the Member States to determine. In fact, a whole chapter of the GDPR[iii] is dedicated to these subjects, namely: processing and freedom of expression and information (Article 85); processing and public access to official documents (Article 86); processing of the national identification number (Article 87); processing in the context of employment (Article 88); safeguards and derogations relating to processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes (Article 89); obligations of secrecy (Article 90); and existing data protection rules of churches and religious associations (Article 91).

Additionally, matters of procedural law are, in accordance with the principle of conferral (Article 5 of the Treaty on European Union), almost entirely left for Member States to regulate, with a few exceptions such as the deadlines and (in)formalities of the reply to a data subject rights request (Article 12) and, most notably, the one-stop-shop procedure (established in Article 60) and all its related and non-related issues undertaken by the European Data Protection Board, the new European Union body created by the GDPR (Section 3 of Chapter VII).

The task that lay ahead of the Portuguese legislator, concerning the national reform of the data protection law[iv], was therefore demanding, but framed in a way that should have helped steer its drafting in a comprehensive and relatively straightforward manner[v].

The legislative procedure in Portugal took some time to be jump-started, and it was not until 22 March 2018 that a proposal from the Government was finally approved and forwarded to the Parliament, as this is a matter within its competence under Article 165(1)(b) of the Portuguese Constitution.
Continue reading “Editorial of December 2019”

Editorial of October 2019


by Tamara Álvarez Robles, Lecturer at the University of Vigo


On the reform of national law on data protection: the special incorporation of digital rights in Spain

The reform of the Spanish Organic Law on Data Protection (LO 3/2018), adapting it to the General Data Protection Regulation, has introduced, alongside the European requirements, a catalogue of digital rights. Title X, “Guarantee of digital rights”, has undoubtedly been one of the biggest novelties in data protection regulation. It comprises a set of Articles, 79 to 97, which present, for the first time in the Spanish national legislative sphere, the new generation of digital rights[i]: inter alia, the right to Internet neutrality, the right to digital security, the right to digital education, the protection of minors on the Internet, the right to rectification on the Internet, the right to privacy and the use of digital devices in the workplace, the right to digital disconnection in the workplace, and the right to a digital testament.

The in extremis inclusion of the present Title X on digital rights, through an amendment of the Congress of Deputies dated 18 April 2018, responds to the fundamental importance of the ever-present and dominating reality of the Internet, which reaches all spheres of our lives. That is why Organic Law 3/2018, in Section IV of its Preamble, already points to the involvement of public authorities through the provision of public policies (Article 9.2 SC) in order to make effective the catalogue of digital rights on the basis of the principle of equality (Article 14 SC), stating that: “it is the responsibility of the public authorities to promote policies that make effective the rights of citizens on the Internet, promoting the equality of citizens and of the groups in which they are integrated in order to make possible the full exercise of fundamental rights in the digital reality”.
Continue reading “Editorial of October 2019”