Evaluating the legal admissibility of data transfers from the EU to the USA

Alessandra Silveira (Editor) and João Marques (Lawyer, former member of Portuguese Data Protection Supervisory Authority)

1. The feud between Maximillian Schrems and the Irish Data Protection Supervisory Authority (Data Protection Commission – DPC), with Facebook always lingering in the background, has been instrumental in framing the legality of data flows from the European Union (EU) to the United States of America (USA), but also to any third country that replicates the shortcomings arising from the absence of a “level of protection essentially equivalent to that guaranteed within the European Union (…), read in the light of the Charter of Fundamental Rights of the European Union” [in the words of the Court of Justice of the European Union (CJEU)].[1]

2. The sole action of one man has brought down two different and sequential “transfer tools”, created in tandem by the European Commission (EC) and the United States’ Government. In case C-362/14 the CJEU declared the Safe Harbour decision (Commission Decision 2000/520/EC of 26 July 2000) invalid, as the Court found that the USA’s legislation did not offer a level of protection essentially equivalent to that of the EU. The Court also reminded all Data Protection Supervisory Authorities that their work is never done: the task and responsibility of constantly monitoring whether any given third country offers, and continues to offer, such an equivalent level of protection rests upon their shoulders.


Editorial of December 2021

By Alessandra Silveira (Editor)

AI systems and automated inferences – on the protection of inferred personal data

On 23 November 2021 the European Commission published the results of the consultation on a set of digital rights and principles to promote and uphold EU values in the digital space – which ran between 12 May and 6 September 2021.[1] This public consultation on digital principles is a key deliverable of the preparatory work for the upcoming “Declaration on digital rights and principles for the Digital Decade”, which the European Commission will announce by the end of 2021. The consultation invited all interested people to share their views on the formulation of digital principles in 9 areas: i) universal access to internet services; ii) universal digital education and skills for people to take an active part in society and in democratic processes; iii) accessible and human-centric digital public services and administration; iv) access to digital health services; v) an open, secure and trusted online environment; vi) protecting and empowering children and young people in the online space; vii) a European digital identity; viii) access to digital devices, systems and services that respect the climate and environment; ix) ethical principles for human-centric algorithms.


Editorial of June 2021

By Tiago Sérgio Cabral (Managing Editor)

Data Governance and the AI Regulation: Interplay between the GDPR and the proposal for an AI Act

It is hardly surprising that the European Commission’s recent proposal for a Regulation on a European Approach for Artificial Intelligence (hereinafter the “proposal for an AI Act”) is heavily inspired by the GDPR. The influence ranges from taking note of the GDPR’s success in establishing worldwide standards to learning from its shortcomings, for example by suppressing the one-stop-shop mechanism (arguably responsible for some of the GDPR’s enforcement woes).[1]

The proposal for an AI Act should not be considered a GDPR for AI for one singular reason: there is already a GDPR for AI, and it is called the GDPR. The scope and aims of the proposal are different, but there is certainly a high degree of influence, and the interplay between the two Regulations, if the AI Act is approved, will certainly be interesting. In this editorial we will address one particular aspect where the interplay between the GDPR and the AI Act could be particularly relevant: data governance and data set management.

Before going specifically into this subject, it is important to know that the AI Act’s proposed fines have a higher ceiling than the GDPR’s: up to 30,000,000 euros or, if the offender is a company, up to 6% of its total worldwide annual turnover for the preceding financial year (article 71(3) of the proposal for an AI Act). We should note, nonetheless, that this specific value is applicable to a restricted number of infringements, namely:
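Under the “whichever is higher” logic of Article 71(3) (mirroring the fine structure of Article 83 GDPR), the ceiling for a company is the greater of the two amounts. A minimal sketch of that computation, with illustrative figures that are not taken from the proposal:

```python
def ai_act_fine_ceiling(turnover_eur=None):
    """Upper limit under Article 71(3) of the proposed AI Act:
    EUR 30 million or, for a company, 6% of its total worldwide
    annual turnover for the preceding financial year,
    whichever is higher."""
    fixed_cap = 30_000_000
    if turnover_eur is None:  # offender is not a company
        return fixed_cap
    return max(fixed_cap, 0.06 * turnover_eur)

# A company with EUR 2 billion in turnover: 6% (120 million)
# exceeds the fixed cap, so the turnover-based ceiling applies.
print(ai_act_fine_ceiling(2_000_000_000))  # 120000000.0
```

For smaller undertakings the fixed cap of 30 million euros remains the operative ceiling, since 6% of their turnover falls below it.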


Artificial intelligence: 2020 A-level grades in the UK as an example of the challenges and risks

by Piedade Costa de Oliveira (Former official of the European Commission - Legal Service)
Disclaimer: The opinions expressed are purely personal and are the exclusive responsibility of the author. They do not reflect any position of the European Commission.

The use of algorithms for automated decision-making, commonly referred to as Artificial Intelligence (AI), is becoming a reality in many fields of activity both in the private and public sectors.

It is common ground that AI raises considerable challenges not only for the area in which it operates but also for society as a whole. As pointed out by the European Commission in its White Paper on AI[i], AI entails a number of potential risks, such as opaque decision-making, gender-based bias or other kinds of discrimination, or intrusion on privacy.

In order to mitigate such risks, Article 22 of the GDPR confers on data subjects the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them[ii].


The “mandatory” contact-tracing App “StayAway COVID” – a matter of European Union Law

by Alessandra Silveira, Joana Covelo de Abreu (Editors) and Tiago Sérgio Cabral (Managing Editor)

1. During the previous week there has been plenty of controversy regarding a proposal by the Portuguese Government to make the installation of the App “StayAway COVID” (“App”) – a mobile contact-tracing application designed to fight the pandemic – mandatory for large sections of the population. While the Government appears to have backed down from this idea (for now), the issue of European Union Law (“EU Law”) has been surprisingly absent from most of the debate around a measure of this nature, even though it should be front and centre, and it precedes even the issue of constitutionality.

As we will show in this text, it is difficult to argue against the conclusion that this subject should be considered a matter of EU Law – and, consequently, that this is a question of fundamental rights protected by the European Union (“EU”). In the EU’s legal framework, privacy and personal data protection are fundamental rights enshrined in Article 16 of the Treaty on the Functioning of the EU and Articles 7 and 8 of the Charter of Fundamental Rights of the EU (CFREU). Since it is a matter regulated at EU level, the EU’s standard of fundamental rights’ protection is applicable before and above even the national constitutional standards of protection[i]. So, this is not just a Portuguese constitutional problem that can be solved in the light of the Portuguese Constitution – it is an issue of relevance to all European citizens which needs to be resolved in the light of the EU´s (jus)fundamental standards (see Article 51 CFREU).[ii] It is important to be aware that the Court of Justice of the EU (“ECJ”) has, in the past, struck down constitutional provisions of Member States to ensure the adequate protection of the fundamental rights to privacy and personal data protection[iii]. This is because not all Member States have the same level of (jus)fundamental protection.

2. Under the current legal framework in the EU, imposing the use of any contact-tracing application on the general public (or on large sections of the general public, such as the entire population within the labour market, academia, schools and public administration) would always face serious challenges.


Editorial of April 2020


by Alessandra Silveira, Editor


Health-related personal data – regarding COVID-19 and digital surveillance

Article 9 of the Regulation (EU) 2016/679 – General Data Protection Regulation (hereinafter, “GDPR”) prohibits the processing of special categories of personal data, amongst them (and the ones relevant for the subject of this essay): genetic data; biometric data for the purpose of uniquely identifying a natural person; and data concerning health. However, this prohibition shall not apply if processing is necessary for the purposes of medical diagnosis; the provision of health care or treatment; the management of health care systems; or pursuant to a contract with a health professional, in accordance with point h) of Article 9/2 of the GDPR and under the further conditions established in Article 9/3. In particular, the general prohibition shall not apply if the “processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices”, under point i) of Article 9/2.

Editorial of December 2019


by João Marques, member of the Portuguese Data Protection National Commission


Portuguese DPA won’t apply the country’s GDPR law

In spite of its nature[i], the GDPR leaves some room for manoeuvre to the Member States. This European legal instrument has even been called a hybrid[ii] between a directive and a regulation, precisely because there is a significant number of issues on which national legislation can in fact diverge from the general solutions the GDPR brings to the table. Although such leeway is not to be mistaken for a “carte blanche” for the Member States, there is nevertheless a relevant part to be played by national legislators.

From the definition of a minimum legal age for children’s consent to be considered valid for their personal data to be processed (in relation to information society services), which can vary between 13 and 16 years of age, to the waiver on fines being applied to the public sector (Article 83(7)), there is a vast array of subjects left for the Member States to determine. In fact, a whole chapter of the GDPR[iii] is dedicated to these subjects, namely: Processing and freedom of expression and information (Article 85); Processing and public access to official documents (Article 86); Processing of the national identification number (Article 87); Processing in the context of employment (Article 88); Safeguards and derogations relating to processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes (Article 89); Obligations of secrecy (Article 90); and Existing data protection rules of churches and religious associations (Article 91).

Additionally, matters of procedural law are, in accordance with the principle of conferral (Article 5 of the Treaty on European Union), almost entirely left for the Member States to regulate, with few exceptions, such as the deadlines and (in)formalities of the reply to a data subject rights request (Article 12) and, most notably, the one-stop-shop procedure (established in Article 60) and the related matters entrusted to the European Data Protection Board, the new European Union body created by the GDPR (Section 3 of Chapter VII).

The task that lay ahead of the Portuguese legislator, concerning the national reform of the Data Protection Law[iv], was therefore demanding but framed in a way that should have helped steer its drafting in a comprehensive and relatively straightforward manner[v].

The legislative procedure in Portugal took some time to be jump-started, and it was not until 22 March 2018 that a proposal from the Government was finally approved and forwarded to the Parliament, as this is a matter within its competence under Article 165(1)(b) of the Portuguese Constitution.

Editorial of October 2019


 by Tamara Álvarez Robles, Lecturer at the University of Vigo


On the reform of national law on data protection: the special incorporation of digital rights in Spain

The reform of the Spanish Organic Law on Data Protection (LO 3/2018), adapting it to the General Data Protection Regulation, has introduced, alongside the European requirements, a catalogue of digital rights. Title X, “Guarantee of digital rights”, is undoubtedly one of the biggest novelties in data protection regulation. It is composed of a set of Articles, 79 to 97, which present, for the first time in the Spanish national legislative sphere, the new generation of digital rights[i], inter alia: the right to Internet neutrality; the right to digital security; the right to digital education; the protection of minors on the Internet; the right to rectification on the Internet; the right to privacy and the use of digital devices in the workplace; the right to digital disconnection in the workplace; and the right to a digital testament.

The inclusion in extremis of this Title X on digital rights, through an amendment of the Congress of Deputies dated 18 April 2018, responds to the fundamental importance of the ever-present and dominant reality of the Internet, which reaches all spheres of our lives. That is why Organic Law 3/2018, in section IV of its Preamble, already points to the involvement of public authorities through the provision of public policies (Article 9.2 SC) in order to make the catalogue of digital rights effective on the basis of the principle of equality (Article 14 SC), stating that: “it is the responsibility of the public authorities to promote policies that make effective the rights of citizens on the Internet, promoting the equality of citizens and the groups in which they are integrated in order to make possible the full exercise of fundamental rights in the digital reality”.

Editorial of September 2019


 by Alessandra Silveira, Editor
 and Tiago Cabral, Master's student in EU Law at UMinho


Google v. CNIL: Is a new landmark judgment for personal data protection on the horizon?

1. In the 2014 landmark Judgment Google Spain (C-131/12), the Court of Justice of the European Union (hereinafter, “ECJ”) was called upon to answer the question of whether data subjects had the right to request that some (or all) search results referring to them be suppressed from a search engine’s results. In its decision, the ECJ clarified that search engines engage in data processing activities and recognised the data subject’s right to have certain results suppressed (even if maintained on the original webpage).

2. This right found its legal basis in Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (hereinafter, “Directive 95/46”), read jointly with Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union (hereinafter, “Charter”). In accordance with the Court’s decision, it can be exercised against search engines acting as data controllers (Google, Bing, Ask, amongst others) and does not depend on effective harm having befallen the data subject due to the inclusion of personal data in the search engine’s results. The data subject’s rights should override the economic interests of the data controller and the public’s interest in having access to the abovementioned information, unless a pressing public interest in access to that information is present.

3. Google Spain offered some clarity on a number of extremely relevant aspects, such as: i) the [existence of] processing of personal data by search engines; ii) their status as data controllers under EU law; iii) the applicability of the EU’s data protection rules even if the undertaking is not headquartered in the Union; iv) the obligation of a search engine to suppress certain results containing personal data at the request of the data subject; v) the extension, range and (material) limits of the data subjects’ rights. The natural conclusion to arrive at is that Google Spain granted European citizens the right to no longer be linked by name to a list of results displayed following a search made on the basis of said name.

4. What the judgment did not clarify, however, is the territorial scope of the right (i.e. where in the world does the connection have to be suppressed?). Is it a global obligation? European-wide? Only within the territory of a specific Member State? In 2018, the European Data Protection Board (hereinafter, “EDPB”) issued Guidelines on the territorial scope of the GDPR, but their focus is Article 3 of the legal instrument and therefore they offer no clarity on this issue (even if they did, they would not bind the ECJ).

A short introduction to accountability in machine-learning algorithms under the GDPR


 by Andreia Oliveira, Master in EU Law (UMINHO)
 and Fernando Silva, Consulting coordinator - Portuguese Data  Protection National Commission

Artificial Intelligence (AI) can be defined as computer systems designed to perform a wide range of activities that are “normally considered to require knowledge, perception, reasoning, learning, understanding and similar cognitive abilities” [1]. Having intelligent machines capable of imitating human actions, performances and activities seems to be the most common illustration of AI. One needs to recognise that AI is convoluted – thus, machine learning, big data and other terms, such as automation, must hold a seat when discussing AI. Machine learning, for example, is defined as the ability of computer systems to improve their performance without explicitly programmed instructions: a system will be able to learn independently, without human intervention [2]. To do this, a machine-learning system derives new rules, different from the ones that were explicitly programmed, from the inputs it has acquired during previous interactions.
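The contrast between explicitly programmed rules and rules learned from data can be illustrated with a toy sketch (hypothetical code, not drawn from any cited source): instead of the programmer hard-coding a decision threshold, the system derives one from labelled examples it has previously seen.

```python
def learn_threshold(examples):
    """Learn a one-dimensional decision threshold from labelled
    examples (value, label) instead of hard-coding the rule.
    Labels: 0 = negative class, 1 = positive class."""
    positives = [x for x, label in examples if label == 1]
    negatives = [x for x, label in examples if label == 0]
    # Place the decision boundary midway between the two classes.
    return (max(negatives) + min(positives)) / 2

# The "rule" (a threshold of 5.0) is derived from the data,
# not written into the program in advance.
data = [(2.0, 0), (3.5, 0), (4.0, 0), (6.0, 1), (7.5, 1), (9.0, 1)]
print(learn_threshold(data))  # 5.0
```

With different training data the learned rule changes, without any change to the program itself – the point being made above about systems improving their behaviour from interaction rather than from new instructions.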

The capabilities of machine learning may put privacy and data protection in jeopardy. Therefore, ascertaining liability becomes inevitable and implies considering, inter alia, all plausible actors that can be called to account.

Under the General Data Protection Regulation (GDPR), the principle of accountability is intrinsically linked to the principle of transparency. Transparency empowers data subjects to hold data controllers and processors accountable and to exercise control over their personal data. Accountability requires transparency of processing operations; however, transparency does not in itself constitute accountability [3]. Rather, transparency acts as a helper of accountability – e.g. by removing barriers such as opacity.