Artificial intelligence: 2020 A-level grades in the UK as an example of the challenges and risks

by Piedade Costa de Oliveira (Former official of the European Commission - Legal Service)
Disclaimer: The opinions expressed are purely personal and are the exclusive responsibility of the author. They do not reflect any position of the European Commission.

The use of algorithms for automated decision-making, commonly referred to as Artificial Intelligence (AI), is becoming a reality in many fields of activity both in the private and public sectors.

It is common ground that AI raises considerable challenges not only for the field in which it is deployed but also for society as a whole. As pointed out by the European Commission in its White Paper on AI[i], AI entails a number of potential risks, such as opaque decision-making, gender-based bias or other kinds of discrimination, or intrusion upon privacy.

In order to mitigate such risks, Article 22 of the GDPR confers on data subjects the right not to be subject to a decision based solely on automated processing which produces legal effects concerning them or similarly significantly affects them[ii].

Continue reading “Artificial intelligence: 2020 A-level grades in the UK as an example of the challenges and risks”

The “mandatory” contact-tracing App “StayAway COVID” – a matter of European Union Law

by Alessandra Silveira, Joana Covelo de Abreu (Editors) and Tiago Sérgio Cabral (Managing Editor)

1. During the previous week there has been plenty of controversy regarding a proposal by the Portuguese Government to make the installation of the App “StayAway COVID” (“App”) – a mobile contact-tracing application designed to fight the pandemic – mandatory for large sections of the population. While the Government appears to have backed down from this idea (for now), the issue of European Union Law (“EU Law”) has been surprisingly absent from most of the debate around a measure of this nature, even though it should be front and centre, and precedes even the issue of constitutionality.

As we will show in this text, it is difficult to argue against the conclusion that this subject should be considered a matter of EU Law – and, consequently, that this is a question of fundamental rights protected by the European Union (“EU”). In the EU’s legal framework, privacy and personal data protection are fundamental rights enshrined in Article 16 of the Treaty on the Functioning of the EU and Articles 7 and 8 of the Charter of Fundamental Rights of the EU (CFREU). Since it is a matter regulated at EU level, the EU’s standard of fundamental rights protection is applicable before and above even the national constitutional standards of protection[i]. So, this is not just a Portuguese constitutional problem that can be solved in the light of the Portuguese Constitution – it is an issue of relevance to all European citizens which needs to be resolved in the light of the EU’s (jus)fundamental standards (see Article 51 CFREU).[ii] It is important to be aware that the Court of Justice of the EU (“ECJ”) has, in the past, set aside constitutional provisions of Member States to ensure the adequate protection of the fundamental rights to privacy and personal data protection[iii]. This is because not all Member States have the same level of (jus)fundamental protection.

2. Under the current legal framework in the EU, imposing the use of any contact-tracing application on the general public (or on large sections of it, such as everyone in the labour market, academia, schools and public administration) would always face some serious challenges.

Continue reading “The “mandatory” contact-tracing App “StayAway COVID” – a matter of European Union Law”

Editorial of April 2020

by Alessandra Silveira, Editor


Health-related personal data – regarding COVID-19 and digital surveillance

Article 9 of Regulation (EU) 2016/679 – the General Data Protection Regulation (hereinafter, “GDPR”) – prohibits the processing of special categories of personal data, amongst them (and the ones relevant for the subject of this essay): genetic data; biometric data for the purpose of uniquely identifying a natural person; and data concerning health. However, this prohibition does not apply if processing is necessary for the purposes of medical diagnosis; the provision of health care or treatment; the management of health care systems; or pursuant to a contract with a health professional, in accordance with point (h) of Article 9(2) GDPR and under the further conditions established in Article 9(3). In particular, the general prohibition does not apply if the “processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices”, under point (i) of Article 9(2).
Continue reading “Editorial of April 2020”

Editorial of December 2019

by João Marques, member of the Portuguese Data Protection National Commission


Portuguese DPA won’t apply the country’s GDPR law

In spite of its nature[i], the GDPR leaves some room for manoeuvre to the Member States. This European legal instrument has even been called a hybrid[ii] between a directive and a regulation, precisely because there is a significant number of issues on which national legislation can in fact diverge from the general solutions the GDPR brings to the table. Although such leeway is not to be mistaken for a “carte blanche” for the Member States, there is nevertheless a relevant part to be played by national legislators.

From the definition of a minimum legal age for children’s consent to be considered valid for their personal data to be processed (in relation to information society services), which can vary between 13 and 16 years of age, to the waiver on fines being applied to the public sector (Article 83(7)), there is a vast array of subjects left for the Member States to determine. In fact, a whole chapter of the GDPR[iii] is dedicated to these subjects, namely: Processing and freedom of expression and information (Article 85); Processing and public access to official documents (Article 86); Processing of the national identification number (Article 87); Processing in the context of employment (Article 88); Safeguards and derogations relating to processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes (Article 89); Obligations of secrecy (Article 90); and Existing data protection rules of churches and religious associations (Article 91).

Additionally, matters of procedural law are, according to the principle of conferral (Article 5 of the Treaty on European Union), almost entirely left for Member States to regulate, with few exceptions such as the deadlines and (in)formalities of the reply to a data subject rights request (Article 12) and, most notably, the one-stop-shop procedure (established in Article 60) and all its related issues, which are undertaken by the European Data Protection Board, the new European Union body created by the GDPR (Section 3 of Chapter VII).

The task that lay ahead of the Portuguese legislator, concerning the national reform of the Data Protection Law[iv], was therefore demanding but framed in a way that should have helped steer its drafting in a comprehensive and relatively straightforward manner[v].

The legislative procedure in Portugal took some time to be jump-started, and it was not until 22 March 2018 that a proposal from the Government was finally approved and forwarded to the Parliament, as this is a matter within its competence under Article 165(1)(b) of the Portuguese Constitution.
Continue reading “Editorial of December 2019”

Editorial of October 2019

by Tamara Álvarez Robles, Lecturer at the University of Vigo


On the reform of national law on data protection: the special incorporation of digital rights in Spain

The reform of the Spanish Organic Law on Data Protection (LO 3/2018), adapting it to the General Data Protection Regulation, has introduced, alongside the European requirements, a catalogue of digital rights. Title X, “Guarantee of digital rights”, is undoubtedly one of the biggest novelties in data protection regulation. It comprises Articles 79 to 97, which present, for the first time in the Spanish national legislative sphere, the new generation of digital rights[i]: inter alia, the right to Internet neutrality, the right to digital security, the right to digital education, the protection of minors on the Internet, the right to rectification on the Internet, the right to privacy and the use of digital devices in the workplace, the right to digital disconnection in the workplace, and the right to a digital testament.

The in extremis inclusion of the present Title X, on digital rights, through an amendment of the Congress of Deputies dated 18 April 2018, responds to the fundamental importance of the ever-present and dominant reality of the Internet, which reaches all spheres of our lives. That is why Organic Law 3/2018, in section IV of its Preamble, already points to the involvement of public authorities through the provision of public policies (Article 9.2 SC) in order to give effect to the catalogue of digital rights on the basis of the principle of equality (Article 14 SC), stating that: “it is the responsibility of the public authorities to promote policies that make effective the rights of citizens on the Internet, promoting the equality of citizens and of the groups in which they are integrated, in order to make possible the full exercise of fundamental rights in the digital reality”.
Continue reading “Editorial of October 2019”

Editorial of September 2019

by Alessandra Silveira, Editor
and Tiago Cabral, Master's student in EU Law at UMinho


Google v. CNIL: Is a new landmark judgment for personal data protection on the horizon?

1. In the 2014 landmark Judgment Google Spain (C-131/12), the Court of Justice of the European Union (hereinafter, “ECJ”) was called upon to answer the question of whether data subjects have the right to request that some (or all) search results referring to them be suppressed from a search engine’s results. In its decision, the ECJ clarified that search engines engage in data processing activities and recognised the data subject’s right to have certain results suppressed (even if maintained on the original webpage).

2. This right found its legal basis in Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (hereinafter, “Directive 95/46”), jointly with Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union (hereinafter, “Charter”). In accordance with the Court’s decision, it can be exercised against search engines acting as data controllers (Google, Bing, Ask, amongst others) and does not depend on effective harm having befallen the data subject due to the inclusion of personal data in the search engine’s results. The data subject’s rights should override the economic interests of the data controller and the public’s interest in having access to the abovementioned information, unless a pressing public interest in having access to that information is present.

3. Google Spain offered some clarity on a number of extremely relevant aspects, such as: i) the [existence of] processing of personal data by search engines; ii) their status as data controllers under EU law; iii) the applicability of the EU’s data protection rules even if the undertaking is not headquartered in the Union; iv) the obligation of a search engine to suppress certain results containing personal data at the request of the data subject; v) the extension, range and (material) limits of the data subjects’ rights. The natural conclusion to arrive at is that Google Spain granted European citizens the right to no longer be linked by name to a list of results displayed following a search made on the basis of that name.

4. What the judgment did not clarify, however, is the territorial scope of the right (i.e. where in the world does the connection have to be suppressed?). Is it a global obligation? European-wide? Only within the territory of a specific Member State? In 2018, the European Data Protection Board (hereinafter, “EDPB”) issued Guidelines on the territorial scope of the GDPR, but their focus is Article 3 of the legal instrument and therefore they offer no clarity on this issue (even if they did, they would not bind the ECJ).
Continue reading “Editorial of September 2019”

A short introduction to accountability in machine-learning algorithms under the GDPR

by Andreia Oliveira, Master in EU Law (UMinho)
and Fernando Silva, Consulting coordinator - Portuguese Data Protection National Commission

Artificial Intelligence (AI) can be defined as computer systems designed to perform a wide range of tasks that are “normally considered to require knowledge, perception, reasoning, learning, understanding and similar cognitive abilities” [1]. Intelligent machines capable of imitating human actions, performances and activities seem to be the most common illustration of AI. One needs to recognise that AI is a convoluted field – thus, machine learning, big data and other terms such as automation must have a seat at the table when discussing AI. Machine learning, for example, is defined as the ability of computer systems to improve their performance without explicitly programmed instructions: a system will be able to learn independently, without human intervention [2]. To do this, a machine learning system refines its own decision rules, different from the ones that were initially programmed, incorporating the new inputs it has acquired during previous interactions.
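The learning loop described above can be illustrated with a deliberately minimal sketch (a toy example written for this text, not any production ML system): the program is never told the rule separating the two classes; it infers a decision threshold from labelled examples and adjusts it with every interaction.

```python
# Toy illustration of "learning without explicitly programmed rules":
# the classification rule emerges from the data, via error-driven updates.

def train_threshold(samples, epochs=50, lr=0.1):
    """Learn a cut-off value separating the two classes in `samples`.

    `samples` is a list of (value, label) pairs with labels 0 or 1.
    """
    threshold = 0.0
    for _ in range(epochs):
        for x, label in samples:
            prediction = 1 if x > threshold else 0
            # The update is driven by the prediction error, not by a
            # hand-written rule: misclassifying a negative example pushes
            # the threshold up; misclassifying a positive pushes it down.
            threshold += lr * (prediction - label)
    return threshold

def predict(threshold, x):
    """Apply the learned rule to a new input."""
    return 1 if x > threshold else 0

# Labelled "interactions": inputs below 5 are class 0, above 5 are class 1.
data = [(2, 0), (3, 0), (4, 0), (6, 1), (7, 1), (8, 1)]
learned = train_threshold(data)
# The system now classifies all training examples correctly, even though
# the rule "above roughly 4-5 means positive" was never programmed into it.
assert all(predict(learned, x) == y for x, y in data)
```

The point of the sketch is the direction of information flow: the programmer writes only the update procedure, while the decision rule itself is a by-product of the data the system has seen.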

The capabilities of machine learning may put privacy and data protection in jeopardy. Therefore, ascertaining liability becomes inevitable and implies considering, inter alia, all plausible actors that can be called to account.

Under the General Data Protection Regulation (GDPR), the principle of accountability is intrinsically linked to the principle of transparency. Transparency empowers data subjects to hold data controllers and processors accountable and to exercise control over their personal data. Accountability requires transparency of processing operations; however, transparency does not by itself constitute accountability [3]. Rather, transparency acts as an enabler of accountability – e.g. helping to remove barriers such as opacity.
Continue reading “A short introduction to accountability in machine-learning algorithms under the GDPR”

Trends shaping AI in business and main changes in the legal landscape

by Ana Landeta, Director of the R+D+i Inst. at UDIMA
and Felipe Debasa, Director of the ONSSTKT21stC at URJC

Without a doubt and under the European Union policy context, “Artificial Intelligence (AI) has become an area of strategic importance and a key driver of economic development. It can bring solutions to many societal challenges from treating diseases to minimising the environmental impact of farming. However, socio-economic, legal and ethical impacts have to be carefully addressed”[i].

Accordingly, organisations are starting to make moves that act as building blocks for imminent change and transformation. With that in mind, Traci Gusher-Thomas[ii] has identified four trends that demonstrate how machine learning is starting to bring real value to the workplace. Each of the following four areas provides value to an organisation seeking to move forward with machine learning and adds incremental value that can scale up to be truly transformational.
Continue reading “Trends shaping AI in business and main changes in the legal landscape”

Internet, e-evidences and international cooperation: the challenge of different paradigms

by Bruno Calabrich, Federal circuit prosecutor (Brazil)


There is a crisis in the world today concerning e-evidence. Law enforcement authorities urgently need to access and analyse various kinds of electronic data for efficient investigations and criminal prosecutions. They need it not only for investigating and prosecuting so-called internet crimes: virtually any crime today can be committed via the internet, and even those which are not executed using the web can often be elucidated by information stored on one node or another of the internet. The problem is that enforcement authorities cannot always, or easily, access these data[i], as the servers where they are stored are frequently located in a different country. Thus, international cooperation is frequently a barrier to overcome so that the e-evidence can be obtained in a valid and useful way. And, today, the differences around the world in the legal structures available for this task may not be helping much.

The best-known instruments for obtaining electronic data stored abroad are the MLATs – Mutual Legal Assistance Treaties – agreements signed between two countries for cooperation in exchanging information and evidence (not restricted to internet evidence) to be used by authorities in investigations and formal accusations. The cooperation occurs from authority to authority, according to a bureaucratic procedure specified in each treaty: one country requesting the data (where it is needed) and the other (where it is located) providing it. But, in a fast-changing world, where crime and information move even faster, the MLATs are not proving to be the fastest or most efficient way. In Brazil, for instance, the success rate of cooperation with the United States through its MLAT roughly reaches 20% of cases. Brazil, the US and other countries do not seem to be satisfied with that.
Continue reading “Internet, e-evidences and international cooperation: the challenge of different paradigms”

Editorial of January 2019

 by Alexandre Veronese, Professor at University of Brasília


Article 13 and the vigilance dilemma

The first US battles about filtering

In light of the ongoing worldwide debate surrounding legal regimes over the internet, in particular the recent controversies over proposed amendments to applicable EU rules, such as Directive 96/9, Directive 2001/29 or Directive 2012/28, but most notably Article 13 of the (soon-to-be) Directive on Copyright in the Digital Single Market, it is of the utmost importance to seek some perspective. The topic is as relevant as it is complex, with a range of aspects to consider. For instance, one of the approaches the EU is taking to the matter involves the use of the internet (or digital tools in general) for new cultural purposes, following the celebration in 2018 of the European Year of Cultural Heritage. In that regard, I had the opportunity to reflect upon this debate alongside Professor Alessandra Silveira, editor of the Blog of UNIO, and other colleagues in an excellent Portuguese podcast. In this post, I intend to shed some light on the global dimension of the matter by analysing the inaugural American experience.

At the beginning of the widespread usage of the Internet, American society was immersed in a debate about how to deal with offensive content. In the 1990s, the Internet had no boundaries and no firewalls to prevent the incoming waves of pornographic and unusual materials. Quickly, a political movement made a strong statement in order to protect American families from that threat. In 1996, the US Congress passed a bill named the Communications Decency Act, also known as the CDA. The bill was signed into law by then-President Bill Clinton. The CDA was intended to provide an effective system to take down offensive content. Some of the founders of the Internet launched a campaign against the CDA, with the now widely famous Electronic Frontier Foundation as the spearhead of the resistance. To this day, we remember the Declaration of the Independence of Cyberspace, written by John Perry Barlow. The major weapon of the resistance was the First Amendment of the US Constitution. Lawsuits were filed and, in a brief timespan, the US Supreme Court struck down the CDA, ruling it unconstitutional. The Supreme Court maintained the long-standing interpretation that the State must stay out of any action that could amount to censorship (Reno v. ACLU, 1997).
Continue reading “Editorial of January 2019”