Editorial of September 2019


 by Alessandra Silveira, Editor
 and Tiago Cabral, Master's student in EU Law at UMinho


Google v. CNIL: Is a new landmark judgment for personal data protection on the horizon?

1. In the 2014 landmark judgment Google Spain (C-131/12), the Court of Justice of the European Union (hereinafter, “ECJ”) was called upon to answer the question of whether data subjects have the right to request that some (or all) search results referring to them be suppressed from a search engine’s results. In its decision, the ECJ clarified that search engines engage in data processing activities and recognised the data subject’s right to have certain results suppressed (even if the content is maintained on the original webpage).

2. This right found its legal basis in Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (hereinafter, “Directive 95/46”), read jointly with Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union (hereinafter, “Charter”). In accordance with the Court’s decision, it can be exercised against search engines acting as data controllers (Google, Bing, Ask, amongst others) and does not depend on effective harm having befallen the data subject as a result of the inclusion of personal data in the search engine’s results. The data subject’s rights should override the economic interests of the data controller and the public’s interest in having access to the abovementioned information, unless a pressing public interest in having access to that information is present.

3. Google Spain offered some clarity on a number of extremely relevant aspects, such as: i) the [existence of] processing of personal data by search engines; ii) their status as data controllers under EU law; iii) the applicability of the EU’s data protection rules even if the undertaking is not headquartered in the Union; iv) the obligation of a search engine to suppress certain results containing personal data at the request of the data subject; v) the extension, range and (material) limits of the data subjects’ rights. The natural conclusion is that Google Spain granted European citizens the right to no longer be linked by name to a list of results displayed following a search made on the basis of that name.

4. What the judgment did not clarify, however, is the territorial scope of the right (i.e. where in the world does the connection have to be suppressed?). Is it a global obligation? EU-wide? Or limited to the territory of a specific Member State? In 2018, the European Data Protection Board (hereinafter, “EDPB”) issued Guidelines on the territorial scope of the GDPR, but their focus is Article 3 of that legal instrument and they therefore offer no clarity on this issue (and even if they did, they would not bind the ECJ).
Continue reading “Editorial of September 2019”


A short introduction to accountability in machine-learning algorithms under the GDPR


 by Andreia Oliveira, Master in EU Law (UMINHO)
 and Fernando Silva, Consulting coordinator – Portuguese Data Protection National Commission

Artificial Intelligence (AI) can be defined as computer systems designed to perform a wide range of tasks that are “normally considered to require knowledge, perception, reasoning, learning, understanding and similar cognitive abilities” [1]. Intelligent machines capable of imitating human actions, performances and activities seem to be the most common illustration of AI. One needs to recognise that AI is convoluted – thus, machine learning, big data and other terms such as automation must hold a seat when discussing AI. Machine learning, for example, is defined as the ability of computer systems to improve their performance without explicitly programmed instructions: a system is able to learn independently, without human intervention [2]. To do this, machine learning develops new algorithms, different from the ones previously programmed, and incorporates as new inputs what it has acquired during previous interactions.
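
This idea can be illustrated with a minimal sketch – not drawn from the original post, and assuming only the widely used scikit-learn library and a purely synthetic dataset – in which a simple model’s accuracy improves as it is trained on more examples, without any hand-written decision rules:

```python
# Minimal, illustrative sketch (assumes scikit-learn; the data are synthetic).
# The point: performance comes from examples, not from explicitly programmed rules.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic "past interactions": inputs X and observed outcomes y.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# No hand-coded decision rules: the model infers them from the examples it sees.
for n in (50, 200, 1000):
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train[:n], y_train[:n])
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>4} examples -> test accuracy {accuracy:.2f}")
```

The same system, fed more (or different) data, will typically behave differently – which is precisely why questions of accountability arise when such models process personal data.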

The capabilities of machine learning may put privacy and data protection in jeopardy. Therefore, ascertaining liability becomes inevitable and implies considering, inter alia, all plausible actors that can be called to account.

Under the General Data Protection Regulation (GDPR), the principle of accountability is intrinsically linked to the principle of transparency. Transparency empowers data subjects to hold data controllers and processors accountable and to exercise control over their personal data. Accountability requires transparency of processing operations; however, transparency does not in itself constitute accountability [3]. Rather, transparency acts as an enabler of accountability – for example, by helping to remove barriers such as opacity.
Continue reading “A short introduction to accountability in machine-learning algorithms under the GDPR”

Trends shaping AI in business and main changes in the legal landscape


 

by Ana Landeta, Director of the R+D+i Inst. at UDIMA
and Felipe Debasa, Director of the ONSSTKT21stC at URJC

Without a doubt, in the context of European Union policy, “Artificial Intelligence (AI) has become an area of strategic importance and a key driver of economic development. It can bring solutions to many societal challenges from treating diseases to minimising the environmental impact of farming. However, socio-economic, legal and ethical impacts have to be carefully addressed”[i].

Accordingly, organizations are starting to make moves that act as building blocks for imminent change and transformation. With that in mind, Traci Gusher-Thomas[ii] has identified four trends that demonstrate how machine learning is starting to bring real value to the workplace. Each of the following four areas, it is stated, provides value to an organisation seeking to move forward with machine learning and adds incremental value that can scale up to be truly transformational.
Continue reading “Trends shaping AI in business and main changes in the legal landscape”

Internet, e-evidences and international cooperation: the challenge of different paradigms


 

by Bruno Calabrich, Federal circuit prosecutor (Brazil)


There is a crisis in the world today concerning e-evidence. Law enforcement authorities urgently need to access and analyze various kinds of electronic data for efficient investigations and criminal prosecutions. They need it not only for investigating and prosecuting so-called internet crimes: virtually any crime today can be committed via the internet, and even those which are not executed using the web can often be elucidated by information stored on one node of the internet or another. The problem is that enforcement authorities cannot always, or easily, access these data[i], as the servers where they are stored are frequently located in a different country. Thus, international cooperation is frequently a barrier to overcome so that the e-evidence can be obtained in a valid and useful way. And, today, the differences around the world in the legal structures available for this task may not be helping much.

The best-known instruments for obtaining electronic data stored abroad are the MLATs – Mutual Legal Assistance Treaties – agreements concluded between two countries to cooperate in exchanging information and evidence (not restricted to internet evidence) to be used by authorities in investigations and formal accusations. The cooperation occurs from authority to authority, according to a bureaucratic procedure specified in each treaty: one country (where the data is needed) requesting it and the other (where it is located) providing it. But in a fast-changing world, where crime and information move even faster, the MLATs are not proving to be the fastest or most efficient way. In Brazil, for instance, cooperation with the United States through the MLAT succeeds in roughly 20% of cases. Brazil, the US and other countries do not seem to be satisfied with that.
Continue reading “Internet, e-evidences and international cooperation: the challenge of different paradigms”

Editorial of January 2019


 by Alexandre Veronese, Professor at University of Brasília


Article 13 and the vigilance dilemma

The first US battles about filtering

In light of the ongoing worldwide debate surrounding legal regimes for the internet, in particular the recent controversies over proposed amendments to applicable EU rules, such as Directive 96/9, Directive 2001/29 or Directive 2012/28, but most notably Article 13 of the (soon-to-be) Directive on Copyright in the Digital Single Market, it is of the utmost importance to seek some perspective. The topic is as relevant as it is complex, with a range of aspects to consider. For instance, one of the approaches the EU is taking to the matter involves the use of the internet (or digital tools in general) for new cultural purposes, following the celebration in 2018 of the European Year of Cultural Heritage. In that regard, I had the opportunity to reflect upon this debate alongside Professor Alessandra Silveira, editor of the Blog of UNIO, and other colleagues in an excellent Portuguese podcast. In this post, I intend to shed some light on the global dimension of the matter by analysing the inaugural American experience.

At the beginning of the widespread use of the Internet, American society was immersed in a debate about how to deal with offensive content. In the 1990s, the Internet had no boundaries and no firewalls to hold back the incoming waves of pornographic and unusual material. Quickly, a political movement made a strong statement in order to protect American families from that threat. In 1996, the US Congress passed a bill named the Communications Decency Act, also known as the CDA. The bill was signed into law by then-President Bill Clinton. The CDA was intended to provide an effective system to take down offensive content. Some of the founders of the Internet launched a campaign against the CDA. The now widely famous Electronic Frontier Foundation was the spearhead of the resistance. To this day, we remember the Declaration of the Independence of Cyberspace, written by John Perry Barlow. The major weapon of the resistance was the First Amendment of the US Constitution. Lawsuits were filed and, in a brief timespan, the US Supreme Court struck down the CDA as unconstitutional. The Supreme Court maintained its long-standing interpretation that the State must refrain from any action that amounts to any possible kind of censorship (Reno v. ACLU, 1997).
Continue reading “Editorial of January 2019”

The US CLOUD Act and EU Law


 by Alexandre Veronese, Professor at University of Brasília

In March 2018, the President of the United States of America signed into law a bill approved by Congress which amended two parts of the US Code, the consolidation of the country’s federal statutory law. The Clarifying Lawful Overseas Use of Data Act – the CLOUD Act – was the third version, following two preceding bills. Those prior bills tried to solve a grave contemporary issue: the difficulty of accessing electronic data that may be necessary for criminal investigations and prosecutions. The new CLOUD Act changes mainly two passages of the US Code. It creates the possibility for the United States and foreign countries to sign executive agreements granting mutual assistance in order to authorize the gathering of overseas data. In addition, the CLOUD Act creates standards for those agreements.

The United States has long-standing constitutional protections against unreasonable searches and seizures, entrenched in the Fourth Amendment. The debate about the limits on access to information captured through new means of communication is therefore rather old in the US. The Federal Wiretap Act entered the US Code as part of the Omnibus Crime Control and Safe Streets Act of 1968. It was a major alteration of Title 18 of the US Code, the federal statutory law on crimes and criminal procedure. The federal statute thereby received provisions regulating lawful wiretapping in criminal investigations and the use of intercepted material between agencies and jurisdictions. Notwithstanding, the passing of time and the evolution of technology revealed the aging of those legal norms. Much of the information that matters to seize, in order to gather effective evidence for use in investigations, came to be electronic. It was necessary to modify the Wiretap Act and, in 1986, the Electronic Communications Privacy Act was enacted. The new Act modernized the law and regulated the criminal aspects of stored electronic information – the Stored Communications Act. The Patriot Act (2001 and 2006) brought to light some provisions regarding overseas information, which were made more detailed by the amendments signed into law in 2008.
Continue reading “The US CLOUD Act and EU Law”

Editorial of July 2018


 by Alessandra Silveira, Editor 
 and Sophie Perez Fernandes, Junior Editor


Artificial intelligence and fundamental rights: the problem of regulation aimed at avoiding algorithmic discrimination

The scandal involving Facebook and Cambridge Analytica (a private company for data analysis and strategic communication) raises, among other issues, the problem of regulating learning algorithms. And the problem lies above all in the fact that there is no necessary connection between intelligence and free will. Unlike human beings, algorithms do not have a will of their own; they serve the goals that are set for them. Though spectacular, artificial intelligence bears little resemblance to the mental processes of humans – as the Portuguese neuroscientist António Damásio, Professor at the University of Southern California, brilliantly explains[i]. To this extent, not all impacts of artificial intelligence are easily regulated or translated into legislation – and so traditional regulation might not work[ii].

In a study dedicated to explaining why data (including personal data) are at the basis of the Machine-Learning Revolution – and to what extent artificial intelligence is reconfiguring science, business and politics – another Portuguese scientist, Pedro Domingos, Professor in the Department of Computer Science and Engineering at the University of Washington, explains that the problem that defines the digital age is the following: how do we find each other? This applies both to producers and consumers – who need to establish a connection before any transaction happens – and to anyone looking for a job or a romantic partner. Computers made the Internet possible – and the Internet created a flood of data and the problem of limitless choice. Machine learning now uses this infinity of data to help solve the limitless-choice problem. Netflix may have 100,000 DVD titles in stock, but if customers cannot find the ones they like, they will end up choosing the hits; so Netflix uses a learning algorithm that identifies customer tastes and recommends DVDs. Simple as that, explains the Author[iii].
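
By way of illustration only – the following is a toy sketch, not the Author’s argument nor Netflix’s actual system, and the users, titles and ratings are invented – a learning algorithm of the kind described can recommend unseen titles to a customer by comparing his or her ratings with those of similar customers:

```python
# Toy recommender (illustrative only; hypothetical users, titles and ratings).
from math import sqrt

ratings = {  # user -> {title: rating}
    "ana":   {"Drama A": 5, "Thriller B": 4, "Comedy C": 1},
    "bruno": {"Drama A": 4, "Thriller B": 5, "Sci-fi D": 4},
    "clara": {"Comedy C": 5, "Sci-fi D": 2},
}

def similarity(u, v):
    """Cosine similarity between two users' rating vectors."""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][t] * ratings[v][t] for t in common)
    norm_u = sqrt(sum(r * r for r in ratings[u].values()))
    norm_v = sqrt(sum(r * r for r in ratings[v].values()))
    return dot / (norm_u * norm_v)

def recommend(user, k=2):
    """Score unseen titles by the similarity-weighted ratings of other users."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        weight = similarity(user, other)
        for title, rating in ratings[other].items():
            if title not in ratings[user]:
                scores[title] = scores.get(title, 0.0) + weight * rating
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("ana"))  # e.g. ['Sci-fi D']
```

Real services work with millions of users and far richer signals, but the principle is the same: the recommendations emerge from the accumulated data rather than from rules written in advance.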
Continue reading “Editorial of July 2018”

The first steps of a revolution with a set date (25 May 2018): the “new” General Data Protection regime


by Pedro Madeira Froufe, Editor


1. Homo digitalis[i] is increasingly present in all of us. It surrounds us, it captures us. Our daily life is digitalising rapidly. We live, in fact and to a considerable extent, a virtual existence… but a very real one! The real and the virtual merge in our normal life; the frontiers between these dimensions of our existence are blurring. Yet, this high-tech life of ours does not seem to be easily framed by law. Law has its own time – for now barely compatible with the speed of technological developments. Besides, in the face of new realities, it naturally hesitates over the value-based (and therefore normative) path to follow. We must give law (its) time, without disregarding the growth of homo digitalis.

2. Well, today (25 May 2018) the application of Regulation 2016/679 (GDPR) begins. From 25 January 2012 (the date of the presentation of the proposal for the Regulation) until now, the problems concerning the protection of fundamental rights – in particular the guarantee of personal data protection (Article 8 CFREU) – have become progressively clearer as a result of the growing digital dimension of our lives. Personal data have definitively acquired an economic importance that recently publicised media cases (for example, the “Facebook/Cambridge Analytica” affair) underline. The reuse of such data for purposes other than those justifying their processing, their trading and cross-referencing, together with the growing use of algorithms (so-called “artificial intelligence” techniques), have made it necessary to reinforce the uniform guarantees afforded to citizens, the owners of personal data, who are increasingly digitised.
Continue reading “The first steps of a revolution with a set date (25 May 2018): the “new” General Data Protection regime”

The ultimate guide(line) to DPIA’s


by João Marques, member of the Portuguese Data Protection National Commission and member of CEDU

Although merely advisory in nature, the Article 29 Working Party (WP 29) has been a major force in guaranteeing a minimum of consistency in the application of Directive 95/46/EC, allowing Member States’ public and private sectors to know what to expect from their supervisory authorities’ perspectives on various data protection subjects. Its independence has played a major role in the definition of its views and opinions, focusing on the fundamental rights at stake and delivering qualified feedback on the difficult issues it has faced.

The new European legal framework on data protection has taken a step forward in this regard by instituting a new formal EU body – the European Data Protection Board (EDPB) (Article 68 of the General Data Protection Regulation – GDPR). This will represent a significant development in the European institutional landscape concerning data protection, but it does not mean that the WP 29 is already dead and buried – quite the opposite.

As is already known, the EDPB will have far-reaching powers designed to guarantee the consistency and effectiveness of the Regulation’s rules across the EU. One of those powers is the issuance of guidelines on several matters [Article 70(1)(d), (f), (g), (h), (i), (j), (k) and (m) GDPR].

The problem is, of course, that this new EU body will only exist from May 2018 onwards, leaving a gap of two years (from May 2016, when the Regulation entered into force) to be filled by the current legal and institutional frameworks. As such, the WP 29 took it upon itself to carry out these particular tasks of the EDPB during this transitional phase, fully aware that the guidelines it issues for the time being could still be rebutted by the EDPB’s members. Nevertheless, this is a calculated risk, as the members currently sitting on the WP 29 will almost certainly be the ones sitting on the EDPB.

Continue reading “The ultimate guide(line) to DPIA’s”

Data Protection Officer according to GDPR


by André Mendes Costa, master’s student at the University of Minho

In an ever-changing world of information technologies, privacy and data protection inevitably attract considerable attention.

The Portuguese Data Protection Law and EU Directive 95/46 will soon be replaced by a new European and national legal framework. In fact, the new General Data Protection Regulation (GDPR) profoundly alters the paradigm of the personal data protection legal regime. Regulation 2016/679 (GDPR) is part of a new EU legislative package which also includes a directive laying down the procedures for the processing of personal data by the competent authorities for the purposes of the prevention, investigation, detection and prosecution of criminal offences or the execution of criminal penalties. The Regulation entered into force on 25 May and establishes a two-year transitional period (vacatio legis), providing the necessary time for the public and private sectors to equip themselves to face the new regulatory demands.

This brief analysis concentrates on the post of the data protection officer (DPO), on his/her duties and competencies, and on the entities responsible for his/her appointment.

The new European legislation brings an important paradigm shift in the protection of personal data, namely the elimination – with a few exceptions contained in the Regulation – of the requirement of prior notification to the National Data Protection Commission (NCDP). This change places on the person responsible for the processing of data the onus of acting as legal guarantor of its own processing operations, in full observance of the Regulation. In fact, where there is no prior notification to the competent authority (NCDP), the Regulation has found other ways of guaranteeing that the processing of personal data is legally protected, by creating the post of data protection officer (DPO).
Continue reading “Data Protection Officer according to GDPR”