A short introduction to accountability in machine-learning algorithms under the GDPR


by Andreia Oliveira, Master in EU Law (UMINHO)
and Fernando Silva, Consulting coordinator, Portuguese Data Protection National Commission

Artificial Intelligence (AI) can be defined as computer systems designed to perform a wide range of tasks that are “normally considered to require knowledge, perception, reasoning, learning, understanding and similar cognitive abilities” [1]. Intelligent machines capable of imitating human actions, performance and activities are perhaps the most common illustration of AI. One must nevertheless recognise that AI is a convoluted field: machine learning, big data and related concepts such as automation must hold a seat in any discussion of AI. Machine learning, for example, is defined as the ability of computer systems to improve their performance without explicitly programmed instructions: the system learns independently, without human intervention [2]. To do this, a machine-learning system develops new algorithms, different from the ones originally programmed, incorporating as new inputs what it has acquired during previous interactions.
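The learning process described above can be illustrated with a minimal sketch: a perceptron whose decision rule is not explicitly programmed but adjusted from examples, so that its behaviour after training depends on the data it was exposed to. The task, the features and all names below are hypothetical and chosen purely for illustration.

```python
# Minimal, illustrative sketch of "learning from experience":
# no decision rule is hand-coded; the weights are adjusted from examples.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn a linear classifier from (features, label) pairs; labels are 0 or 1."""
    n = len(examples[0][0])
    weights = [0.0] * n
    bias = 0.0
    for _ in range(epochs):
        for features, label in examples:
            activation = sum(w * x for w, x in zip(weights, features)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction  # 0 when correct; +1/-1 triggers an update
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error

    def predict(features):
        return 1 if sum(w * x for w, x in zip(weights, features)) + bias > 0 else 0

    return predict

# Hypothetical toy task: flag a message as spam (1) from two made-up features,
# e.g. (number of suspicious words, has attachment).
data = [([3, 1], 1), ([4, 0], 1), ([0, 0], 0), ([1, 0], 0)]
classify = train_perceptron(data)
```

The point for the legal discussion is that the resulting classifier is a product of the training data rather than of explicit instructions, which is precisely what complicates the attribution of accountability for its outputs.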

The capabilities of machine learning may put privacy and data protection in jeopardy. Ascertaining liability therefore becomes inevitable, and implies considering, inter alia, all plausible actors that can be called to account.

Under the General Data Protection Regulation (GDPR), the principle of accountability is intrinsically linked to the principle of transparency. Transparency empowers data subjects to hold data controllers and processors accountable and to exercise control over their personal data. Accountability requires transparency of processing operations; however, transparency does not in itself constitute accountability [3]. Rather, transparency acts as an aid to accountability – for instance, by helping to remove barriers such as opacity.

The first steps of a revolution with a set date (25 May 2018): the “new” General Data Protection regime


by Pedro Madeira Froufe, Editor


1. Homo digitalis[i] is increasingly present in all of us. It surrounds us, it captures us. Our daily life is digitalising rapidly. We live, factually and considerably, a virtual existence… but a very real one! The real and the virtual merge in our normal life; the frontiers between these dimensions of our existence are blurring. Yet, this high-tech life of ours does not seem to be easily framed by law. Law has its own time – for now barely compatible with the speed of technological development. Besides, faced with new realities, it naturally hesitates over the axiological (and therefore normative) path to follow. We must give law its time, without disregarding the growth of homo digitalis.

2. Well, today (25 May 2018) Regulation 2016/679 (GDPR) becomes applicable. From 25 January 2012 (the date the proposal for the Regulation was presented) until now, the problems concerning the protection of fundamental rights – in particular the guarantee of personal data security (Article 8 CFREU) – have become progressively clearer as the digital dimension of our lives has grown. Personal data have definitively acquired an economic importance that recently publicised media cases (for example, “Facebook vs. Cambridge Analytica”) underline. Their reuse for purposes other than those justifying their processing, their transaction and cross-referencing, together with the growing use of algorithms (so-called “artificial intelligence” techniques), have made it necessary to reinforce the uniform guarantees of citizens, the owners of increasingly digitised personal data.

The ultimate guide(line) to DPIA’s


by João Marques, member of the Portuguese Data Protection National Commission and member of CEDU

Although merely advisory in nature, the Article 29 Working Party (WP 29) has been a major force in guaranteeing a minimum of consistency in the application of Directive 95/46/EC, allowing Member States’ public and private sectors to know what to expect from their supervisory authorities’ perspectives on various data protection subjects. Its independence has played a major role in shaping its views and opinions, focusing on the fundamental rights at stake and delivering qualified feedback on the difficult issues it has faced.

The new European legal framework on data protection takes a step forward in this regard by instituting a new formal EU body – the European Data Protection Board, or EDPB (Article 68 of the General Data Protection Regulation – GDPR). This is a significant development in the European institutional landscape concerning data protection, but it does not mean that the WP 29 is already dead and buried – quite the opposite.

As is already known, the EDPB will have far-reaching powers designed to guarantee consistency and effectiveness in the application of the Regulation across the EU. One of those powers is the issuance of guidelines on several matters [Article 70(1)(d), (f), (g), (h), (i), (j), (k) and (m) of the GDPR].

The problem is, of course, that this new EU body will only exist from May 2018 onwards, leaving a gap of two years (from May 2016, when the Regulation entered into force) to be filled by the current legal and institutional frameworks. As such, the WP 29 took it upon itself to carry out these particular tasks of the EDPB during this transitional phase, fully aware that the guidelines it issues in the meantime could still be rebutted by the EDPB members. Nevertheless, this is a calculated risk, as the members currently sitting in the WP 29 will almost certainly be the ones sitting in the EDPB.


Protecting our personal data in the 21st century: why the new EU legal framework matters

by Rita de Sousa Costa, law student at UMinho
and Tiago Sérgio Cabral, law student at UMinho

Most people have no idea how much the processing of their personal data affects their daily life. In today’s world, our e-mail is able to distinguish between important and unimportant messages based on our previous communications. When we want to read the news, our phones and tablets are able to predict the events and sources that would interest us. Facebook knows more about our friends than we do. If you want to watch a movie, Netflix has a broad selection and may give you some tips based on your previously watched list, as does YouTube. If we have a favorite supermarket chain, it probably knows what we like to buy through our customer cards. Our keyboards are able to predict the very words we will type[i].

We would find a rather different scenario if we looked at the world in 1995. Back then, the Internet was still in its early stages of development and was rather different from what we know and use today[ii]. E-mail and instant messaging were unknown to the general population. Google and search engines did not exist. Social networking and smartphones did, but only in science-fiction movies. With this in mind, it is rather astonishing that the EU legal framework for the protection of personal data managed to remain more or less unchanged for over twenty years. Throughout those years, Directive 95/46/EC ensured the protection of personal data for EU citizens, fulfilling the requirements of Article 16 TFEU and Article 8 of the CFREU[iii]/[iv].
