Protection of workers’ personal data: General principles
Abstract
This working paper gives an overview of legal standards related to personal data protection. It explores trends, principles and good practices and brings them in relation to the world of work. The aim of this working paper is to give a global and updated outlook of the leading and basic legal principles and standards in this area. The focus is on data protection principles which have a general nature and which can be embedded in a global approach. This working paper attempts to expose and clarify general data protection principles, having in mind that these principles are applicable in the context of the evolving employment relationship, taking into account technological evolutions. An understanding of general data protection principles is considered necessary to comprehend their application in the work environment and to build further towards principles that relate to more specific areas and problem fields.
Introduction
Aim and scope
This study on “the legal protection of workers' personal data” aims to develop the knowledge base on this subject in light of legal and technological developments.
The purpose of this working paper is to give an overview of legal standards related to workers’ personal data protection and, from both a global and regional point of view, to explore trends, principles and good practices.
The legal landscape in the field of personal data protection has strongly evolved over time. Many global and regional initiatives have led to standards on personal data protection. The European Union came with a ‘General Data Protection Regulation’ (GDPR) in 2016. Twenty years earlier, the ILO adopted its Code of Practice on the protection of workers' personal data.1 Other forms of standards developed both within regional organisations as well as in many countries around the world.
The aim of this working paper is to give a global and updated outlook of the main and basic principles in this area, taking into account legal sources and principles from a comparative perspective. In light of this, the focus will be on data protection principles which have a general nature and which can be embedded in a global approach.
With this approach in mind, this working paper has limitations. The relevance and importance of specific areas of workers’ personal data protection should not be neglected. However, with a view to limiting this working paper to basic principles, more specific or complementary principles related to electronic or digital monitoring and surveillance, or specific rules in relation to health data, will not be fully elaborated. Based on the research undertaken for this working paper, the finding is not only that these specific areas are extremely important for updating the international knowledge base, but also that they deserve an in-depth follow-up study in their own right.
A similar approach is undertaken with regard to the discussion on artificial intelligence, robotisation and similar forms of automation. These developments bring new regulatory challenges for the world of work and for data protection law. While this area is sufficiently vast and specific to deserve a separate in-depth study, this report will, within its limits, remain sufficiently sensitive to capture the legal data protection issues arising from them.
Methodology
This study uses a varied set of legal research methods and sources, mainly involving desk research, including library searches and official websites (with specific attention to government departments for justice, labour and data protection), applying key terms based on the study outline. This involves available and relevant sources of regulation, including legislation, governmental decrees, collective agreements, recommendations, case law, as well as legal scholarship.
In addition, the study used an expert questionnaire with a view to receiving input from experts around the world in the cross-sectional field of employment and data protection law. It delivered more precise country-related information or clarified selected information. The questionnaire is available from the author of this working paper. The expert respondents involved are mentioned in the acknowledgements.
Structure
The structure of the report follows the logic of the aims and scope of the working paper. The working paper starts with a main introduction and a chapter briefly setting the scene, with references to the legal evolution of privacy and data protection concepts and standards. A specific chapter discusses global and regional standard setting in the field of data protection, paying attention to the global and regional outlook, but also to their mutual influence, their general perspectives and, for some, their specific relation to the work context. The issues of AI and ‘Industry 4.0’ developments are addressed in a specific chapter, given the various and complex challenges they raise in relation to data protection standards and in light of the evolution of AI-related standard-setting initiatives.
Understanding general principles
This working paper relies on legally relevant and grounded principles of personal data protection. However, before entering into the relevant global and legal sources in relation to general principles, it is proper to pay attention to their context and to indicate important frameworks of understanding.
1.1. Privacy and data protection
This working paper considers (the right to) personal data protection. The focus of analysis on data protection law follows a logic. The employment environment is a main source of data processing. Contractual obligations, personnel administration and human resources are legitimate motives for – and areas of – data collection and processing. However, an approach to personal data protection needs the broader horizon of the right to privacy. The need for this embedded context has different reasons.
The first reason is that privacy and data protection are overlapping and strongly interdependent legal notions. It is accepted that the right to privacy covers the right to data protection, even outside automatic or semi-automatic data processing.2 The case law of the European Court of Human Rights (ECtHR) is an example where rules on data protection have been conceived under the concept of the right to privacy.3 South Africa is another good example of this broader contextual approach. In this country, the right to personal data protection is seen as a derivative of the constitutional right to privacy.4 Other regional perspectives, such as the Indian approach, also show that the right to privacy is construed as protection of personal data.5
The second reason is that privacy is a reinforcing notion for personal data protection. The right to privacy was defined in 1890 by Warren and Brandeis as “the right to be let alone”.6 Since then, privacy protection has strongly evolved over time. It has increasingly provided important and various forms of protection in the employment context. It not only covers ‘private life’, or the ‘life away from the public’, but a much wider field, including the more ‘public’ context as well as the workplace and human interaction in a work context. The privacy notion has proven to be flexible, responsive and adaptive to new circumstances. The right to privacy fits with a human-in-command approach, which shows its relevance in discussions on AI and robotisation.7 In this respect, the report of the Commission on the Future of Work confirms the link between privacy risks, the generation of “large amounts of data on workers” as well as “algorithmic accountability in the world of work”.8
A third reason, related to the open-textured nature and responsiveness of the right to privacy, is its connection to the societal and economic context in which that right is promoted or protected. While the right to privacy, and data protection, is universally accepted as a human right, its understanding remains tied to a social and cultural, even politico-historical, context.9 This may mean that international-level principles need to take into account possible different jurisdictional approaches, though with a common baseline.10
1.2. The employment context
In this study, we aim to take both the legal nature of privacy and data protection as well as the specific context of the employment relationship into account. Based on an international and regional human rights perspective, workers have a right to privacy. However, this legal departure point needs some further qualification for the employment relationship.
An employment relationship implies, in a widely applied view, a relation of subordination or dependency.11 This implies that the worker’s personal freedom is limited to the extent that the employer is in principle entitled to manage and direct the work and thus to have a say over its workers’ personal behaviour, to obtain information and to control and discipline workers. Whereas the right to privacy encompasses the ‘right to be let alone’, the employment relationship gives the employer the ‘right not to leave alone’ its workers.
The right to privacy and data protection is, in principle, guaranteed for all workers, regardless of the type of employment relationship. The employment relationship must, therefore, also be envisaged in its most modern forms and privacy and data protection considerations must be taken into account in different contexts of employment. The issues of privacy and data protection are, for example, increasingly challenged in the context of ‘place and time independent’ forms of employment, (such as telework)12 or in the platform economy. As the analysis will take into account these diverse contexts of work, it may form a stepping stone towards the effective recognition that “
The European Working Party,14 in its Opinion N° 2/2017, confirms this broad approach towards the application of the EU’s GDPR:
-
“Where the word “employee” is used in this Opinion, WP29 does not intend to restrict the scope of this term merely to persons with an employment contract recognized as such under applicable labour laws. Over the past decades, new business models served by different types of labour relationships, and in particular employment on a freelance basis, have become more commonplace. This Opinion is intended to cover all situations where there is an employment relationship, regardless of whether this relationship is based on an employment contract.”15
In addition to this, it can be pointed out that the rights of data subjects are not only to be respected by employers; the provisions also imply respect by other parties, such as governments, HR providers, colleagues, subcontractors, workers’ organisations, and so on.
Beyond the employment relationship, an employment context involves a variety of rights and interests which are broader than purely those of the contracting parties. Legitimate interests of colleagues, clients or the wider public may exercise an influence on how the right to privacy or the right to data protection is approached, and they may also limit the exercise of this right in a work context. The worker’s right to privacy and data protection is therefore qualified by the employment relationship.16 This means that competing rights will be part of the discussion, requiring a reconciliation of rights and interests in the employment context. In an employment privacy discourse, reasonable privacy expectations have to be taken into consideration.
1.3. Role of technology
The world of work is confronted with increasing attention to privacy and data protection. The role of technological development is apparent. Automation and new technologies have not only challenged the world of work, they have also influenced the evolution of the right to privacy and data protection. More recent developments and technological evolutions, such as digitalisation, big data, the internet of things, artificial intelligence and robotisation, are affecting the world of work in such a way that attention to privacy and data protection grows in pace and relevance. It has been argued that the dynamic privacy concept has adapted itself over time at the pace of new technological challenges, and the idea of “privacy 4.0”17 has been seen as a response to ‘industry 4.0’ and other disruptive models that attempt to explain the complex future of the world of work.18 This not only marks the right to privacy as a ‘layered’ concept, it also confirms its technological responsiveness in a broader perspective.
Various (international) regulators have addressed the issue of personal data protection since the 1980s and 1990s, as will be shown below. Data protection laws and principles originated at a time when computerisation and electronic databases were central. During the last decades of the 20th century, technological evolutions brought not only computers but also the internet, with enormous potential for data processing and new electronic communication possibilities. Some of the data protection principles of today still reflect this background, although in a modernised context.
With the paradigm of data protection law evolving over time, there is still room for new evolutions. Increased attention to data protection has come in a world concerned about increased and ultimately unlimited possibilities of data processing, the centralisation and interconnection of data, the fast flow of data and disclosure of information, the ease of manipulation, information asymmetry, and so on. While the ‘big brother’ metaphor has often been used, it may be noted that data protection problems of the future need to take into account additional approaches, for example as a problem of vulnerability, powerlessness and dehumanisation.19
1.4. Changing work-life relations
Another crucial dimension for the future approach of privacy and data protection law in the employment context is the fast-evolving and changing world of work and its impact on the work-life context. This goes along with technological innovation allowing new ways of working. Several developments have increased the blurring of boundaries between the sphere of work and the sphere of private life. They lead to a new ‘privatisation’ of the workplace.
A first evolution is the rise of digital and online communication, such as e-mail and internet. In the employment context, issues have arisen with regard to workers using professional communication systems for personal reasons or bringing personal communication systems into the workplace (including because of ‘bring your own device’ policies). This mix of private and professional communication not only causes discussion with regard to the monitoring of communication but also to the limits to the (personal) use of communication means in an employment context.
A second development concerns the rise of the virtual workplace. For a number of years, the workplace has been gradually shifting towards a more digital workplace. This has come with a new way of looking at the employment relationship, allowing for work to be organised and performed digitally and sometimes in a more autonomous way on the part of the worker. It has given rise to a fast growth of home work in the form of telework, with new challenges for work-life boundary management, as confirmed by an ILO/Eurofound study.20
A third element is the impact of the use of social media on the work context or the employment relationship. Not only have businesses shown an interest in online recruiting and social media presence; the impact of the use of social media by workers on the employment relationship also creates additional tensions and new forms of work-private life interactions.
1.5. New privacy expectations
Along with the technological (r)evolution, work relations have to be understood, more than ever, in terms of privacy relations.21 The environment of work has developed in such a way that it becomes hard to escape from various forms of data processing, monitoring, tracking or generally connecting to the digital world. Furthermore, the idea that the workplace delivers a relatively simple context of employers supervising workers has to be left behind. Not only do management and supervision of work (and workers) come in new digital ways, they may also lie in the hands of different actors and even rest on a complex of automated systems beyond the individual’s or even the employer’s full control. The impact on privacy expectations is thus potentially significant. At the same time, however, legal developments towards privacy protection, in light of privacy expectations, are also broadening privacy protection in the work environment.
Since the right to privacy and data protection is to be seen as a fundamental human right, it is protected as such in many instruments and constitutions around the world. This ‘fundamental’ character means that justifications for privacy interferences should be assessed along the lines of human rights protection mechanisms. As the employment context will bring a variety of rights and interests into opposition with the right to privacy and data protection, principles of human rights protection will be relevant. An interference with, or limitation of, a fundamental right can be expected to at least require respect for principles including legitimacy, lawfulness, transparency and proportionality.22
However, specific issues of limitations arise in the employment context, due to the increasingly relevant presence of reasonable expectations of privacy. The employment relationship is a context in which the worker’s personal freedom and privacy are almost by definition exposed. Self-evidently, this does not mean that a worker cannot have a right to privacy. However, in an (employment) context, reasonable privacy expectations have to be taken into consideration. This concept has arisen from developments in (North) American as well as European legal systems.23
Global sources development
The search for general principles of personal data protection in a global perspective requires a focus on their relevance to the work environment, whereby international, regional and country perspectives and sources all need to be taken into account. The legal source framework will be developed, taking into account the global dynamics in international, regional and national instruments with regard to the right to privacy and data protection and the common grounds on which they may be based.
2.1. Global perspective
As mentioned before, the right to privacy and data protection results from a complex set of developments, including technological evolution, but also socio-political as well as legal change. This makes privacy approaches partly universally, partly culturally driven. Whereas privacy and data protection, as notions, first arose in Western legal systems, they have evolved throughout the globe. The notion of privacy appeared first in the U.S. legal system, but it was later conceptually imported, elaborated and adapted in Europe, where approaches based on human dignity and personality rights pre-existed.24
While the right to privacy further developed in the North American legal system, European jurisdictions became strongly influenced by the case law of the European Court of Human Rights, under the European Convention on Human Rights (1950), which steadily expanded the number of issues covered as well as the privacy concept itself.
One of the first international organisations to take the lead in responding to the increasing regulatory attention to data processing and the concerns of privacy protection was the OECD. The OECD Guidelines (see section 3.3) were strongly based on the American ‘FIPPS’,25 but they constituted the first international legal instrument in the field.
During the decades that followed, regulation of data protection gained momentum, mainly in Europe. The Council of Europe followed up with the adoption of Convention 108 with regard to personal data protection on 28 January 1981.26 As national European responses were somewhat scattered, and given the growing impact of the rising digital society, the European Union took the initiative to adopt legislation in 1995. The new millennium, with various challenges, including technological evolution, brought the European Union to modernise its legislation with the adoption of the General Data Protection Regulation (GDPR) in 2016.
In the meantime, the right to privacy and data protection gradually evolved in other regions of the world. Yet data protection laws have been a more recent phenomenon in other parts of the world, such as in Asian and African countries, or within the broad Latin America and Pacific region. While OECD member States like Australia have had data protection legislation since the eighties, most Central, Southeast and East Asian countries have introduced regulatory interventions only rather recently.27 Many African countries have more recently drafted data protection legislation, or are in the process of doing so.28 The initiatives in these other parts of the world also brought new driving forces in transnational cooperation on standard-setting in the field. As will be explained below, various regional initiatives have been taken in Africa, Asia and Latin America. It should be noted, however, that many of these initiatives either resonate with, reflect or take as a model the OECD or European data protection standards.
Against the background of these dynamics and global legal development, the European Union legislation, and mainly the GDPR, has been influencing data protection legislation around the world. It often stands as an example or benchmark for new data protection legislation.29
While the influence of the European standard-setting model attracts attention, this study will inevitably relate to the latest developments in the various parts and regions of the world, recognising that European legal sources related to privacy and data protection provide a strong and important benchmark.
At the same time, the main international instrument with a focus on data protection in the employment context is the ILO Code of Practice (1996). This document remains a central reference of data protection principles and will be referred to and used throughout this study.
2.2. Human rights sources
The human rights dimension of the issue of privacy and data protection cannot be overlooked. Various international documents make reference to it. The right to privacy is guaranteed by Article 12 of the Universal Declaration of Human Rights and Article 17 of the International Covenant on Civil and Political Rights.
Most regions in the world would now also recognize the right to privacy and/or data protection. The most important
Article 11 of the
2.3. Sources on data protection
Different international or regional organisations have addressed the right to data protection. The various initiatives give shape to general data protection principles. Furthermore, taking into account the need to give more guidance in this regard, data protection in the employment relationship has received specific attention.
a. UN instruments
The United Nations does not have a specific standard with regard to data protection, although it follows the subject very closely and has taken various initiatives. On 14 December 1990, the UN General Assembly adopted the Guidelines for the Regulation of Computerized Personal Data Files.
b. OECD
The OECD has been one of the first organisations to respond to the increase in automated data processing and to the concern to address the issue of data protection with an international instrument. On 23 September 1980, the OECD adopted its Guidelines on the Protection of Privacy and Transborder Flows of Personal Data.
c. Europe
The European legal order has to be seen both from the perspective of the European Union as well as the Council of Europe.
The origins of EU data protection legislation can be found in the Data Protection Directive 95/46/EC of 24 October 1995,39 with which the EU created a major legal instrument on the subject. In 2012, the European Commission took the initiative to reform the data protection legislation, taking into account considerations of new technological developments and the effective exercise of rights.40 This led to the adoption of the ‘General Data Protection Regulation’ (GDPR) in 2016.
The Council of Europe adopted Convention 108 with regard to personal data protection in 1981 (see above), which was modernised in 2018 (‘Convention 108+’).
d. Asia-Pacific
In the Asia-Pacific region, different initiatives with regard to privacy and data protection have been taken.
On 16 November 2016, an
APEC also adopted the “
For the APEC region, it is relevant to mention the Asia Pacific Privacy Authorities (APPA), a forum for privacy authorities in the Asia Pacific region.52 It gives privacy authorities in the region an opportunity to form partnerships, discuss best practices and share information on emerging technology, trends and changes to privacy regulation.53
e. Africa
The legal notions of privacy and data protection have gradually emerged in the African region. While the concept of privacy may be rather new to the cultural and legal traditions of African countries, the value of a regulatory approach has increased over the years. Different African countries have taken initiatives to regulate data protection and various African constitutions have adopted a right to privacy.54 These legislative initiatives have been promoted, supported and underpinned by African regional initiatives, partly in response to a need for benchmarking and harmonisation.
An important initiative came from ECOWAS, the intergovernmental organisation of Western African countries. The “
The African Union (AU), covering 55 African states, followed with a wider African initiative. It adopted in 2014 the African Union Convention on Cyber Security and Personal Data Protection (the ‘Malabo Convention’).
Another relevant African regional document is the revised
f. Latin America and the Caribbean
Latin American countries are slowly moving towards a regional development of common standards on privacy and data protection. Countries in this region have been working on data protection law reforms. Some of the major national reforms have been inspired by, or modelled on, the European GDPR, such as the cases of Argentina and Brazil, with a number of countries, such as Chile, Mexico and Uruguay, going in the same direction.61
Within the
2.4. Sources on data protection in the work environment
While general data protection instruments are in principle applicable in the employment context, some initiatives have been taken to create more specific guidance for the work environment. Three main organisations have to be mentioned.
a. ILO
Due to the need to develop data protection principles that specifically address the use of workers' personal data, the ILO published a Code of Practice concerning the protection of workers' personal data, adopted at a Meeting of Experts on Workers' Privacy of the ILO in 1996.66 The Preamble of the Code points out that its purpose is to provide guidance on the protection of workers' personal data. The instrument was not adopted as a Convention or Recommendation. It is not designed to replace national laws, regulations, international labour standards or other accepted standards, but should be used in the development of legislation, regulations, collective bargaining agreements, work regulations, policies and other practical measures.
b. EU
The EU made attempts to legislate in the area of employment data protection. Based on comparative work67, the European Commission initiated the consultation process under the Treaty’s social policy title with the social partners on this subject. The initiative, however, ultimately did not succeed.68
Under the (former) 1995 European Data Protection Directive, the European ‘Data Protection Working Party’69 adopted some guidance on data protection in the employment context. The Working Party adopted Opinion 8/2001 of 13 September 2001 on the processing of personal data in the employment context.70 Another instrument is the EU Working Document of 29 May 2002 on workplace communications.71 On 8 June 2017, the Working Party issued Opinion 2/2017 on data processing at work (WP Opinion 2/2017).72 Under the GDPR and its new governance model, the Working Party was replaced by the ‘European Data Protection Board’. The opinions of the Working Party nevertheless remain relevant and valid.
c. Council of Europe
The desirability of adapting these data protection principles to the particular requirements of the employment sector led to the adoption of Recommendation No. R(89)2 on the Protection of Personal Data Used for Employment Purposes. This Recommendation was adopted by the Committee of Ministers on 18 January 1989. On 1 April 2015, the Committee of Ministers adopted a new Recommendation on the processing of personal data in the employment context (CM/Rec(2015)5),73 motivated by changes in the employment context and new technologies.74
Data protection principles
3.1. Introduction
The worldwide growth of data protection regulation is unstoppable and a logical follow-up to technological evolutions and societal needs, and it is mainly regional data protection instruments that attempt to give their respective responses. It has been indicated above, however, that with the creation of different regional instruments, mutual inspiration and benchmarking has taken place. This implies that, while the different instruments have their own language and choices, it is important to recognise and appreciate their global dimension and the common elements of data protection standards.
Only a few studies have been conducted that compare and synthesise major data protection standards on a global scale. A leading study came in 2020 from the
- Fairness: personal data should be processed fairly (with links to non-discrimination, transparency, absence of fraud).
- Legitimacy (also known as lawfulness): personal data should be processed for legitimate purposes, or should be processed lawfully.
- Purpose specification: personal data should be processed only for specified, defined, explicit and legitimate purposes.
- Proportionality: personal data should be processed taking into account general requirements of proportionality, data minimisation requirements, requirements of non-excessive processing, or requirements of relevance to purpose.
- Data quality: personal data should be accurate, complete and up to date.
- Openness/transparency: the inclusion of some degree of openness or transparency can be found in all frameworks. Degrees range from general requirements to have transparent policies, and to ensure information about personal data processing is made available, to specific lists of information that must be provided directly to data subjects.
- Security: there should be appropriate (or sufficient) measures to secure personal data (processing).
- Data retention: personal data should not be kept or stored for longer than necessary for the purposes for which they are processed.
- Accountability (a slightly less generally shared principle, with six out of ten frameworks): data controllers (and where applicable, processors) are accountable for the personal data they process.
- Access: data subjects have the right of access to their personal data and to have these data rectified and/or deleted or erased, with (for some instruments) additional guarantees of objecting to or contesting the data processing.
In this working paper, these general principles will be further discussed, but with some re-arrangement for reasons of logic and discussion. Some of the general data protection principles are strongly intertwined or form part of a wider cluster. Their relevance may also vary in light of arising issues in the employment relationship. Hereafter, the analysis will focus on all aforementioned principles, under the following headings:
-
Legitimacy
-
Proportionality
-
Purpose limitation
-
Transparency
-
Data quality
-
Access
-
Accountability and governance
-
Collective rights
Before doing so, however, it is proper to give a short reflection on the definitions and scope of data protection standards.
3.2. Definitions and scope
The ILO Code of Practice defines the key notions as follows:
-
3.1. The term “personal data” means any information related to an identified or identifiable worker.
-
3.2. The term “processing” includes the collection, storage, combination, communication or any other use of personal data.
a. Personal data
The definitions used make clear that the material scope of personal data protection is very broad, regardless of the technology used. Personal data can be considered to relate to computer files, data involving a person’s name, image, address, professional status, family status, health information, education, career, income, behaviour, opinions, etc., displayed through paper-based or electronically produced texts, images, and so on.
It does not matter whether personal data are sensitive or not, nor whether they are ‘private’ or ‘public’. The CJEU has made clear, for example, that also data relating to activities of a professional nature qualify as personal data.
b. Personal data processing
The different global instruments define ‘processing’ with some differences, but the concept of ‘processing’ of personal data is generally understood in a very broad sense. It may refer (cf., for example, article 4, 2 GDPR) to any operation or set of operations performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction. This broad interpretation is significant for the employment context. Also the
However, the scope of application often remains limited to automated or partly automated processing activities, or to processing activities which are purely manual but which form part of a filing system or are intended to form part of a filing system. Manual processing of personal data outside the scope of a filing system is then not covered. This is, for example, the regime of the GDPR, where, following article 4, 6 GDPR, a filing system is “any structured set of personal data which are accessible according to specific criteria, whether centralised, decentralised or dispersed on a functional or geographical basis”. Although this is a quite abstract definition and open to discussion, it may have practical results. For example, when a recruiter makes notes during a selection interview with a job applicant, these personal notes would not be covered by the GDPR when the notes are taken on paper and if the notes are not (meant to be) kept in a structured way. Were the notes taken with a stylus pen on a tablet, or were the notes scanned afterwards, the situation would be covered by the GDPR.
Other instruments make similar limitations to the scope of personal data protection. For example, in section 2 of
c. Legal persons
A matter of discussion is whether the right to data protection can also be enjoyed by legal persons.
However, it could be questioned whether this limitation would follow from a fundamental rights logic. It is evident that also legal persons, including an organisation or a business, may enjoy fundamental rights, such as, for example, the right to collective bargaining or the freedom to conduct a business. This means that the right to data protection could also cover protection, for example, with regard to the processing of confidential information of a legal person. It would also imply that, in principle, trade unions may enjoy the right to personal data protection.
The Council of Europe’s
3.3. Legitimacy
Personal data must be processed on a legitimate basis. In other words, personal data processing has to be justified on the basis of a legitimate ground, reason or purpose. A legitimate basis requires, above all, that the processing is lawful.
The ILO Code of Practice provides:
-
5.1. Personal data should be processed lawfully and fairly, and only for reasons directly relevant to the employment of the worker.
a. Principle
This principle of lawfulness or legitimacy is central in data protection law and has been further specified in different data protection instruments around the globe.
Some will refer to this as a data collection limitation principle. As the OECD Guidelines provide:
-
7. There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
There is a similar reference in the
The principle of lawfulness is explicitly provided in some of the data protection standards, for example in the
However, while the contract may be a strong and legitimate basis for data processing, the instruments, such as the GDPR (article 6), provide for other legitimate grounds, allowing processing where necessary:
-
for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
-
for compliance with a legal obligation to which the controller is subject;
-
in order to protect the vital interests of the data subject or of another natural person;
-
for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
-
for the purposes of the legitimate interests pursued by the controller or by a third party.
Similar references can be found in the
-
for performance of a contract to which the data subject is party or for the application of precontractual measures at their request;
-
to comply with a legal obligation;
-
for implementation of a public interest mission or relevant to the exercise of public authority vested in the controller;
-
for safeguarding the interests or rights and fundamental liberties of the data subject.
The instruments show that personal data processing is not only legitimate when employers are required or obliged to process these data, based on legal obligations, but also in cases where employers have a contractual or other “legitimate interest”. Justifications may come from the employer’s legitimate interests in areas such as: recruitment and selection; the exercise of rights under the employment contract, such as the right to exercise authority and control, or to direct the enterprise and plan the work; payroll, administration and human resources services; health and safety obligations and actions; diversity policies, and so on.
In the
-
when necessary to provide a service or product requested by the individual;
-
or by the authority of law and other legal instruments, proclamations and pronouncements of legal effect.
This seems to be somewhat stricter than the GDPR or ECOWAS instrument. In the commentary to the
A public interest reason could also be envisaged by an employer. It means that circumstances external to the business, such as for example a pandemic, might determine the necessity to collect some data from employees.
The GDPR refers, in its article 88 on processing in the context of employment, to:
-
The purposes of the recruitment, the performance of the contract of employment, including discharge of obligations laid down by law or by collective agreements, management, planning and organisation of work, equality and diversity in the workplace, health and safety at work, protection of employer's or customer's property and for the purposes of the exercise and enjoyment, on an individual or collective basis, of rights and benefits related to employment, and for the purpose of the termination of the employment relationship.
Some countries have further specified the issue of legitimate or lawful processing for employment purposes.
The Finnish Act on the Protection of Privacy in Working Life, for example, provides:
-
The employer is only allowed to process personal data directly necessary for the employee’s employment relationship, which is connected with managing the rights and obligations of the parties to the employment relationship or with the benefits provided by the employer for the employee or which arises from the special nature of the work concerned.85
This is also the case in Germany, where the processing of workers’ personal data is allowed when:
-
data processing is necessary for the decision on the establishment of an employment relationship or after the establishment of the employment relationship for its execution or termination or for the exercise or fulfilment of the rights and obligations of the representation of the interests of the employees resulting from a law or a collective bargaining agreement, a works agreement, or a service agreement;
-
data processing is necessary to detect criminal offences (but only if there is a documented reason to do this);
-
data processing is necessary to comply with a works council agreement which conforms with art. 88 GDPR;
-
data processing can be allowed, in some cases, when it is based on the worker’s consent.86
Another leading example is France, where the data protection authority (CNIL) has listed legitimate purposes for the processing of personnel data, such as:
-
Recruitment;
-
Administrative management of personnel;
-
Management of remuneration and completion of related administrative formalities;
-
Provision of professional tools to staff;
-
Organisation of work;
-
Career and mobility monitoring;
-
Training;
-
Keeping of compulsory registers, relations with employee representative bodies;
-
Internal communication;
-
Management of social assistance;
-
Performing audits, managing litigation and pre-litigation.
These examples show that the requirement of legitimacy or lawfulness, in light of purposes related to the employment context, still offers quite open justifications for the processing of personal data, including not only the necessity for the employment contract but also “legitimate interests pursued by the controller”. The
b. Non-discrimination
In light of legitimacy or lawfulness, the processing of personal data should also be brought into connection with the principle of non-discrimination. It is clear that discriminatory motives, purposes or effects have to be rejected in light of the legitimacy principle.
The ILO Code of Practice provides:
-
5.10. The processing of personal data should not have the effect of unlawfully discriminating in employment or occupation.
The language of the ILO Code is particularly interesting, as it does not refer to discriminatory purposes as such, but to discriminatory ‘effects’. This is particularly relevant in light of algorithmic decision-making instruments, such as in artificial intelligence, people analytics or the gig economy, where software may be used with biased and even unintended discriminatory effects.
The issue of discrimination is also of key importance in the use of data analytics and algorithm-based data processing.89
c. Consent
A specific ground justifying personal data processing is consent. It is widely referred to in data protection instruments. However, the freedom of consent is an important issue in data protection law. Consent is obviously only a valid ground if it is, or can be, given freely. In light of this, a major question in the employment context is whether consent can be a legitimate ground for personal data processing. The
An instrument that has attached much importance to the freedom of consent, with subsequent and additional guidance, is the GDPR. Article 4, 11 GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”. The GDPR provides furthermore that, when assessing whether consent is given freely, utmost account shall be taken of whether, inter alia, the performance of a contract is made conditional on consent to processing of personal data that is not necessary for the performance of that contract (article 7, 4 GDPR).
The
Examples can also be found in
The data protection authority (“CNIL”) of
The
The limited possibilities to obtain an employee’s consent for personal data processing stand in contrast with the wide range of legitimate grounds for employers to process personal data. The issue of consent may become more relevant in cases where data protection laws do not explicitly refer to general employment-related legitimate purposes of data processing, which is for example the case in
3.4. Proportionality
a. Principle
Conditions of relevancy, adequacy, necessity, or proportionality of data processing are all related to the additional requirements and limitations of data processing, beyond lawfulness or legitimacy.
Proportionality would seem the most general and over-arching of those principles. It helps to understand the term ‘necessary’ and to distinguish it from ‘legitimacy’, since the legitimacy principle also refers to necessity, such as in “necessary for the performance of a contract” (cf. art. 6, 1, b GDPR).
This proportionality principle is explicitly mentioned as a principle in
-
5.2. Personal data collected by employers for employment purposes should be relevant and not excessive, bearing in mind the type of the employment as well as the changing information needs of the employer.
The principle of proportionality is inherent to human rights protection mechanisms. While different approaches to proportionality could be envisaged, three main elements are broadly accepted as constituting a test of proportionality in the context of data protection law:110
-
Suitability (or adequacy): is the data processing suitable or relevant to realising the legitimate goals?
-
Necessity: is the data processing required for realising the legitimate goals? Such a necessity requirement may be connected to an alternative means test: are there alternative means to realise the legitimate goals?
-
Non-excessiveness: does the measure go further than is necessary to realise the legitimate goals?
Proportionality also plays a role in the area of electronic monitoring. The European Working Party has explained the relevance of this in its
b. Data minimisation
A new notion connected with proportionality in data protection law is data minimisation.
The Council of Europe’s 2015 Recommendation provides:
-
4.1. Employers should minimise the processing of personal data to only the data necessary to the aim pursued in the individual cases concerned.
The principle is also recognised in article 5, 1, c GDPR, which requires that personal data be:
-
“adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’)”
The application of this principle is relevant for different employment context issues. It may, for example, be particularly relevant in the context of electronic monitoring. Data protection instruments usually do not offer a straightforward answer to the question of whether and, if so, to what extent, employers may monitor workers in the workplace. But excessive monitoring is usually ruled out.
In light of this, the ILO Code of Practice provides that “continuous monitoring should be permitted only if required for health and safety or the protection of property” (section 6.14, 3).
c. Necessary data
Both the open grounds of lawfulness and legitimate purposes of data processing (under the ‘legitimacy principle’) and the proportionality or necessity test in light of these purposes do not alter the fact that a wide range of personal data can be processed in the context of employment.
The European Working Party, in its
The need to collect and further process personal data in the employment relationship, and to establish all kinds of employee records, is recognised in different systems. In
Some jurisdictions apply a broader reasonableness test. For example in
d. Essence of a right
Related to proportionality is the idea that limitations cannot go so far as to compromise the essence of the right to data protection. One could also translate this into the idea that workplace privacy cannot be reduced to ‘zero’, which should not be seen as incongruent with reasonable privacy expectations or with notice or consent requirements in case of monitoring.119
The idea that an individual cannot lose the enjoyment of the essence of a right can also be connected to the limitations on individual (or collective) consent as a ground or technique to process personal data. The ILO’s reference is interesting in this regard. The ILO Code of Practice provides:
-
5.13. Workers may not waive their privacy rights.
In the commentary to section 5.13 of the ILO Code it is pointed out that “in view of the dependence of workers on their employment and the fundamental nature of privacy rights, the code states that workers may not waive their privacy rights. It is, nevertheless, recognized that privacy rights are not absolute and are balanced with competing public interests according to national law.”
While the waiver of rights has to be addressed in a nuanced way,120 restrictions on applying it have to be seen in light of the proportionality principle, as well as against the background of the issue of free consent in employment relationships. Some legal systems have opted expressly to make waivers in the employment context legally impossible. For example in
3.5. Purpose limitation
a. Principle
The purpose limitation principle is a widely recognised principle of data protection. This general requirement comes back in most data protection instruments. The different international data protection standards share the rule that personal data should be processed for specific (or specified), explicit, legitimate (or lawful) purposes. Furthermore, personal data should not be further processed in a manner incompatible with those purposes.
Some data protection standards give more detailed obligations, such as the OECD Guidelines:
-
9. The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
The ILO Code of Practice provides:
-
5.2. Personal data should, in principle, be used only for the purposes for which they were originally collected.
-
5.3. If personal data are to be processed for purposes other than those for which they were collected, the employer should ensure that they are not used in a manner incompatible with the original purpose, and should take the necessary measures to avoid any misinterpretations caused by a change of context.
This important data protection principle is related to the verifiability and transparency of personal data processing. It also contributes to the fairness of data processing. The principle not only requires that the original purpose of data collection be legitimate and clear, but also that subsequent data processing activities remain compatible with the original purposes. It means that ‘re-purposing’ is, in principle, possible, but it should be compatible with the original purpose of collection.
Part of respecting the limits on secondary (re-purposed) use of workers’ personal data can be ensuring transparency towards, and informing, the worker concerned. For example, the
-
6.3. Under exceptional circumstances, where data are to be processed for employment purposes other than the purpose for which they were originally collected, employers should take adequate measures to avoid misuse of the data for this different purpose and inform the employee. Where important decisions affecting the employee are to be taken, based on the processing of that data, the employee should be informed accordingly.
b. Specifications
The purpose limitation is often illustrated in cases of monitoring or evaluation of workers. In this context, the ILO Code of Practice provides:
-
5.4. Personal data collected in connection with technical or organizational measures to ensure the security and proper operation of automated information systems should not be used to control the behaviour of workers.
The European Working Party, in
3.6. Transparency
a. Principle
According to article 8 CFREU, everyone has the right to fair personal data processing. Fairness can be seen broadly and interpreted in many ways. It is certainly connected with the question of legitimacy and with the purpose-limitation principle.
Fairness also implies transparency. This is made clear in the Council of Europe’s 2015 Recommendation which brings transparency of processing in direct relation with the need to guarantee a fair processing.123
The OECD Guidelines provide:
-
12. There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller.
-
13. Individuals should have the right: a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to them;
The ILO Code of Practice provides:
-
11.1. Workers should have the right to be regularly notified of the personal data held about them and the processing of that personal data.
-
6.2. If it is necessary to collect personal data from third parties, the worker should be informed in advance, and give explicit consent. The employer should indicate the purposes of the processing, the sources and means the employer intends to use, as well as the type of data to be gathered, and the consequences, if any, of refusing consent.
The commentary to this article in the ILO Code mentions that:
-
Workers will want to know what happens to their data only if they have at least a rough idea of the kind of data collected, the purposes for doing so and the potential users.
The CoE
b. Specifications
In data protection law, this principle is also brought in connection with monitoring and surveillance.
The
-
The principle of prior notice is found in almost all studied legal systems although the manner in which notification is required may still differ. Some countries will refer to the need for having clear policies at company level, others explicitly require prior and/or
The European Court of Human Rights (ECtHR), in its Barbulescu judgment, requires ‘prior notice’ in case of electronic monitoring, meaning that “the warning from the employer must be given before the monitoring activities are initiated, especially where they also entail accessing the contents of employees’ communications”.124 In the comparative outlook in this judgment, it is reported that “with regard to monitoring powers, thirty-four Council of Europe member States require employers to give employees prior notice of monitoring”.125 In Hungary, for example, the Labour Code (2019) provides that employers are allowed to monitor the behaviour of workers with the use of technical means, but they must notify the workers in advance and
In the GDPR, transparency requirements are provided for in articles 13 to 15, granting data subjects the right to be informed about whether personal data concerning them are being collected, the identity of the controller, the purposes, whether data are being transferred to recipients, and so on. According to the
3.7. Data quality
Data quality is a concept that may be conceived in different ways. According to the
Also the ILO Code of Practice contains relevant provisions:
-
8.4. Employers should verify periodically that the personal data stored is accurate, up to date and complete.
-
8.5. Personal data should be stored only for so long as it is justified by the specific purposes for which they have been collected unless:
-
(a) a worker wishes to be on a list of potential job candidates for a specific period;
-
(b) the personal data are required to be kept by national legislation; or
-
(c) the personal data are required by an employer or a worker for any legal proceedings to prove any matter to do with an existing or former employment relationship.
a. Accuracy
It is obvious that personal data in an employment or business context should be accurate. This is strongly intertwined with the rights, obligations and liabilities under employment laws, including the identification of workers, the registration of performed working hours, the calculation of pay and determination of other benefits, social security and tax obligations, keeping information on training, and so on.
Three additional remarks have to be made in light of accuracy, given the specificity of the employment context.
The first concerns the accuracy of evaluation data. The European Data Protection Supervisor has indicated that “
A second remark concerns the increasing importance of data quality in light of the use of artificial intelligence and algorithms. For example, it is clear that the accuracy of personal data, and their further processing, will play an important role in avoiding or regulating undesired outcomes, oversimplification, or discriminatory effects of algorithmic programmes.
A third aspect, related to accuracy, concerns the so-called ‘right to lie’. The commentary to the ILO Code of Practice notes:
-
(6.8) Although workers are expected to provide truthful information, the code shares the view of many national courts that, especially in connection with hiring procedures, workers are justified in refusing to answer questions that are incompatible with the code. In such cases, the employer bears the responsibility for incomplete or inaccurate responses and, consequently, is not entitled to impose sanctions. Moreover, the employer should not profit from a misunderstanding on the part of the worker as to what is being asked if the worker provides additional or irrelevant information (6.9).
There are different signs that this is still a controversial point. Providing untruthful information, or providing no information, can be a problematic issue, both under contract law and under employment law, which often depart from mutual information duties and good faith obligations. Nevertheless, there are nuances. Scholarship has paid some attention to it. For example, a “defensive” (permissible) right has been described and grounded, as a way to avoid an (impermissible) serious wrongdoing.130 Similarly, the use of the right to lie by a job applicant has been legitimised as ‘a shield’ in response to discriminatory questions of an employer in the recruitment phase.131 According to other research, in the
b. Storage limitation
The principle that personal data should not be kept or stored longer than necessary is connected to the data quality principle, but it can also be clustered within the proportionality principle. The ‘storage limitation’ principle covers the idea that personal data need to be – and have to remain – relevant and cannot be processed for “longer than is necessary”, given the purposes for which the personal data are processed (cf. article 5(1)(e) GDPR).
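Purely as an illustration, the following minimal sketch (in Python, with hypothetical purposes, field names and retention periods) shows how a storage limitation rule could be operationalised in an HR system: each record carries the purpose for which it was collected and is removed once the corresponding retention period has lapsed, unless an exception applies, such as data needed for pending legal proceedings.

from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical retention periods per processing purpose (illustrative values only;
# actual periods must follow applicable law, collective agreements and policy).
RETENTION = {
    "job_application": timedelta(days=180),
    "payroll": timedelta(days=365 * 7),
    "access_logs": timedelta(days=90),
}

@dataclass
class PersonnelRecord:
    worker_id: str
    purpose: str              # purpose for which the data were collected
    collected_on: date
    legal_hold: bool = False  # e.g. data needed for pending legal proceedings

def is_expired(record: PersonnelRecord, today: date) -> bool:
    """A record is expired when its retention period has lapsed and no exception applies."""
    limit = RETENTION.get(record.purpose)
    if limit is None or record.legal_hold:
        return False
    return today > record.collected_on + limit

def purge(records: list[PersonnelRecord], today: date) -> list[PersonnelRecord]:
    """Return only the records that may still be stored."""
    return [r for r in records if not is_expired(r, today)]

if __name__ == "__main__":
    records = [
        PersonnelRecord("W001", "job_application", date(2023, 1, 10)),
        PersonnelRecord("W002", "payroll", date(2020, 5, 1)),
        PersonnelRecord("W003", "job_application", date(2023, 1, 10), legal_hold=True),
    ]
    kept = purge(records, today=date(2024, 1, 1))
    print([r.worker_id for r in kept])  # the rejected application is deleted; W002 and W003 remain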
In addition to the ILO references, mentioned above, the
-
13.2. Personal data submitted in support of a job application should normally be deleted as soon as it becomes clear that an offer of employment will not be made or is not accepted by the job applicant. Where such data are stored with a view to a further job opportunity, the data subject should be informed accordingly and the data should be deleted if he or she so requests.
-
13.3. Where it is essential to store data submitted for a job application for the purpose of bringing or defending legal actions or any other legitimate purpose, the data should be stored only for the period necessary for the fulfilment of such purpose.
-
13.4. Personal data processed for the purpose of an internal investigation carried out by employers which has not led to the adoption of negative measures in relation to any employee should be deleted after a reasonable period, without prejudice to the employee’s right of access until such deletion takes place.
Storage limitation has, according to the European Working Party
3.8. Access
All major data protection instruments provide that everyone has the right of access to personal data (concerning him or her) and the right to have the data rectified. Under these principles, a set of additional data protection rights for data subjects is provided in nearly all frameworks:
-
Access: the right to obtain access to personal data is universally acknowledged;
-
Rectification: often a follow-up right to the right of access, recognized in all data protection instruments;
-
Deletion/erasure: strongly connected to access and rectification, and also a universally accepted right;
-
Data portability: much less generally recognized; it appears in the Ibero-American Standards and the GDPR.
a. Access
The right to have access to personal data will inevitably play a role in the employment context, given the volume of data and records kept for reasons of personnel administration, HR and other purposes.
According to the
-
11.2. Workers should have access to all their personal data, irrespective of whether the personal data are processed by automated systems or are kept in a particular manual file regarding the individual worker or in any other file which includes workers’ personal data.
The right to have access to personal data processing may, additionally, involve a right to receive a copy of the personal data undergoing processing.136
According to the
-
11.3. The workers’ right to know about the processing of their personal data should include the right to examine and obtain a copy of any records to the extent that the data contained in the record includes that worker’s personal data.
The
-
11.4. Workers should have the right of access to their personal data during normal working hours. If access cannot be arranged during normal working hours, other arrangements should be made that take into account the interests of the worker and the employer.
-
11.5. Workers should be entitled to designate a workers’ representative or a coworker of their choice to assist them in the exercise of their right of access.
-
11.6. Workers should have the right to have access to medical data concerning them through a medical professional of their choice.
-
11.7. Employers should not charge workers for granting access to or copying their own records.
Most data protection instruments would allow for some limits to the right of access, or would accept conditions or a rule of reasonableness in applying the right. In some legal systems, this may be further specified for the employment context. From the data protection system in
b. Rectification
Data subjects generally have the right to obtain the rectification of inaccurate personal data concerning them. This also has a logical relevance for the employment context. However, some discussions may arise, in particular with regard to whether data are complete or whether evaluation data can be rectified.
The
-
11.9. Workers should have the right to demand that incorrect or incomplete personal data, and personal data processed inconsistently with the provisions of this code, be deleted or rectified.
-
11.10. In case of a deletion or rectification of personal data, employers should inform all parties who have been previously provided with the inaccurate or incomplete personal data of the corrections made, unless the worker agrees that this is not necessary.
-
11.11. If the employer refuses to correct the personal data, the worker should be entitled to place a statement on or with the record setting out the reasons for that worker’s disagreement. Any subsequent use of the personal data should include the information that the personal data are disputed, and the worker’s statement.
The ILO’s principle that the worker should be entitled to put a statement in the relevant record, with an indication of the worker’s disagreement with the personal data, finds support in article 16 of the GDPR, which guarantees “the right to have incomplete personal data completed, including by means of providing a supplementary statement”.
c. Evaluation data
A point of discussion relates to evaluation data. Worker evaluations made by employers or supervisors, as well as test results, are often kept in personnel files.
The international data protection standards do not seem to give a direct answer on how to treat evaluation data, while different
The
-
11.12. In the case of judgmental personal data, if deletion or rectification is not possible, workers should have the right to supplement the stored personal data by a statement expressing their own view. The statement should be included in all communications of the personal data, unless the worker agrees that this is not necessary.
The
-
11.3. The right of access should also be guaranteed in respect of evaluation data, including where such data relate to assessments of the performance, productivity or capability of the employee when the assessment process has been completed at the latest, without prejudice to the right of defense of employers or third parties involved. Although such data cannot be corrected by the employee, purely subjective assessments should be open to challenge in accordance with domestic law.
Under the data protection law of
In the case of
d. Erasure
The right to erasure implies the right to have data removed. This could mean different things in an employment context. It could imply that some data have to be partly removed, so that a different and more accurate picture emerges from the available information.
Closely connected with the right to access and rectification is the ‘right to be forgotten’. This right was recognised, though in a specific context, by the European Court of Justice (CJEU) in the widely known
This right to erasure and to be forgotten could be explored as a right for employees to demand that information about their past employment not be brought to wide (public) attention after a certain period of time, as was illustrated by the ECtHR in the case of
e. Data portability
Data portability refers to the right to receive personal data from a data controller and/or to transfer those data to another controller. It has become relevant with the rise of information networks and the network economy, where networks of enterprises, people and services are strongly interconnected.146
Notwithstanding its growing relevance, the principle of data portability has not yet been expressed in specific terms in many international data protection standards. Only the
-
Principle 42. 4. Every person shall have the right to exercise autonomy in relation to their personal information by law and to obtain and reuse their personal information, across multiple services, by moving, copying or transferring it.
The GDPR (article 20(1)) states that:
-
“The data subject shall have the right to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided”.
Data portability may, for example, be relevant for evaluation data, which could be of use for new employment positions that workers take up with other employers. It may also be particularly relevant in the context of the gig economy and platform work, where workers have an interest in moving their ratings and rankings from one platform to another.147
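Purely as an illustrative sketch (the field names are hypothetical and no particular system is implied), the Python fragment below shows what providing data ‘in a structured, commonly used and machine-readable format’ might look like in practice: the data the worker has provided, including platform ratings, are serialised to JSON so that they can be handed to the worker or transmitted to another controller.

import json
from dataclasses import dataclass, asdict

@dataclass
class WorkerProvidedData:
    # Hypothetical fields for data the worker has provided to the controller.
    name: str
    skills: list[str]
    ratings: list[dict]  # e.g. platform ratings the worker wants to take along

def export_portable_data(data: WorkerProvidedData) -> str:
    """Serialise worker-provided data to JSON, a structured,
    commonly used and machine-readable format."""
    return json.dumps(asdict(data), indent=2, ensure_ascii=False)

if __name__ == "__main__":
    record = WorkerProvidedData(
        name="A. Worker",
        skills=["delivery", "customer service"],
        ratings=[{"platform": "ExamplePlatform", "score": 4.8, "jobs": 312}],
    )
    print(export_portable_data(record))  # can be handed over or sent to another controller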
3.9. Accountability and governance
a. Principle
Not all international data protection standards refer to the accountability principle, although a majority do so in one way or another. The accountability principle can be widened into a data protection governance principle, grouping a set of connected data protection principles.
The
The GDPR provides that data controllers are responsible for, and must be able to demonstrate, compliance with the data protection principles (cf. art. 5.2 GDPR). The accountability principle is also brought forward in the
-
14. A data controller should be accountable for complying with measures which give effect to the principles stated above.
However, accountability should not be understood in too narrow a sense. An interesting concept added to accountability is that of “privacy management”, explicitly referred to in the
-
15. A data controller should: a) have in place a privacy management programme.
The idea is that data controllers must give effect to data protection principles and provide for appropriate safeguards based on privacy risk assessment, ongoing monitoring and periodic assessment. A broader notion of data protection governance can be construed. Data controllers and other involved parties thus bear a wider responsibility which requires them to be accountable, to provide for sufficient security measures related to data processing activities, as well as to manage and regularly assess data processing activities.
b. Security
The
-
11. Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorised access, destruction, use, modification or disclosure of data.
c. Specifications
The following references in the
-
5.7. Employers should regularly assess their data processing practices: (a) to reduce as far as possible the kind and amount of personal data collected; and (b) to improve ways of protecting the privacy of workers.
-
5.9. Persons who process personal data should be regularly trained to ensure an understanding of the data collection process and their role in the application of the principles in this code.
-
7.1. Employers should ensure that personal data are protected by such security safeguards as are reasonable in the circumstances to guard against loss and unauthorized access, use, modification or disclosure.
Also the
-
4.2. Employers should develop appropriate measures, to ensure that they respect in practice the principles and obligations relating to data processing for employment purposes. At the request of the supervisory authority, employers should be able to demonstrate their compliance with such principles and obligations. These measures should be adapted to the volume and nature of the data processed, the type of activities being undertaken, and should also take into account possible implications for fundamental rights and freedoms of employees.
-
20.1. Employers or, where applicable, processors, should carry out a risk analysis of the potential impact of any intended data-processing on the employees’ rights and fundamental freedoms and design data processing operations in such a way as to prevent or at least minimise the risk of interference with those rights and fundamental freedoms.
-
20.2. Unless domestic law or practice provides other appropriate safeguards, the agreement of employees’ representatives should be sought before the introduction or adaptation of ICTs where the analysis reveals risks of interference with employees’ rights and fundamental freedoms.
Part of the broader accountability and management principle is that not only the employer, but also other parties, whether part of the larger organisational setting or third parties, must respect the data protection principles. As the
Accountability is also a matter of the internal operations of a data controller, such as an employer. In an HR context, the treatment of personal data should involve the accountability of different involved actors. The principle is thus directed to a wide range of addressees.
The
-
12.3. The personnel administration, as well as any other person engaged in the processing of the data, should be kept informed of such measures, of the need to respect them and of the need to maintain confidentiality about such measures as well.
The
-
5.12. All persons, including employers, workers’ representatives, employment agencies and workers, who have access to personal data, should be bound to a rule of confidentiality consistent with the performance of their duties and the principles in this code.
-
13.1. If the employer uses employment agencies to recruit workers, the employer should request the employment agency to process personal data consistently with the provisions of this code.
d. Impact assessment
The data protection impact assessment (DPIA) has been promoted as an important component of some personal data protection instruments. It is often related to data processing with a ‘high risk’ impact. An example of ‘high risk’, according to the European Working Party’s
For example, the
-
Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address a set of similar processing operations that present similar high risks.
Such a ‘DPIA’ may be relevant in the context of the employment relationship, for example where employers intend extensive monitoring of workers.150
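The sketch below is only a simplified illustration of the kind of screening that may precede a full DPIA; the indicators, threshold and names are hypothetical and do not reproduce any legal test. It merely captures the idea that several high-risk features of a planned processing operation, such as systematic monitoring of workers with new technology, should together trigger a full impact assessment before the processing starts.

from dataclasses import dataclass

@dataclass
class PlannedProcessing:
    description: str
    uses_new_technology: bool
    systematic_monitoring: bool
    large_scale: bool
    involves_vulnerable_subjects: bool  # workers are often considered such subjects

def dpia_required(p: PlannedProcessing, threshold: int = 2) -> bool:
    """Rough screening: if several high-risk indicators are present, a full data
    protection impact assessment should be carried out before processing starts.
    The indicators and threshold are illustrative, not a legal test."""
    indicators = [
        p.uses_new_technology,
        p.systematic_monitoring,
        p.large_scale,
        p.involves_vulnerable_subjects,
    ]
    return sum(indicators) >= threshold

if __name__ == "__main__":
    plan = PlannedProcessing(
        description="Continuous location tracking of delivery staff",
        uses_new_technology=True,
        systematic_monitoring=True,
        large_scale=False,
        involves_vulnerable_subjects=True,
    )
    print("Full DPIA required:", dpia_required(plan))  # True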
An assessment of data protection practices is also mentioned in the
-
5.7. Employers should regularly assess their data processing practices: (a) to reduce as far as possible the kind and amount of personal data collected; and (b) to improve ways of protecting the privacy of workers.
e. Privacy by design and default
Privacy by design and by default constitute a specific approach to accountability applied in the GDPR. Article 25.2 of the GDPR provides:
-
The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.
This means that data controllers have to be proactive and make a continuous assessment of the privacy impact of technology.151 Recital 78 of the GDPR explains that appropriate technical and organisational measures must be taken to ensure that the requirements of the GDPR are met: “the controller should adopt internal policies and implement measures which meet in particular the principles of data protection by design and data protection by default. Such measures could consist, inter alia, of minimising the processing of personal data, pseudonymising personal data as soon as possible, transparency with regard to the functions and processing of personal data, enabling the data subject to monitor the data processing, enabling the controller to create and improve security features. When developing, designing, selecting and using applications, services and products that are based on the processing of personal data or process personal data to fulfil their task, producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations.”152
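A minimal sketch, assuming hypothetical field names and a deliberately simplistic keyed hash, can illustrate two of the measures mentioned in Recital 78, namely data minimisation by default and early pseudonymisation: only the fields needed for the stated purpose are processed, and the worker’s identifier is replaced by a pseudonym whose key (here a salt) must be kept separately and securely by the controller.

import hashlib
import secrets

# Fields strictly necessary for a hypothetical processing purpose (illustrative).
NECESSARY_FIELDS = {"working_hours", "department"}

def pseudonymise(worker_id: str, salt: bytes) -> str:
    """Replace the direct identifier with a keyed hash; the salt is kept separately
    so that re-identification remains controlled."""
    return hashlib.sha256(salt + worker_id.encode()).hexdigest()[:16]

def minimise(record: dict, salt: bytes) -> dict:
    """Data protection by default: process only the necessary fields,
    and a pseudonym instead of the worker's identifier."""
    out = {k: v for k, v in record.items() if k in NECESSARY_FIELDS}
    out["pseudonym"] = pseudonymise(record["worker_id"], salt)
    return out

if __name__ == "__main__":
    salt = secrets.token_bytes(16)  # in practice managed and protected by the controller
    raw = {"worker_id": "W007", "name": "A. Worker",
           "working_hours": 38, "department": "Logistics", "health_note": "..."}
    print(minimise(raw, salt))  # name and health_note are not processed by default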
These principles are clearly still expressed in general terms and will have to be translated to the data controller’s own context. There is not yet much guidance on the implementation of this principle in the case law of the European courts. However, the Google Spain case of the European Court of Justice (CJEU) has made clear that the principle will likely play a role in the application of algorithmic decision-making processes.153 In rejecting the idea that search engines or algorithms are value-neutral, the Court confirmed that such systems should be designed in a privacy-friendly way.154
An example in the employment context is, according to the European Working Party
3.10. Collective rights
With regard to collective rights, the
-
12. Collective rights
-
12.1. All negotiations concerning the processing of workers’ personal data should be guided and bound by the principles in this code that protect the individual worker’s right to know and decide which personal data concerning that worker should be used, under which conditions, and for which purposes.
-
12.2. The workers’ representatives, where they exist, and in conformity with national law and practice, should be informed and consulted:
-
(a) concerning the introduction or modification of automated systems that process workers’ personal data;
-
(b) before the introduction of any electronic monitoring of workers’ behaviour in the workplace;
-
(c) about the purpose, contents and the manner of administering and interpreting any questionnaires and tests concerning the personal data of the workers.
In the employment context, a reference to collective rights may be relevant from different perspectives.
a. Regulation
As data protection laws are generally not designed for, although applicable to, the employment context, the creation of more specific and adapted rules or principles may be envisaged or desirable.
The
-
5.11. Employers, workers and their representatives should cooperate in protecting personal data and in developing policies on workers’ privacy consistent with the principles in this code.
The idea of adapting the principles to the employment context has been explicitly suggested by the European data protection legislator. Article 88 of the
-
1. Member States may, by law or by collective agreements, provide for more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees' personal data in the employment context, in particular for the purposes of the recruitment, the performance of the contract of employment, including discharge of obligations laid down by law or by collective agreements, management, planning and organisation of work, equality and diversity in the workplace, health and safety at work, protection of employer's or customer's property and for the purposes of the exercise and enjoyment, on an individual or collective basis, of rights and benefits related to employment, and for the purpose of the termination of the employment relationship.
-
2. Those rules shall include suitable and specific measures to safeguard the data subject's human dignity, legitimate interests and fundamental rights, with particular regard to the transparency of processing, the transfer of personal data within a group of undertakings, or a group of enterprises engaged in a joint economic activity and monitoring systems at the work place.
On 24 June 2020, the
It must be noted that workers’ representative bodies may also serve as a forum where the translation and application of data protection rights and principles can be modelled to the relevant context of the work environment.
Some
For example, in
In
b. Legitimisation
Another approach to involving collective rights concerns the sphere of legitimising the processing of personal data in the employment context.
This dimension can be connected with the issue of consent. The intermediation or the involvement of a collective institution, such as a workers’ representation body or a collective bargaining agreement, may lead to more robust guarantees. For example,
c. Governance
Another dimension is incorporating worker representative groups in data management and data protection impact assessment. It is referred to by section 12.2 of the ILO Code of Practice, mentioning that “
The European Working Party
The
-
21. (…) employers should (…) c. consult employees’ representatives in accordance with domestic law or practice, before any monitoring system can be introduced or in circumstances where such monitoring may change. Where the consultation procedure reveals a possibility of infringement of employees’ right to respect for privacy and human dignity, the agreement of employees’ representatives should be obtained;
Collective governance, with the involvement of workers’ representatives, is suggested in light of the rise of artificial intelligence. Given the impact of technology, including AI systems, on data protection in the work context, it has been recommended that “
d. Representation
Another dimension is the role of workers’ representatives in the exercise of rights of data subjects. In principle, data protection standards offer individual rights to data subjects. Within this context, workers, as data subjects, may be relying on support and assistance of their representatives.
The
-
11.8. Unless provisions of domestic law provide otherwise, an employee should be entitled to choose and designate a person to assist him or her in the exercise of his or her right of access, rectification and to object or to exercise these rights on his or her behalf.
In
The data protection standards could also be seen as conferring data protection rights on workers’ representatives. Instead of individual rights, data protection principles could be designed to foster access to certain data processing activities through collective channels. For example, a need for transparency arises in the field of algorithms used in the work environment (‘explaining the algorithm’). At the same time, not only may algorithms be complex, but some information may also be sensitive or confidential on the part of the employer. Giving algorithmic transparency to workers’ representatives, rather than to workers individually, may be a pathway to overcoming these issues.
A more collective view may be derived from the CoE
-
7.1. In accordance with domestic law and practice, or the terms of collective agreements, personal data may be communicated to the employee’s representatives, but only to the extent that such data are necessary to allow them to properly represent the employee’s interests or if such data are necessary for the fulfilment and supervision of obligations laid down in collective agreements.
-
7.2. In accordance with domestic law and practice, the use of information systems and technologies for the communication of data to employees’ representatives should be subject to specific agreements that set out, in advance, transparent rules prescribing their use and safeguards to protect confidential communications, in accordance with principle 10.
4. Artificial intelligence and data protection
4.1. Introduction
The rise of new technologies, based on artificial intelligence and robotics, brings new challenges for both the world of work and data protection law. Data protection standards address the issue of AI, and many data protection concepts and principles take new technological developments into account. However, the future outlook of technology brings new challenges to the key principles set out in the sections above.
Artificial intelligence raises new issues. AI systems may work on the premise that massive and combined data are required and used for different purposes, linked in ways that are not necessarily limited, and for purposes or results not yet known, while data protection relies on purpose limitation, data minimisation and transparency. AI systems are sometimes seen as ‘black boxes’,165 running on complex systems, connecting different types and sources of data, with results that often remain opaque. A difficulty is to understand and explain how intelligent machines or decision tools use combined data to infer certain decisions.166 This explainability problem brings to the fore the idea that data subjects need to be entitled to “meaningful explanations to understand a given decision, grounds to contest it, and advice on how the data subject can change his or her behaviour or situation to possibly receive a desired decision”.167 Digital platforms show the effects of such work-related algorithms.168
An important dimension is people analytics: “the use of analytical techniques such as data mining, predictive analytics and contextual analytics to enable managers to take better decisions related to their workforce”.169 Its core lies not only in the collection of data, but in the analysis of these data. Legal scholars have indicated that three main concerns arise in this context, even when departing from personal data which are
There are challenges for data protection standards. Questions may arise as to whether results generated by people analytics are accurate, adequate and reliable. There are also issues with regard to discriminatory effects. It may be that “management-by-algorithm and artificial intelligence” does not lead to more objective and bias-free HR practices.172 A combination of conditions, including the involvement of workers and data governance, may be assumed to be required before proper use can be made of it.173
Big data processing is another area where data protection standards are challenged. Processing may rely on non-personalised or non-individualised, rather aggregated, data. The question is how this relates to the common legal definitions used in the personal data concept and the scope of data protection law. At the same time, the body of knowledge created by big data may refer to groups of individuals, and the right to privacy of these groups, or of the individuals belonging to them, may need to be covered in other ways.174 Another issue relates to the data minimisation principle. Big data relies on wide-ranging or massive data processing and on the accumulation of data, both in terms of volume and over time. It may be questioned how this fits with the data quality and data minimisation principles. A third aspect concerns the use of big data. A great deal of big data is connected with smart devices or the ‘internet of things’, which may leave open what the future purposes of data uses are, raising questions in light of the legitimacy and purpose limitation principles.175
Wearables, smart watches, smart belts and smart gloves are part of the new world of work. These devices can bring additional value to work. They may contribute to a healthier and safer workplace. Some smart gloves allow sign language users to communicate with others without the assistance of an interpreter.176 However, they may also become a feature of total control. For example, socio-metric badges, ID cards with integrated sensors, not only track location but also measure different interpersonal variables, such as speech patterns and body movements, resulting in a total picture of human interaction at work.177 This obviously provides opportunities to assess collaboration, task organisation and productivity. Self-evidently, it also poses new privacy problems in light of monitoring and surveillance and the fair processing of personal data.
An additional new level in ‘Industry 4.0’ concerns robotics and cyber-physical systems (CPS). These are technical systems in which computers, robots and machines interact with the physical world. Smart or intelligent robots, understood as mechanical creatures which can function autonomously,178 create new questions, for example in light of human-machine interaction. When humans and robots work together, there are evident reasons to carefully consider the conditions and quality of work. They come with predicted effects such as a “lack of privacy, extra monitoring and surveillance in the workplace, and a new sense of alienation”.179
4.2. Data Protection Standards and AI
a. Applying or adapting data protection
Data protection standards can be a basis for responding to the above broad outlook and challenges. In some cases, data protection standards have to be re-interpreted, but may appear to be crucial in regulating AI. An example is ‘data quality’180, which may address AI related issues like bias, discriminatory effects, inaccurate or coincidental correlations between data, simplified conclusions, lack of context of data, or merely irrelevant data processing.181
Some data protection instruments specifically address or anticipate the phenomenon of algorithmic processes. Data protection law addresses, for example, profiling and/or automated decision-making and the concerns related to the ability to make decisions by technological means without human involvement.
b. Profiling and automated decision-making
With regard to profiling and automated decision-making, a series of rights of data subjects (workers) are guaranteed in some data protection instruments. Three main rights must be mentioned:
-
the right not to be subject to it;
-
the right to be informed about it;
-
the right to have a human interface.
It is mainly the GDPR which provides the most detailed provisions in this field. However, the subject has also received attention in other standards, including the ILO Code of Practice.
Right not to be subject
The
-
“decisions concerning a worker should not be based solely on the automated processing of that worker’s personal data.”
Article 22, 1
The
-
11.4. An employee should not be subject to a decision significantly affecting him or her, based solely on an automated processing of data without having his or her views taken into consideration.
The first important aspect of this is a right not to be subject (and thus to object) to profiling. This follows, at least, from article 21.1
The European Data Protection Working Party gives the following example:
-
A business advertises an open position. As working for the business in question is popular, the business receives tens of thousands of applications. Due to the exceptionally high volume of applications, the business may find that it is not practically possible to identify fitting candidates without first using fully automated means to sift out irrelevant applications. In this case, automated decision-making may be necessary in order to make a short list of possible candidates, with the intention of entering into a contract with a data subject.182
The approach of the GDPR is thus that the employment contract can provide justifications for involving profiling and/or fully automated decision-making. The example of the Working Party shows that a degree of leeway is given to employers. Manual processing of this number of candidates is, obviously, not very practical. But the question remains from what number of job applications onwards the use of automated decision-making can still be considered necessary.
Right to be informed
The second element of protection is the right to be informed about automated decision-making. It is obviously connected with the principle of transparency.
The
-
11.5. An employee should also be able to obtain, upon request, information on the reasoning underlying the data processing, the results of which are applied to him or her.
According to article 15,1,h
-
“the growth and complexity of machine-learning can make it challenging to understand how an automated decision-making process or profiling works. The controller should find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision. The GDPR requires the controller to provide meaningful information about the logic involved, not necessarily a complex explanation of the algorithms used or disclosure of the full algorithm. The information provided should, however, be sufficiently comprehensive for the data subject to understand the reasons for the decision.”183
One may wonder to what extent an explanation of, for example, algorithmic decision-making to which workers are subject can be given in clear and accessible terms in practice. The comments of the European Working Party indicate that these processes are very complicated and difficult for non-experts to understand. There is a fundamental issue of informational asymmetry.184 In the employment context, one could envisage organizing this transparency through workers’ representatives, allowing them to acquire the necessary expertise.185
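Purely as an illustration of that idea, and not of any actual system, the toy example below uses a hypothetical linear screening score to produce a short, plain-language summary of the main factors relied on, rather than disclosing the full model; this is the kind of accessible summary, rather than a technical explanation of the algorithm, that the transparency requirement points towards.

def explain_decision(weights: dict[str, float], applicant: dict[str, float],
                     top_n: int = 3) -> str:
    """Summarise, in plain language, the factors that contributed most to a simple
    linear screening score (a toy model; real systems are far more complex)."""
    contributions = {k: weights.get(k, 0.0) * v for k, v in applicant.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = [f"- {factor}: {'raised' if value >= 0 else 'lowered'} the score"
             for factor, value in ranked[:top_n]]
    return "Main factors taken into account:\n" + "\n".join(lines)

if __name__ == "__main__":
    weights = {"years_experience": 0.6, "test_score": 0.3, "commute_distance": -0.2}
    applicant = {"years_experience": 2.0, "test_score": 85.0, "commute_distance": 40.0}
    print(explain_decision(weights, applicant))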
A case for a more collective right to be informed about algorithmic decisions can be construed on the basis of the nature and complexity of the problem. When it comes to giving some disclosure about potential discriminatory effects, or their avoidance, or the fact that some effects or data do not just concern an individual but rather a group, a collectively based approach to the right to access and information can be envisaged.186
Right to human interface
Under article 22, 3 GDPR (relating to fully automated decision-making) the data subject (the worker) has at least the right to obtain human intervention on the part of the controller (the employer). According to the European Working Party, human intervention is a key element in the GDPR’s data protection and any review (of AI decisions) must be carried out by someone who has the appropriate authority and capability to change the decision.187
It is, however, not specified what this human intervention must entail or how far such intervention in the decision-making needs to go in order to be valid. In any case, a worker additionally has, according to article 22,3 GDPR, the right to express his or her point of view and to contest the decision. Expressing one’s point of view means, according to the GDPR’s recitals, the right to obtain an explanation of the decision reached after such assessment and to challenge the decision.188
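The sketch below, with hypothetical names and a deliberately simplified flow, only illustrates how such safeguards might be wired into a recruitment tool: the automated outcome is never final by itself, the worker can record a point of view which suspends the outcome, and a reviewer with the authority to change the decision confirms or overrides it.

from dataclasses import dataclass

@dataclass
class ScreeningDecision:
    applicant_id: str
    automated_outcome: str             # e.g. "reject", produced by an automated sift
    final_outcome: str | None = None
    reviewer: str | None = None
    applicant_statement: str | None = None

def contest(decision: ScreeningDecision, statement: str) -> ScreeningDecision:
    """The data subject expresses a point of view; the outcome is suspended pending review."""
    decision.applicant_statement = statement
    decision.final_outcome = None
    return decision

def request_human_review(decision: ScreeningDecision, reviewer: str,
                         override_to: str | None = None) -> ScreeningDecision:
    """A person with the authority and capability to change the outcome reviews the decision."""
    decision.reviewer = reviewer
    decision.final_outcome = override_to or decision.automated_outcome
    return decision

if __name__ == "__main__":
    d = ScreeningDecision("APP-42", automated_outcome="reject")
    d = contest(d, "The sifting tool misread my part-time experience.")
    d = request_human_review(d, reviewer="HR officer", override_to="invite to interview")
    print(d.final_outcome)  # the human reviewer's outcome prevails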
The rights under the GDPR are very consistent with the demands for a human-in-command approach in the future of work. According to the Global Commission on the Future of Work:
-
“It also means adopting a ‘human-in-command’ approach to artificial intelligence that ensures that the final decisions affecting work are taken by human beings.”189
4.3. From data protection to AI regulation
The increasing development of AI systems and related technology shows the relevance of general data protection standards. At the same time, it is understood that there may be limits to data protection standards and new, complementary formulas or standards are needed.190 In light of this, many initiatives have been taken in the area of AI.
Some initiatives have been taken through the development of a privacy and data protection approach. An example is the resolution from the
Also within international and regional organisations, new AI initiatives are in full development. In May 2019, the OECD adopted a Recommendation on Artificial Intelligence.192 The ‘principles on AI’ in this recommendation are the first international standards agreed by governments.193 UNESCO started an AI and ethics programme in order to reach consensus on a recommendation. The UNESCO Recommendation on the ethics of artificial intelligence was adopted in November 2021.194 The Council of Europe started work on AI195 and adopted a Recommendation in 2020 on the human rights impacts of algorithmic systems.196 The EU took steps to establish a regulatory framework,197 following the guidelines of a High-Level Expert Group (HLEG) on trustworthy AI198 and the European Commission’s ‘White Paper on Artificial Intelligence’ (2020).199 This has led to a legislative proposal for a Regulation on artificial intelligence, known as the ‘AI Act’.200 These initiatives have in common that they focus on human-centred values, fairness, inclusivity and accountability.
Conclusion
This working paper looked into international and regional legal frameworks relating to personal data protection. The aim was to give a global and updated outlook of the main and basic principles in this area with a view to improving their understanding in the employment context. Taking into account legal sources and principles from a comparative perspective, the focus has been on data protection principles with a general nature embedded in a global approach.
The first international and regional data protection standards arose during the 1980s and 1990s. The landscape evolved strongly over time, with different instruments coming into place within various international and regional organisations, as well as in many countries around the world. Throughout the years, data protection standards have evolved, received new attention or undergone revision, taking into account new developments related to legal, societal and technological change. In 2016, the European Union updated its legal framework with the adoption of the ‘General Data Protection Regulation’ (GDPR), twenty years after the ILO created its Code of Practice on the protection of workers' personal data.201 The endorsement of data protection standards has been a more recent phenomenon in different parts of the world, such as the Asian, African, Latin American and Pacific regions. These initiatives brought a new dynamic into the standard-setting environment. While often reflecting the OECD model or European standards, global and regional legal developments show their own pathways within nevertheless converging norms.
What all standards have in common is reliance on general and basic principles of data protection. There is a tendency for shared common ground at the level of principles, as identified in this study. In light of this, it has to be noted that, designed to respond to a variety of circumstances, general data protection principles are applicable in the employment context. It leads to a double finding. On the one hand, the proper knowledge of general data protection principles is key to understand their application in the employment relationship. On the other hand, the generality and abstract level of some data protection principles brings a need for further specification or guidance. The ILO Code of Practice is an example of this at global level, along with other, regional attempts to specify the general principles for the employment relationship, such as within the Council of Europe.
While both the right to privacy and the right to data protection are interdependent and recognised as a human right, the efforts towards more sector-specific principles of data protection standards is also a matter of adaptation and (regional) specificity. The existing range of country practices and experiences shows that a global recognition of general principles goes hand in hand with applications in diverse contexts. That may be more apparent in the sphere of the employment relationship, where local context remains highly relevant. Notwithstanding this, the challenges of technology and the world of work cover extremely global phenomena. It makes the case for interdependence or convergence of standards rather realistic.
In this working paper, general principles have been discussed, based on common ground taken from major regional initiatives around the world. Their relevance may vary in light of the issues arising in the employment relationship, although they can be considered major benchmarks. The outlook shows how general data protection standards may operate in the employment relationship, how they are connected, and whether more specific guidance may be useful. The legitimacy of personal data processing in the employment context is fully recognised under the existing data protection standards. But a wide range of conditions and balancing will be required, taking into account the characteristics of the employment relationship. Standards show a general reluctance to ground personal data processing on individual consent. At the same time, the legitimacy of personal data processing related to rights and obligations, but also to interests connected with the employment relationship, is fully recognised.
Some of the data protection standards are well in line with the body of knowledge in the field of human rights protection. Lawfulness, legitimate purposes, proportionality and transparency are key conditions applied to the limitation or conditioning of human rights. Other data protection principles will shed new light on employment related issues. For example, the purpose limitation principle and the data minimisation principle pose important limits to personal data processing and will require both a fairness test and a privacy impact assessment before any processing takes place. The overview shows areas where these principles increasingly play a role.
This brings us to some limitations of this working paper. The general data protection principles also play a role in specific data processing contexts, such as the processing of health-related data or the area of monitoring and surveillance at the workplace. This study does not neglect the importance of data protection standards in these fields, on the contrary. However, the working paper was subject to some restrictions, and it suggests that understanding the
In this working paper, taking into account its scope, a brief outlook was given on the impact of artificial intelligence, robotisation and similar forms of automation on data protection standards. Globally, with various rising initiatives, standard-setting in this area is in full development. When looking into the future of AI and sophisticated automation, it becomes clear that the role of privacy and data protection in the context of employment will not diminish. This also defines the main challenge for data protection law. The general character of data protection principles gives them adaptability to new technological contexts, making them suitable to last sufficiently long. But broad discussions need to continue in search of instruments and standards specifically dealing with AI and related technology. A complementary set of specific AI-related standards is a valuable way forward.
A final note must go to the future of work in connection with the right to privacy. Not only will the world of work be subject to further change; changing work-life relationships will follow with it. Technology is an important intervening factor in this context. This is seen when modern communication tools, including e-mail, the internet and social media, are used for both professional and private purposes. New ways of working, including telework and the creation of virtual workspaces, have further challenged the work-life boundary. As indicated above, AI and robotics will also raise new challenges for the way work is defined and how it interacts with our human essence. This is why this study has kept personal data protection against the wider horizon of the right to privacy. This embedded context provides additional value in shaping or rethinking legal concepts and frameworks in light of a human-centred agenda.
References
Abdulrauf, L. 2020. “Data Protection in the Internet: South Africa.” In
Alunge, Rogers. 2020. “Africa’s Multilateral Legal Framework on Personal Data Security: What Prospects for the Digital Environment?” in
Bayamlioglu, E. 2018. “Contesting Automated Decisions: A View of Transparency Implications”,
Berg, J., M. Furrer, E. Harmon, U. Rani, and M. Six Silberman. 2018.
Bodie, M.T., M.A. Cherry, M.L. McCormick, and J. Tang. 2017. “The Law and Policy of People Analytics.”
Boshe, Patricia. 2016. “Protection of Personal Data in Senegal.” In Makulilo 2016, 259-275.
Bygrave, Lee A. 2014.
—. 2017. “Data Protection by Design and by Default: Deciphering the EU’s Legislative Requirements.”
Choudary, Sangeet Paul. 2018. “The architecture of digital labour platforms: Policy recommendations on platform design for worker well-being”, ILO Future of Work Research Paper No. 3.
De Stefano, V. 2020. “Algorithmic Bosses and What to Do About Them: Automation, Artificial Intelligence and Labour Protection.” In
Drahokoupil, Jan and Brian Fabo. 2019. “Outsourcing, Offshoring and the Deconstruction of Employment: New and Old Challenges.” In
Eurofound and ILO. 2017.
Fantoni-Quinton, Sophie and Anne-Marie Laflamme. 2017. “Medical selection upon hiring and the applicant's right to lie about his health status: A comparative study of French and Quebec Law.”
Finkin, M.W. 2003. Privacy in Employment Law. Washington DC: BNA Books.
Freedland, Mark. 1999.
Graham, M. and J. Woodcock. 2018. “Towards a Fairer Platform Economy: Introducing the Fairwork Foundation.”
Greenleaf, Graham. 2012. “The Influence of European Data Privacy Standards outside Europe: Implications for Globalization of Convention 108.”
—. and B. Cottier. 2020
Hendrickx, Franck. 1999.
—. (ed.). 2002.
—. (ed.). 2003.
—. 2014. “Employment Privacy”, in
—. 2019a. “From digits to robots: the privacy-autonomy nexus in new labour law machinery.”
—. 2019b. “Privacy 4.0. at work: regulating employment, technology, and automation.”
ILO. 1997
—. 2019.
Jasmontaite, L., I. Kamara, G. Zanfir-Fortuna and S. Leucci . 2018. “Data Protection by Design and by Default: Framing Guiding Principles into Legal Obligations in the GDPR”.
Kilhoffer, Z., W. De Groen, K. Lenaerts, I. Smits, H. Hauben, W. Waeyaert, E. Giacumacatos, J.-P. Lhernould, and S. Robin-Olivier. 2020.
Kim, Taemie, Erin McFee, Daniel O. Olguin, Ben Waber, Alex Pentland. 2012. “Sociometric badges: Using sensor technology to capture new forms of collaboration.”
Krotoszynski, Ronald J. 2016.
Lehuedé, Héctor J. 2019.
Long, William, Francesca Blythe, Alan C. Raul. 2020. “EU Overview”, in
Makulilo, Alex B. 2016.
Mangan, David. 2019. “Beyond Procedural Protection: Information Technology, Privacy, and the Workplace.”
Mazur, Joanna. 2018. “Right to access information as a collective-based approach to the GDPR’s right to explanation in European law.”
Mittelstadt, Brent. 2018. “From Individual to Group Privacy in Biomedical Big Data.” In
Moore, P. 2020.
Murphy, R. 2000.
Navarro-Arribas, Guillermo and Vicenç Torra. 2015.
Nouwt, S., B.R. de Vries, and C. Prins. 2005.
Peeters, Tina, Jaap Paauwe and Karina Van De Voorde. 2020. “People analytics effectiveness: developing a framework.”
Purtova, Nadezhda. 2017. “Private Law Solutions in European Data Protection: Relationship to Privacy, and Waiver of Data Protection Rights.”
Raepsaet, F. 2011. “Les attentes raisonnables en matière de vie privée.”
Schwab, K. 2017.
Shapiro, Carl and Hal R. Varian. 1998.
Shrivastava, S., K. Nagdev, and A. Rajesh. 2018. “Redefining HR using people analytics: the case of Google”,
Silva, J. 2020. “Reasonable expectations of privacy in the digital age.”
Solove, D. 2001 “Privacy and Power: Computer Databases and Metaphors for Information Privacy.”
Stewart, Hamish. 2019. “A Juridical Right to Lie.”
Strandburg, Katherine J. 2019. “Rulemaking and inscrutable automated decision tools.”
Traça, João Luís and Lídia Neves. 2016. “Data Protection in Mozambique: Inception Phase.” In Makulilo 2016. 363-367. Springer.
Upchurch, Martin and Phoebe Moore. 2018. “Deep automation and the world of work.” In
Wachter, Sandra, Brent Mittelstadt, and Chris Russell. 2018. “Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR.”
Wallach, S. 2011. “The Medusa Stare: Surveillance and Monitoring of Employees and the Right to Privacy”.
Walters, Robert, Leon Trakman and Bruno Zeller. 2019.
Warren, S.D., and L.D. Brandeis. 1890. “The right to privacy.”
Whitman, James Q. 2004. “The Two Western Cultures of Privacy: Dignity versus Liberty.”
Acknowledgements
Special thanks go to the following experts for their input with country-specific information: Australia: Anna Chapman, Melbourne Law School / Brazil: Ana Virginia Moreira Gomes, Universidade de Fortaleza / Bulgaria: Vasil Mrachkov and Yaroslava Genova, University of Plovdiv / Croatia: Ivana Vukorepa, University of Zagreb / Czech Republic: Martin Stefko, Charles University Prague / Estonia / Germany: Elena Gramano, Bocconi University (free format) / Greece: Costas Papadimitriou, University of Athens / Hungary: Tamás Gyulavári, Pázmány Péter Catholic University / Ireland: David Mangan and Michael Doherty, Maynooth University / Israel: Lilach Lurie, Tel-Aviv University / Italy: Elena Gramano, Bocconi University / Luxembourg: Luca Ratti, University of Luxembourg / New Zealand: Paul Roth, University of Otago / Poland: Marta Otto, University of Lodz / Portugal: Rita Canas da Silva, Católica University Lisbon / Romania: Raluca Dimitriu, University of Economic Studies Bucharest / Slovakia: Andrej Poruban, Alexander Dubcek University of Trencin / Slovenia: Barbara Kresal, University of Ljubljana / Spain: Adrian Todoli Signes, University of Valencia / Sweden: Annamaria Westregard, Lund University.