Thermal cameras: Does GDPR apply during the pandemic?
The most common method for measuring body temperature to stop the spread of Covid-19 has been a standard thermometer. However, due to the virus’s fast-spreading nature, thermal cameras have won many new proponents in Europe. They argue that this technology offers an incomparably efficient way of detecting fevers, and thus infections. But what exactly is the legal basis for this type of automatic data profiling? And how has it clashed with citizens’ rights provided under the EU’s General Data Protection Regulation (GDPR)?
We have experienced these issues in practice in the EU for some time. Amazon has measured the temperature of its employees in warehouses through thermal cameras. Meanwhile, actors in the Republic of Croatia pushed for a similar solution during the pandemic. For instance, two thermal cameras are being installed at the General Hospital in Pula, and other hospitals in Croatia are making similar plans. How big was the impact of such general digitalisation, and of the subsequent processing of health data, on our privacy and basic human rights?
Processing personal data during the coronavirus crisis
The GDPR normative framework imposes numerous obligations on data controllers and data processors when dealing with natural persons’ personal data. Technical and organisational measures for processing personal data lawfully, among many other obligations, are the responsibility of the controller. Extraordinary circumstances, such as the pandemic, increase the complexity of processing because of the additional obligations that controllers and processors must fulfil. This is especially true regarding health data. Article 4 of the GDPR defines data concerning health as: “personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status”.
However, the coronavirus pandemic cast doubt on the legal basis for processing personal data during extraordinary circumstances. According to the GDPR, the basic principle of personal data processing is that it must be lawful, regardless of external factors. The legal basis for processing personal data during the pandemic can be found in GDPR articles 6 and 9.
As the virus spread last spring, the European Data Protection Board (EDPB) and state supervisory authorities acted relatively quickly. On 20 March 2020, the EDPB issued a statement in which GDPR articles 6 and 9 were recognised as the relevant legal basis for processing personal data during the pandemic. As long as extraordinary circumstances persisted, processing would be:
“necessary in order to protect the vital interests of the data subject or of another natural person”, also important for “the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”, as well as “necessary for the purposes of the legitimate interests pursued by the controller or by a third party”.
However, since certain data categories are more sensitive than others, their processing is generally prohibited. This applies to personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs and trade union membership. It also covers genetic data, biometric data processed for the purpose of uniquely identifying a natural person, data concerning health and data concerning a natural person’s sex life or sexual orientation. However, extraordinary circumstances call for extraordinary measures. Therefore, GDPR article 9 provides that personal health data may be processed under certain conditions, namely if “processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law”. It also allows for processing if it is
“necessary for the purposes of preventive or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment or the management of health or social care systems and services on the basis of Union or Member State law”.
Furthermore, processing is possible when it entails “protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices”, as long as this processing follows EU or Member State law.
As stated, measurement with a thermometer is currently the basic way in which health data is processed. Body temperature measurements with thermometers in Croatia have been conducted strictly on an individual basis, and the collected data has either not been stored or been stored only for a short period of time, in accordance with the principle of storage limitation. As with all personal health data, the legal basis for the usual measurement of body temperature can be found in GDPR articles 6 and 9, which in our opinion represent a sufficient and flexible legal basis for such methods.
Thermal cameras and article 22 of GDPR
Thermal cameras can serve at least two different purposes during a pandemic. They can a) rapidly detect many people with a high body temperature and b) classify these data subjects into specific categories, such as “potentially infected” or “uninfected”.
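To make concrete what this two-step operation looks like, here is a minimal, purely hypothetical sketch of how such a system might automatically sort data subjects by measured temperature. The threshold value, labels and function names are our own illustrative assumptions, not taken from any real product described in this article.

```python
# Hypothetical sketch of an automated fever-screening classifier.
# The 38.0 °C cut-off is an assumed value; real devices and medical
# guidance vary, which is precisely the risk discussed below.

FEVER_THRESHOLD_C = 38.0

def classify(readings_c):
    """Map each temperature reading to an illustrative category label."""
    return [
        "potentially infected" if t >= FEVER_THRESHOLD_C else "uninfected"
        for t in readings_c
    ]

print(classify([36.6, 38.2, 37.1]))
# → ['uninfected', 'potentially infected', 'uninfected']
```

The point of the sketch is that once such a label is attached to a person without human review, the system is both profiling and making automated individual decisions in the sense discussed in the next section.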
However, the excessive introduction of thermal cameras into our everyday life has the potential to trigger GDPR article 22. This article concerns the profiling of data subjects when coupled with automated decision-making. The title of article 22 is: “Automated individual decision-making, including profiling”. It states that: “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
As for the processing of special categories of personal data, article 22 contains a general prohibition on processing them. A simple analysis of articles 9 and 22 leads to the conclusion that article 9 deals with a specific category of particularly sensitive personal data, whereas article 22 deals with how we process data and the way we make decisions related to it. This might include “regular” personal data, but also data falling into a special category. It is important to define the building blocks of article 22: firstly, profiling and, secondly, automated individual decision-making. In one of its recommendations, the Article 29 Working Party (A29WP), which helped draft the GDPR, also addressed article 22 in response to the numerous reactions from the scientific community, which has considered article 22 dubious since its first release. The GDPR defines “profiling” as:
“any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements”.
A few constitutive elements can be extracted from this definition. Namely: a) profiling must be automated, b) profiling must be performed on personal data, and c) the goal of any profiling must be to evaluate certain personal aspects of the individual. The use of the verb “evaluate” suggests that profiling involves some form of assessment or judgement about a person. However, this may not always be the case. It is important to approach each case individually and assess whether profiling has occurred or not.
Automated individual decision-making is a different concept which deserves its own explanation. The A29WP defines automated individual decision-making as the “ability to make decisions by technological means without human involvement”. Simply put, we allow artificial intelligence to make decisions and generate an output to achieve a specific purpose based on the processing of our personal data. Although profiling and automated individual decision-making are different concepts, they often go together. The GDPR addresses these concepts in the following manner: (a) general profiling, (b) decision-making based on profiling and (c) fully automated decision-making, including profiling. The difference between (b) and (c) lies in who makes the decision: in decision-making based on profiling, the decision is made by a human, whereas in fully automated decision-making, including profiling, the decision is made by artificial intelligence. In our opinion, introducing thermal cameras in the fight against the coronavirus will most likely trigger article 22.
The pursuit of efficiency and growth by means of digital technology would make little sense if we were not allowed to use its full potential. Therefore, the application of thermal cameras without subsequent software solutions that lead to automated decision-making and profiling would not be the most efficient means of data processing. Given that efficiency is the key to successfully defeating the pandemic, the course of action is obvious. As noted, the processing of personal data under article 22 is generally prohibited. However, paragraph 2 of the same article allows the processing of personal data if it is (a) “necessary for entering into, or performance of, a contract between the data subject and a data controller”, or (b) “is authorized by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests”. A third option is also permitted: (c) when it is “based on the data subject’s explicit consent.”
This processing set out in article 22 paragraph 2 refers to “regular” personal data. Regarding the special categories of personal data, article 22 paragraph 4 prescribes the following:
“decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), unless point (a) or (g) of Article 9(2) applies and suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place.”
Therefore, if data controllers want to process special category personal data that also includes profiling and automated decision making, they must meet certain additional requirements. They must protect the rights, freedoms and legitimate interests of their data subjects.
During the coronavirus pandemic it has been possible to meet all the above conditions, because the fight against the pandemic is currently of supreme importance, and the GDPR leaves open the possibility of putting the protection of legitimate interests before the rights of data subjects. This is an additional reason why GDPR article 22 will be activated. Suitable and specific measures for safeguarding data subjects’ fundamental rights, freedoms and legitimate interests are not the focus of this paper because they deserve a separate analysis in their own right. To conclude, it should be noted that processing a special category of personal data under article 22 is possible if the conditions set out in article 9 paragraph 2 and article 22 paragraphs 2 and 4 are met cumulatively.
The relation between article 22 of the GDPR and the right to explanation
The GDPR prescribes a relatively sound and broad legal basis, as it allows for the protection of personal data even in uncommon situations. However, the processing of personal data by thermal cameras and the activation of article 22 is not a problem in itself, but in relation to the rights of data subjects provided by the GDPR. Another area where this is evident is in relation to the so-called “right to explanation”. If we consider that artificial intelligence has repeatedly led to discrimination and violation of data subjects’ right to privacy – for instance through “artificial intelligence bias” – we can see the importance of GDPR article 22 and why it should be introduced to the general public. There are provisions setting out the standard information a data controller must provide to a data subject when processing their personal data – such as who the controller is and the purpose of processing. However, in addition to this, article 13 (2) (f) stipulates that the controller must provide additional information on
“the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”.
This provision, together with article 22, has provoked a storm of reactions in the scientific community since its first publication.
Today machine learning goes so far that sometimes even computer scientists do not know how a computer came up with a certain solution. Originally, algorithmic solutions were based on certain rules and procedures that computers followed. The logic of the algorithm’s action was explicit: an input passed through defined steps to arrive at a particular output. The introduction of machine learning, however, changed the logic of the whole system. We now give a certain amount of data to the computer, from which the computer learns, and based on what it has learnt a new output is generated. Therefore, the logic of the system in some situations tends to become implicit and difficult to explain even for experts.
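The contrast between explicit and learnt logic can be shown with a deliberately simple toy example of our own (not drawn from any system mentioned in this article). In the first function the decision rule is written out and easy to explain; in the second, the cut-off is a by-product of whatever training data the system happened to see, which is harder to justify to a data subject.

```python
# Toy contrast: an explicit rule versus a "learnt" parameter.
# All values and names here are illustrative assumptions.

def explicit_rule(temp_c):
    # The logic is stated in the code itself: anyone can read and explain it.
    return temp_c >= 38.0

def learn_threshold(samples):
    # "Learn" a cut-off as the midpoint between the highest temperature
    # labelled healthy and the lowest labelled feverish. Feed it different
    # training data and a different, less explicable threshold emerges.
    healthy = max(t for t, fever in samples if not fever)
    feverish = min(t for t, fever in samples if fever)
    return round((healthy + feverish) / 2, 2)

training_data = [(36.5, False), (37.2, False), (38.4, True), (39.0, True)]
print(learn_threshold(training_data))  # → 37.8 for this particular data set
```

Even in this trivially small example, the “meaningful information about the logic involved” that article 13 (2) (f) demands is easier to give for the first function than for the second, where the answer depends on the training data rather than on a stated rule.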
The A29WP tackled this problem in its recommendations. Explaining the logic does not necessarily mean that the data controller must explain complex algorithmic procedures to data subjects, but rather that it must provide meaningful, significant information to the data subject. Additionally, complexity alone cannot be an excuse for not providing relevant information. From the perspective of the A29WP, the whole issue seems somewhat more flexible compared to the original technical discussion. All rights prescribed by the GDPR are directed towards the data subject, who is at the centre of the whole process. Therefore, the exercise of rights prescribed by the GDPR should be viewed from the perspective of the data subject, not from the perspective of technological complexity. The significance of this to the present discussion is the high possibility that today’s technology could endanger our fundamental rights without any human intervention in the process. In the absence of human intervention, measuring body temperature by means of machine learning and profiling can be potentially dangerous and discriminatory.

For example, conventional thought holds that a “normal” body temperature is 37 degrees, while the temperature of a person infected with the influenza virus is 38 degrees. This distinction arose in 1900, when little was known about thermoregulation, immunology and microbiology. Today, body temperature is assessed individually and always falls on a spectrum. By measuring temperature through thermal cameras, people can be categorised as infected when they are not, and find themselves in inconvenient or even discriminatory situations. Although efficiency is important in the context of the current pandemic, certain guarantees for the protection of human rights and fundamental freedoms must exist. If efficiency is the only aim, human rights could easily be violated.
Therefore, it is highly significant that data controllers ensure strong safeguards and provide citizens with meaningful explanations of the logic behind their data processing.
Using thermal cameras as a measure against the coronavirus necessarily involves the general digitalisation of our social interactions. Life with coronavirus and life after the pandemic will surely change our perception and thoughts about digital technologies. The fear of infection has changed our priorities in life.
GDPR provides a flexible legal basis that allows for processing personal data in a variety of circumstances. However, the mere existence of a normative framework does not lead to clear solutions. We can see this from the relationship of GDPR article 22 to the rights of data subjects, especially in the so-called “right to an explanation”. Thermal cameras and algorithmic solutions can lead to discrimination against individuals. Efficiency is necessary, but it is not the only goal. Therefore, in such circumstances, it is necessary to constantly raise general awareness and warn all relevant parties of the potential dangers of automated processing. Indeed, this is an obligation under GDPR. In sum, our fundamental human rights must be protected, regardless of the circumstances we find ourselves in.
Mihovil Granić is a partner at Žurić i Partneri, Zagreb, Croatia. Kristijan Antunović is a trainee at Žurić i Partneri.
Image, withplex, Pixabay licence.