Data Protection Online: Alternative Approaches to Sensitive Data?

Author: Rebecca Wong
Position: Law Department, University of Sheffield
Pages: 9-16

    A version of this paper was published in Complex 4/06 - Sylvia M. Kierkegaard (ed.): Legal, privacy and security issues in information technology

Keywords: sensitive data, data protection, online

1. Introduction

The paper examines the concept of "sensitive data" as defined under the Data Protection Directive 95/46/EC (hereinafter termed "DPD") and considers whether the categorisation of sensitive data should be amended in the light of technological developments. Under Art. 8(1) of the DPD, sensitive data is defined as 'personal data revealing racial origin, political opinions or religious or philosophical beliefs, trade union membership, and the processing of data concerning health or sex life.'

However, when Art. 8(1) of the DPD is applied to the internet, it is questionable whether the criterion works in practice. More specifically, the case of Lindqvist (C-101/01) demonstrates the problem of publishing personal information containing sensitive data on the World Wide Web. In this case, L had uploaded a web page containing details about members of a parish church. The website also included information about a member who had injured her foot. When the case was referred to the European Court of Justice (ECJ) for a preliminary ruling, one issue was whether publication of the fact that a member had injured her foot constituted the processing of "sensitive data", because this was data concerning the health of an individual. Leaving aside the exemptions under Art. 8(2) of the DPD, my main concern is the broad application of Art. 8 DPD to anything published on a web page which directly or indirectly refers to anyone holding a political opinion or religious or philosophical beliefs, to anyone being a trade union member, or to data concerning their health or sex life. Another area of concern is whether publishing a photograph would amount to the processing of personal data revealing racial origin. To put it another way, if I had published a picture of an Eskimo, would I be processing sensitive data because the picture reveals the Eskimo's racial origin? Under the DPD, Art. 8(1) would apply irrespective of how trivial the case may be.

These questions are not easy to answer, but the paper will first consider the origins of sensitive data. This will be followed by the current approach adopted under the DPD and the Lindqvist case. I will then look at Professor Simitis's report (1999), entitled Revisiting sensitive data, and consider the arguments in the light of the online environment. By the online environment, I exclude manual files containing personal data, such as card indexes; I am referring specifically to the internet and the World Wide Web.

2. Origins of Sensitive Data

Sensitive data was originally introduced under the Council of Europe Convention 1981 on Personal Data (hereinafter termed "CoE Convention"). Art. 6 of the CoE Convention provides that 'Personal data revealing racial origin, political opinions or religious or other beliefs, as well as personal data concerning health or sexual life, may not be processed automatically unless domestic law provides appropriate safeguards. The same shall apply to personal data relating to criminal convictions.'

The CoE Convention 1981 and the OECD Guidelines on Personal Data 1980 (the latter are not discussed here) have been influential in the developments leading up to the DPD. Furthermore, these international instruments have served as a model for some countries enacting data protection laws (Bygrave, 2003).

According to Simitis (1999), the categorisation was readily accepted without question within the DPD. Although member states of the European Economic Area have implemented the DPD, including Art. 8(1) on sensitive data, the question that has arisen is whether the categorisation falls short of addressing the dangers highlighted by recent technological developments. To give an example, the Council of Europe report (entitled Informational self-determination in the internet era) recommended that identification numbers which enable many databases or data to be connected together should be included within the definition of "sensitive data"; this practice has become widespread in the public and private sectors. The DPD, however, leaves it to the discretion of the member states to determine the conditions under which a national identification number or any other identifier of general application may be processed (Art. 8(7) DPD), but does not specifically touch on the subject of identification numbers in the online environment or their use in databases.

Secondly, the report recommended that the category should also cover "profiling". This is defined in Swiss law as 'a combination of data that enables an assessment of the key aspects of the personality of an individual to be made' (CoE Report, 2005). The report suggested that anonymous profiling should be included where it is used to take subsequent decisions concerning the persons covered by the profile. It is interesting to note that the report considers the current definition of sensitive data too wide and suggests abandoning the approach based on the actual nature of the data in favour of a purpose-based approach. To put it another way, what is the purpose of such processing? Is the processing intended to reveal sensitive data, such as political opinions? This alternative would not only be pragmatic, but would also resolve the difficulties highlighted in Lindqvist, where any personal information published on the web could theoretically fall within Art. 8(1) as involving the processing of sensitive data. This is slightly different from the contextualised approach that I will be exploring later. By a contextualised approach, I mean that personal data becomes sensitive according to its context, as argued by Simitis (1999). Before examining both approaches, I want to examine the case of Lindqvist in the next section, looking at the implications of the European Court of Justice's decision.

3. The Implications of Lindqvist

I have already discussed the facts of the Lindqvist case above and will not reiterate them here; instead, I want to consider the main issues in the case. The ECJ had to decide, first, whether the act of referring, on an internet page, to various persons and identifying them by name or by other means, for instance by giving their telephone number or information regarding their working conditions and hobbies, constitutes the processing of personal data wholly or partly by automatic means within the meaning of Article 3(1) of Directive 95/46.

Secondly, if the answer to the first question is in the affirmative, whether the processing of personal data such as that described in the first question is covered by one of the exceptions in Article 3(2) of Directive 95/46.

Thirdly, whether reference to the fact that an individual had injured her foot and was on half-time on medical grounds constitutes personal data concerning health within the meaning of Article 8(1) of Directive 95/46.

Finally, whether there is any transfer [of data] to a third country within the meaning of Article 25 of Directive 95/46 where an individual in a Member State loads personal data onto an internet page which is stored on an internet site on which the page can be consulted and which is hosted by a natural or legal person (the hosting provider) established in that State or in another Member State, thereby making those data accessible to anyone who connects to the internet, including people in a third country. The referring court also asked whether the reply to that question would be the same if no one from the third country had in fact accessed the data, or if the server where the page was stored was physically located in a third country.

For the purpose of this paper, the relevant issue is the ECJ's decision to...
