
Thursday, 14 September 2017

The Privacy International case in the IPT: respecting the right to privacy?



Matthew White, PhD candidate at Sheffield Hallam University.

Introduction

On 21 December 2016, the Grand Chamber (GC) of the Court of Justice of the European Union (CJEU) in Cases C-203/15 and C-698/15 Tele2 and Watson ruled that blanket indiscriminate data retention was incompatible with European Union (EU) law. With that judgment, Professor Lorna Woods highlighted that this did not mean that the CJEU’s interpretation of the requirements of the Charter of Fundamental Rights (CFR) was ‘limited only to this set of surveillance measures.’ Hence, on 9 September 2017, the Investigatory Powers Tribunal (IPT) in Privacy International v the Secretary of State for Foreign and Commonwealth Affairs and Others handed down a judgment regarding the lawfulness under EU law of the acquisition and use of Bulk Communications Data (BCD) under s.94 of the Telecommunications Act 1984 (TA 1984) [4], including a request to the CJEU to answer further questions on EU law. This blog post concerns itself not with the preliminary reference itself, but the underlying flawed logic of the IPT’s reasoning with regards to fundamental rights protection.

The IPT’s faulty premise plagues its judgment from the beginning

The IPT highlighted that the issue before it was the balance between steps taken by the State, through the Security & Intelligence Agencies (SIAs), to ‘protect its population against terror and threat to life’ and ‘the protection of privacy of the individual’ [6]. The premise of the IPT is deeply flawed from the outset, thus impacting upon its reasoning. Daniel Solove has highlighted that ‘protecting the privacy of the individual seems extravagant when weighed against the interests of society as a whole’ (Daniel Solove, (2009) Understanding Privacy, Harvard University Press, p89). When privacy is confined to individualistic notions (particularly of ‘bad guys’), the argument for departing from its protection becomes easier to justify, no less when that justification is protecting an entire nation.

Privacy is not just an Individual Right

Many (including Solove) have argued that privacy has a common, public and/or social value (Priscilla M. Regan, Legislating Privacy: Technology, Social Values and Public Policy, The University of North Carolina Press, 1995; Kirsty Hughes, ‘The social value of privacy, the value of privacy to society and human rights discourse’ in Beate Roessler and Dorota Mokrosinska (eds), Social Dimensions of Privacy: Interdisciplinary Perspectives (Cambridge University Press)). Privacy is a prerequisite for liberal democracies because it sets limits on surveillance by acting as a shield for groups and individuals (Alan F. Westin, Privacy and Freedom, New York: Atheneum (1967), p24). It is also important in terms of voter autonomy and its attraction of talented people to public office (Hughes, p228-229). Privacy is also important for social relations (ibid, p229), even more so in that privacy-invasive technologies can affect social life more generally (Beate Roessler and Dorota Mokrosinska, p2). A failure to protect social relations is a failure to protect the democratic state (Francesca Malloggi, ‘The Value of Privacy for Social Relationships’, Social Epistemology Review and Reply Collective 6, no. 2 (2017): 68-77, p70).

These Powers do NOT just affect Individuals

Another problem with the IPT’s premise is that the suggestion that measures such as BCD acquisition/use affect only an individual’s privacy is simply not true. It should be obvious from the very name and nature of the powers that they are not targeted at individuals (para 2.1), something which the Respondents in Privacy International even attested to [9(ii)]. The draft BCD Code of Practice under the Investigatory Powers Act 2016 (IPA 2016) notes that ‘if the requirements of this chapter are met then the acquisition of all communications data generated by a particular CSP (Communications Service Provider e.g. BT, Google, iCloud) could, in principle, be lawfully authorised’ (para 3.5). Thus, any suggestion that the issue at hand concerns only an individual is palpably false. As the Grand Chamber (GC) of the European Court of Human Rights (ECtHR) noted in S and Marper v United Kingdom, the:

[M]ere storing of data relating to the private life of an individual amounts to an interference within the meaning [of Article 8]…subsequent use of the stored information has no bearing on that finding [67]. 

Due to the nature of the BCD powers, to say they affect only the individual is to ignore the reality of such sweeping powers, which constitute mass interference with a ‘substantial portion, or even all of the relevant population’ [256] and do have chilling effects on totally innocent people (Rozemarijn van der Hilst, (2009), ‘Human Rights Risks of Selected Detection Technologies: Sample Uses by Governments of Selected Detection Technologies’ p20; German Forsa Institute, Meinungen der Bundesbürger zur Vorratsdatenspeicherung, 28 May 2008). Just like blanket data retention, BCD acquisition/use would ‘relate to all communications effected by all users, without requiring any connection whatsoever with’ [180] national security.

Article 8 is not limited to Privacy

As ‘private life’ in Article 8 of the European Convention on Human Rights (ECHR) is not susceptible to exhaustive definition [66], this means that the notion is much wider than that of privacy (p12). This encompasses a sphere within which every individual can freely develop and fulfil his personality, both in relation to others and with the outside world (ibid). Private life also includes one’s physical and psychological integrity [58], autonomy [ibid] as well as a right to a form of informational self-determination [137], physical, social [159] and ethnic identity [58], professional activities [29], a certain degree of anonymity [42] and the protection of personal data (S and Marper, [103]).

This does not even begin to consider how such concepts overlap (p10-11). Nor is Article 8 limited to private life, as ‘correspondence’ [44] and the potential for ‘home’ [41] and family life (p21) (even more so now under the new regime of the IPA 2016 in light of the Internet of Things etc) are equally important in the surveillance context. The measure ‘strikes at freedom of communication between users of the postal and telecommunication services’ [41] because we increasingly use the internet to ‘establish and support personal relationships, bank, shop, to gather the news, to decide where to go on holiday, to concerts, museums or football matches. Some use it for education and for religious observance – checking the times and dates of festivals or details of dietary rules.’ Very few aspects of our lives are untouched by the internet (Paul Bernal, ‘Data gathering, surveillance and human rights: recasting the debate’ (2016) Journal of Cyber Policy, 1:2 243, p247).

Correspondence becomes particularly important when it affects legal professional privilege (LPP) and journalistic sources. This was a criticism of data retention laws, in that they did not provide any exceptions for professional secrecy (Tele2 and Watson, [105]). The ECtHR in Kopp v Switzerland noted that Swiss law violated Article 8 because it provided ‘no guidance on how authorities should distinguish between protected and unprotected attorney-client communications’ [73-75]. BCD acquisition/use suffers from the same drawbacks.

Thus, when the IPT refers merely to individual privacy, it does so without acknowledging the breadth and multifaceted nature of Article 8, or how surveillance measures impact upon its various aspects, which limits its ability to give a thorough assessment and results in a possible divergence from the ECtHR.

Confining the discussion to Privacy forgoes the broader context of Fundamental Rights Protection

[i]t is hard to imagine, for example, being able to enjoy freedom of expression, freedom of association, or freedom of religion without an accompanying right to privacy (Benjamin J. Goold, ‘Surveillance and the Political Value of Privacy’ (2009) 1:4 Amsterdam Law Forum 3, p4).

When Article 8 is confined to the narrow aspect of the privacy of a suspected terrorist, not only does this overlook the breadth of Article 8 (mentioned above), it does not even entertain other fundamental rights that might be at stake. This is a view acknowledged by the then Independent Reviewer of Terrorism Legislation, David Anderson (para 2.12), and by Paul Bernal. The CJEU were also aware of this to some degree in Tele2 and Watson, where they noted that data retention could have an effect on the use of means of electronic communication and, consequently, on the exercise by the users thereof of their freedom of expression, guaranteed in Article 11 of the CFR [101], which is essentially equivalent to Article 10 ECHR.

Article 10 ECHR: Freedom of Expression

Article 10 applies to communications via the internet [34] (in French), regardless of the message conveyed [55] and irrespective of its nature [47]. The ECtHR regards freedom of expression as constituting ‘one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual’s self-fulfilment’ [100]. Not only does this highlight freedom of expression’s value to democracy, it highlights one of the various ways in which Article 10 interplays with Article 8, i.e. self-development [117].

Another way in which Articles 10 and 8 interlink is anonymity, where Lord Neuberger noted that in the context of anonymous speech, Article 8 reinforces Article 10 (para 25). Within that context, Neuberger continued that Article 8 rights are of fundamental importance (ibid, para 42). Political reporting and investigative journalism attract a high level of protection under Article 10 [129]. The then United Nations Special Rapporteur on freedom of expression, Frank La Rue, highlighted that restrictions on anonymity can have a chilling effect, which dissuades the free expression of information and ideas (para 49).

Article 9 ECHR: Freedom of Religion, Thought and Conscience

Like Article 10, Article 9 is regarded as one of the foundations of a democracy [34]. Article 9 entails that the freedom to manifest one’s religion may be exercised in public or in private [78]. It also includes the absolute and unconditional right to hold a belief [ibid, 79]. The right to manifest one’s belief has a negative aspect, in that an individual has a ‘right not to be obliged to disclose his or her religion or beliefs and not to be obliged to act in such a way that it is possible to conclude that he or she holds’ such beliefs [41]. BCD acquisition/use makes this entirely possible (para 1.1), causing a notable chilling effect.

Article 11 ECHR: Freedom of Association/Assembly

The GC of the ECtHR has referred to freedom of assembly, like Articles 9 and 10, as one of the foundations of a democratic society [91]. Similarly, freedom of association is of utmost importance because it ‘enables individuals to protect their rights and interests in alliance with others’ (p4). The Steering Committee on Media and Information Society, in its guide to human rights for Internet users, noted when referring to Article 11 that users have ‘the right to peacefully assemble and associate with others using the Internet’ (para 61). Just as noted above with other Convention rights, surveillance has harmful effects on freedom of association (see also Valerie Aston, ‘State surveillance of protest and the rights to privacy and freedom of assembly: a comparison of judicial and protester perspectives’ (2017) EJLT 8:1).

The enjoyment of the rights contained in Articles 9-11, which are foundations for democracy (especially online), is underpinned by Article 8.

How the premise impacts upon the IPT’s reasoning

Given the above, it is important to discuss how the lack of consideration of the potential effects on other fundamental rights affects the IPT’s reasoning.

It’s not all about Utility

The IPT discussed the evidence supporting BCD acquisition/use, ranging from Anderson’s report and the case studies within it to MI5 witness statements (Privacy International, [11-17]). The IPT makes reference to the critical value of BCD acquisition/use and the need for the haystack in order to find the needle. A quick counter to the second point is ‘[i]f you’re looking for a needle in a haystack, how does it help to add hay?’ The problem with the needle-in-the-haystack argument is that it could be used to justify any amount of data being stored/used, even all that is available.

Furthermore, this part of the judgment concerns what the IPT considers to be ‘The Facts’, yet on closer examination, not everything highlighted by the IPT is fact. For example, the IPT refers to the Respondents’ witnesses speaking persuasively and refers to an MI5 witness. If the IPT were to regard witness statements as facts, then, for example, Bruce Schneier’s or former National Security Agency (NSA) official William Binney’s denunciations of mass surveillance should be given equal weight. There is no suggestion that this is what was (or should have been) presented before the IPT, but it highlights the weight given to opinions by the IPT. Discussing only the evidence of the Respondents also demonstrates the problematic information asymmetry in the surveillance context where:

[I]nformation asymmetrification provides a foundation on which the existence of elites is built and possibilities of strengthening that asymmetry will be enthusiastically sought (Geoffrey Lightfoot and Tomasz Piotr Wisniewski, ‘Information asymmetry and power in a surveillance society’ (2014) Information and Organization 24 214–235, p230).

Regarding the first point, the value of a measure does not necessarily make it necessary [48]. The IPT considers that although BCD acquisition/use is essential, this does not completely resolve the question of proportionality (Privacy International, [16]). Lord Kerr in his dissenting opinion in Beghal v DPP quite rightly noted that ‘powers which can be used in an arbitrary or discriminatory way are not transformed to a condition of legality simply because they are of proven utility’ [93]. Although the IPT did find s.94 not to be compliant with Article 8 prior to its avowal, this follows a trend of watering down the prescribed by/in accordance with law requirements noted in Kennedy v United Kingdom, where for the IPT, honesty appears to be synonymous with legality.

Moreover, the supporting evidence for BCD acquisition/use does not refer to what type of communications data was used, how it was used, or why it was key. The IPT noted that nothing in the evidence it examined contradicts what was set out in paragraphs 11-16. This is problematic for two reasons. First, if the IPT only considered evidence from the Respondents, then it would make sense that the evidence presented was less likely to contradict the arguments put forward, and it thus becomes a one-sided argument. Secondly, as Bruce Schneier noted, ‘no method of surveillance or inquiry will ever stop a lone gunman.’ Although the murder of Fusilier Lee Rigby involved two assailants, the Intelligence and Security Committee (ISC) noted that MI5 ‘put significant effort into investigating [Michael Adebolajo] and employed a broad range of intrusive techniques. None of these revealed any evidence of attack planning.’ What this demonstrates is the contrary view that all the surveillance in the world did not prevent individuals ‘such as the Fort Hood shooter, or Anders Behring Breivik, or the Charlie Hebdo attackers.’ Therefore, the IPT draws attention to its obscured view, given that it has inquisitorial powers (s.68(2)(b) of the Regulation of Investigatory Powers Act 2000 (RIPA 2000)) and could have sought information regarding counter-arguments.

No Genuine Intrusion?

When the IPT discussed the operation of s.94 TA 1984, it noted that access to BCD is either targeted or, more likely, involves electronic trawling of masses of data which are not ‘read’ to find the needle in the haystack (Privacy International, [19]). The IPT continues that a ‘miniscule quantity of the data trawled is ever examined. There is thus no genuine intrusion to any save that miniscule proportion’ (ibid). This reasoning of the IPT is almost as if the UK exists in a vacuum when it comes to the findings of the GC in S and Marper. The IPT’s reasoning is that genuine intrusion follows only when communications data is accessed/examined. This is why confining the issue to privacy proves problematic, because the GC in S and Marper noted that the protection of personal data is of fundamental importance to the enjoyment of private and family life. This protection begins as soon as the data is processed and retained, which marks the genesis of genuine intrusion; any subsequent use has no bearing on this. The IPT’s reasoning follows the sentient being argument, which suggests that privacy is only interfered with when private data is read by an intelligence officer. Following this argument to its logical conclusion would sow the seeds of the total destruction of private life and data protection as surveillance becomes increasingly automated, e.g. by analogy automatic number plate recognition (ANPR) [169-170]; see also the CJEU Opinion on PNR [121-132]. Using last century’s arguments (if one could even call them that) is not suitable today.

The IPT maintains its approach of significantly downplaying the severity of the interference caused by storing and using communications data. The IPT had previously accepted a false analogy from the Respondents equating GPS data (a particular type of communications data) with communications data in general, to argue that it is not as serious as interception (Matthew White, ‘Protection by Judicial Oversight, or an Oversight in Protection?’ (2017) Journal of Information Rights, Policy and Practice 2:1, p9). It has been argued that in giving weight to this position:

[I]t did so by considering a case of an isolated specific type of data, which cannot be used to justify an argument that interference is less severe whilst ignoring the cumulative total of the different types of communications data (ibid).

Malte Spitz of the German Green party published data that was retained under Germany’s data retention laws, from which Zeit Online created an interactive map detailing Spitz’s movements. Zeit Online’s Kai Biermann continued that this data revealed:

[W]hen Spitz walked down the street, when he took a train, when he was in an airplane. It shows where he was in the cities he visited. It shows when he worked and when he slept, when he could be reached by phone and when was unavailable. It shows when he preferred to talk on his phone and when he preferred to send a text message. It shows which beer gardens he liked to visit in his free time. All in all, it reveals an entire life.

Advocate General (AG) Saugmandsgaard Øe in Tele2 and Watson noted that, in the individual context, a general data retention obligation would facilitate interference equally as serious as targeted surveillance measures, including those which intercept the content of communications [254]. AG Saugmandsgaard Øe continued that the risks associated with access to communications data ‘may be as great or even greater than those arising from access to the content of communications’ [259]. For example, replying to an email saying ‘lmao’ may not reveal much to an observer, but the observer could learn which email address the message was sent from and to, the time and date the message was sent, the location from which it was sent, what browser was being used, what device was being used, etc. This simple analogy demonstrates why, yet again, the IPT is incorrect to downplay the revealing nature of communications data, given that people get killed based on it. This seriousness only intensifies when the acquisition/use is in bulk.

Powerful Submissions?

The IPT highlighted the powerful (and hence very persuasive (Privacy International, [51])) submissions made by the Respondents:

The use of bulk acquisition and automated processing produces less intrusion than other means of obtaining information.
The balance between privacy and the protection of public safety is not and should not be equal. Privacy is important and abuse must be avoided by proper safeguards, but protection of the public is preeminent.
The existence of intrusion as a result of electronic searching must not be overstated, and indeed must be understood to be minimal.
There is no evidence of inhibition upon, or discouragement of, the lawful use of telephonic communication. Indeed the reverse is the case.
Requirements or safeguards are necessary but must not, as the Respondents put it, eviscerate or cripple public protection, particularly at a time of high threat [50].

It is important to deal with these points individually (some of which are already dealt with above).

The Respondents maintain that BCD acquisition/use is less intrusive than other methods of gathering information, without explaining which other methods are more intrusive or why, nor why this is the least restrictive measure to achieve the objective [260].

As noted above, this is not just an issue of narrow privacy, but an issue of other applicable fundamental rights protected by the ECHR. The premise of a balance between privacy and public safety, i.e. security, is a miscast (Paul Bernal, p244), misleading (ibid) and false (see here, here and here) one to begin with. It ignores the fact that demands for security can actually reduce security, and therefore safety (Paul Bernal, p224; Harold Abelson et al, ‘Keys under doormats: mandating insecurity by requiring government access to all data and communications’, Journal of Cybersecurity, 2015, 1–11, p5), and otherwise prove ineffectual (see here and here). It also suggests that privacy should always be on the back foot when the issue concerns the protection of the public, when the irony is that it is the public’s data that is being acquired and used (see the social dimension of privacy above, which protects against utilitarian calculation of majoritarian societal interests and/or political whims (Kirsty Hughes, p227)). It also assumes that when Convention rights are at stake, the only question that needs to be answered is whether the appropriate balance has been struck, forgoing legality and necessity.

These types of arguments would seemingly fall into the narrow nothing-to-hide-like argument that looks for a singular type of injury, be it some grave physical violence, a loss of substantial money or something severely embarrassing (Daniel Solove, Nothing to Hide: The False Tradeoff between Privacy and Security (2011), Yale University Press, p29). This of course also ignores both European Courts on the severity of the mere storage of data interfering with private/family life, freedom of expression/association [107] and data protection.

Contrary to what the Respondents assert, there is evidence of chilling effects due to surveillance measures, some highlighted above. Moreover, assessing chilling effects should not just be a matter of measuring inhibitions, but also of actual methods of protecting online activity. There was an increase in Virtual Private Network (VPN) subscriptions (a VPN essentially aims to hide online activity) in Australia when its national data retention laws came into force, and in the UK when the IPA 2016 and Digital Economy Act 2017 (DEA 2017) were in passage. Or consider the increasing use of ad blockers, which 11 million devices in the UK now have. As Edward Snowden revealed, ‘government surveillance efforts are sometimes bolstered by online advertising practices.’ Moreover, Solove contends that the value of protecting against chilling effects is not measured simply by their effects on individuals exercising their rights, but by their harms to society because, among other things, ‘they reduce the range of viewpoints expressed and the degree of freedom with which to engage in political activity’ (Daniel J. Solove, ‘’I’ve Got Nothing to Hide’ and Other Misunderstandings of Privacy’ (2007) San Diego Law Review 44 745, p746). It is true that the uptake of technology, e.g. smartphones, has increased, but this does not necessarily disprove the idea of chilling effects. Such a view ignores the fact that many may not be fully aware of what information is being collected (Sandra Braman, ‘Tactical memory: The politics of openness in the construction of memory’ (2006) First Monday 11(7); Connor Sheridan, ‘Foucault, Power and the Modern Panopticon’ (2016) Senior Theses, Trinity College, Hartford, CT, Trinity College Digital Repository, p48; Majority of Brits Unaware of Online Surveillance), where awareness leaves open the possibility of resistance (Andrew Roberts, ‘Privacy, Data Retention and Domination: Digital Rights Ireland Ltd v Minister for Communications’ (2015) 78(3) MLR 522–548, p545).
This resistance could range from not using the technology, to finding ways to circumvent surveillance law (self-regulatory), to protests (Hintz, A. & Dencik, L. (2016), ‘The politics of surveillance policy: UK regulatory dynamics after Snowden’, Internet Policy Review, 5(3), p8) (political), to legal action, all designed to protect fundamental rights.

This is an ‘ends justify the means’ justification. Not every interference with, or derogation from, the principle of the protection of fundamental rights is necessary in a democratic society.

Prior Authorisation

The IPT noted that Secretary of State authorisations complied with the ECHR for reasons set out in a prior judgment. The IPT were of the opinion that the ECtHR in Szabo & Vissy v Hungary was not recommending any new safeguards because Hungarian law fell below even existing principles [60]. This does not, of course, consider cases such as Dumitru Popescu v Romania [71-73], Iordachi and Others v Moldova [40], and Uzun v Germany [72], all endorsing the view that the body issuing authorisations for interception should be independent and that there must be either judicial control or control by an independent body over the issuing body's activity.

So, when the ECtHR in Szabo endorses the view in Iordachi that ‘control by an independent body, normally a judge with special expertise, should be the rule and substitute solutions the exception, warranting close scrutiny’ [77], it is difficult to suggest that the ECtHR in Szabo was not strongly advocating prior judicial control (Matthew White, p15). The ECtHR did acknowledge that post factum oversight may counterbalance the shortcomings of initial oversight (referring to the IPT in Kennedy) (Szabo, [77]). However, it has already been argued that this counterbalance is not adequate (Matthew White, p14-16).

Notification

According to the IPT, a requirement of notification is inadequate in the circumstances of national security because (a) national security is ongoing and (b) it relates to further operations and methodologies (Privacy International, [62]). The IPT also noted that this is not required for compliance with the ECHR [63]. This, however, overlooks Association for European Integration and Human Rights and Ekimdzhiev v Bulgaria, where the ECtHR found violations of Articles 8 and 13 (effective remedy) for, among other things, the lack of a notification procedure [94] and [103]. Yet Ekimdzhiev concerned national security, and the ECtHR even referred to notification in the national security context in Germany for both individual (Klass v Germany, [11]) and general surveillance measures (Weber and Saravia v Germany, [51-54]), and in Leander v Sweden [31]. This is permissible due to the ECtHR establishing the principle that:

[A]s soon as notification can be made without jeopardising the purpose of the surveillance after its termination, information should be provided to the persons concerned (Ekimdzhiev, [90]).

This establishes that, to the ECtHR’s mind, notification in the national security context is not inappropriate or inadequate, considering this has been the practice in Germany for decades. Furthermore, the ECtHR acknowledged that it would not be desirable to notify in all circumstances, therefore leaving that possibility open, whereas the IPT would prefer it kept shut. Also, in the national security context, the GC of the ECtHR in Roman Zakharov v Russia noted that notification was inextricably linked ‘to the effectiveness of remedies before the courts and hence to the existence of effective safeguards against the abuse of monitoring powers’ [234]. A point which Paul de Hert and Franziska Boehm share.

Although the GC referred to the alternative to notification in the UK system, i.e. the IPT’s jurisdiction (Roman Zakharov, [234]), de Hert and Boehm have questioned whether Kennedy ‘is capable of responding to the challenges arising out of the use of new surveillance techniques’ (Franziska Boehm and Paul de Hert, ‘The rights of notification after surveillance is over: ready for recognition?’ (Yearbook of the Digital Enlightenment Forum, IOS Press 2012), pp. 19-39, p37). Boehm and de Hert continue that in light of powers such as data retention and ‘fishing expeditions’ that target a greater number of people without suspicion, a notification duty appears to be an effective tool to prevent abuse (ibid, p37-8). Finally, Boehm and de Hert note that the Belgian Constitutional Court has now adopted the notification principle as a requirement for compliance with Article 8 (ibid, p38). The IPT highlights difficulties with the notification of BCD acquisition/use: should notification go to everyone whose data is in the database, to those subject to an electronic search, or to all those who feature in data subject to targeted access (Privacy International, [64])? Accepting this premise would be to accept the powers as exercised to begin with, which is at the heart of this issue.

Conclusions: Be careful what you wish for

Ultimately, the IPT referred to the CJEU the question of whether the Tele2 and Watson requirements apply in the national security context (ibid, [72]). This blog post has argued that much of the IPT’s reasoning with regards to fundamental rights protection is lacking. By confining itself to a restrictive notion of the individual privacy of a person of interest, the IPT blinds itself to the broader notions of Article 8 and the other fundamental rights it underpins. Some aspects of the IPT’s reasoning (and the Respondents’ arguments) are not even consistent with the very human rights system (the ECHR) the Respondents are seeking to rely upon. The ECtHR has firmly noted that:

Given the technological advances since the Klass and Others case, the potential interferences with email, mobile phone and Internet services as well as those of mass surveillance attract the Convention protection of private life even more acutely (Szabo, [53]).

The GC in Roman Zakharov found Russian law to be in violation of Article 8 because interferences with privacy rights were ordered ‘haphazardly, irregularly or without due and proper consideration’ (Roman Zakharov, [267]) in the national security context. Judge Pinto de Albuquerque noted that Roman Zakharov was a rebuke of ‘strategic surveillance’ (Szabo, Concurring Opinion of Judge Pinto de Albuquerque, [35]), which accords with a previous concurring opinion of Judge Pettiti that surveillance should not be used for ‘fishing’ exercises to bring in information (Kopp). If, as the IPT says, a ‘miniscule quantity of the data trawled is ever examined’, how would this square with the position that ‘[t]he automatic storage for six months of clearly irrelevant data cannot be considered justified under Article 8’ (Roman Zakharov, [255])? Time will tell if the ECtHR follows this trend in Big Brother Watch and Others v UK, Bureau of Investigative Journalism and Alice Ross v UK and 10 Human Rights Organisations v UK. Therefore, the IPT should not convince itself of the ‘illusory conviction that global surveillance is the deus ex machina capable of combating the scourge of global terrorism’ (Szabo, Concurring Opinion of Judge Pinto de Albuquerque, [20]). Surveillance has never been just an issue of privacy or private life, or else the ECtHR would never have uttered its awareness:

[O]f the danger such a law poses of undermining or even destroying democracy on the ground of defending it, affirms that the Contracting States may not, in the name of the struggle against espionage and terrorism, adopt whatever measures they deem appropriate (Klass, [49]).

Barnard & Peers: chapter 9

Photo credit: Pixabay

Tuesday, 10 January 2017

A Threat to Human Rights? The new e-Privacy Regulation and some thoughts on Tele2 and Watson




Matthew White, PhD candidate, Sheffield Hallam University

Introduction

In a follow-up to last Christmas's post, on 10 January 2017 the European Commission released the official version of the proposed Regulation on Privacy and Electronic Communications (e-Privacy Regs). Just as the last post concerned the particular aspect of data retention, so will this one.

Just as in the earlier leaked version, the proposal does not include any specific provisions in the field of data retention (para 1.3). That paragraph continues that Member States are free to keep or create national data retention laws, provided that they are 'targeted' and that they comply with European Union (EU) law, taking into account the case-law of the Court of Justice of the European Union (CJEU) and its interpretation of the e-Privacy Directive and the Charter of Fundamental Rights (CFR). Regarding the CJEU's interpretation, the proposal specifically refers to Joined Cases C-293/12 and C-594/12 Digital Rights Ireland and Seitlinger and Others, and Joined Cases C-203/15 and C-698/15 Tele2 Sverige AB and Secretary of State for the Home Department. Aspects of the latter case are the focus of this post; the case itself has been thoroughly discussed by Professor Lorna Woods.

So, when is the essence of the right adversely affected?

Before discussing certain aspects of Tele2 and Watson, it is first important to draw attention to the provision which enables data retention in the new e-Privacy Regs. Article 11 allows the EU or its Member States to restrict the rights contained in Articles 5-8 (confidentiality of communications, permissions on processing, storage and erasure of electronic communications data, and protection of information stored in and related to end-users' terminal equipment). From Article 11, it is clear that this can include data retention obligations, so long as they respect the essence of the rights and are necessary, appropriate and proportionate. In Tele2 and Watson the CJEU noted that any limitation of rights recognised by the CFR must respect the essence of those rights [94]. The CJEU accepted the Advocate General (AG)'s view that data retention creates an interference as serious as interception, and that the risks associated with access to communications data may be greater than those of access to the content of communications [99]. Yet the CJEU was reluctant to hold that data retention (and access to retained data) adversely affects the essence of those rights [101]. This appears to highlight a problem in the CJEU's reasoning: if the CJEU, like the AG, accepts that retention of and access to communications data is at least on a par with access to content, it makes little sense then to be reluctant to hold that data retention adversely affects the essence of those rights. The CJEU does so without offering any distinction or reasoning for this differential treatment, which suggests that perhaps the CJEU itself does not fully respect the essence of those rights in the context of data retention.

The CJEU’s answer seems only to limit catch-all powers

The thrust of the CJEU’s judgment in Tele2 and Watson was that general and indiscriminate data retention obligations are prohibited at an EU level. But as I have highlighted previously, the CJEU’s answer was only in response to a very broad question from Sweden, which asked:

[A] general obligation to retain traffic data covering all persons, all means of electronic communication and all traffic data without any distinctions, limitations or exceptions for the purpose of combating crime…compatible with [EU law]?

Therefore, provided that national laws do not provide for the capturing of all data of all subscribers and users for all services in one fell swoop, they may arguably be compatible with EU law. Both the e-Privacy Regs and the CJEU refer to ‘targeted’ retention [108, 113]. The CJEU gave an example of geographical criteria for retention, of which David Anderson Q.C. asks whether the CJEU meant that ‘it could be acceptable to perform “general and indiscriminate retention” of data generated by persons living in a particular town, or housing estate, whereas it would not be acceptable to retain the data of persons living elsewhere?’ This is entirely possible given the reference from Sweden and the answer from the CJEU. In essence, the CJEU has permitted discriminatory general and indiscriminate data retention which would, in any event, respect the essence of those rights.

Data retention is our cake, and only we can eat it

A final point on Tele2 and Watson is that the CJEU held that national laws on data retention are within the scope of EU law [81]. This by itself may not raise any concerns about protecting fundamental rights, but what the CJEU rules later in the judgment may. The CJEU held that the interpretation of the e-Privacy Directive (and therefore national Member State data retention laws) “must be undertaken solely in the light of the fundamental rights guaranteed by the Charter” [128]. The CJEU has seemingly given itself exclusive competence to determine how rights are best protected in the field of data retention. It is clear from the subsequent paragraph that the CJEU seeks to protect the autonomy of EU law above anything else, even fundamental rights [129]. This is despite the ECHR forming part of the general principles of EU law, which Article 15(1) mentions (Article 6(3) of the Treaty on European Union (TEU) specifically refers to the ECHR as such). Article 11 of the e-Privacy Regs refers to restrictions respecting the ‘essence of fundamental rights and freedoms’, and only time will tell whether the CJEU would interpret this as referring only to the CFR. Recital 27 of the e-Privacy Regs, just like Recitals 10 and 30 of the e-Privacy Directive, refers to compliance with the ECHR, but as highlighted previously, Recitals are not legally binding.

Is the CJEU assuming too much?

A further concern is that, had the European Commission added general principles of EU law into Article 11, the CJEU may simply have ignored them, just as it did in Tele2 and Watson. The problem with the CJEU’s approach is that it assumes that this judgment offers adequate protection of human rights in this context. The ECHR has always been the minimum floor, but it appears the CJEU wants the CFR to be the ceiling, whether over national human rights protection or protection guaranteed by the ECHR. What if that ceiling is lower than the floor? The AG in Tele2 and Watson stressed that the CFR must never offer protection inferior to the ECHR [141]. But as I have argued before, the EU jurisprudence on data retention is just that, offering inferior protection to the ECHR, and the qualification by the CJEU in Tele2 and Watson does not alter this. This position is strengthened by Judge Pinto de Albuquerque in his concurring opinion in the European Court of Human Rights judgment in Szabo. He believed that:

[M]andatory third-party data retention, whereby Governments require telephone companies and Internet service providers to store metadata about their customers’ communications and location for subsequent law-enforcement and intelligence agency access, appeared neither necessary nor proportionate [6].

Of course, Judge Pinto de Albuquerque could have been referring to the type of third-party data retention which requires Internet Service Providers (ISPs) to intercept data from Over The Top (OTT) services, but his description is more in line with data retention of services’ own users and subscribers.

Conclusions

Although the CJEU has prohibited general indiscriminate data retention, it does not seem to have prevented targeted indiscriminate data retention. If the European Court of Human Rights (ECtHR) were ever to rule on data retention and follow its jurisprudence and the opinion of Judge Pinto de Albuquerque, this may put EU law in violation of the ECHR. This would ultimately put Member States in a damned-if-they-do, damned-if-they-don’t situation: comply with the ECHR and violate the autonomy of EU law, or comply with EU law and violate the ECHR. When the minimum standards of human rights protection in this context are not adhered to because of EU law, the ECHR should prevail. Anything less is a threat to human rights, meaning that the (even if well-intentioned) CJEU can be one too.

JHA4: chapter II:7

Photo credit: goldenfrog.com

Wednesday, 21 December 2016

Data retention and national law: the ECJ ruling in Joined Cases C-203/15 and C-698/15 Tele2 and Watson (Grand Chamber)




Lorna Woods, Professor of Internet Law, University of Essex

Introduction

Today's judgment in these important cases concerns the acceptability from a human rights perspective of national data retention legislation maintained even after the striking down of the Data Retention Directive in Digital Rights Ireland (Case C-293/12 and 594/12) (“DRI”) for being a disproportionate interference with the rights contained in Articles 7 and 8 EU Charter of Fundamental Rights (EUCFR).  While situated in the context of the Privacy and Electronic Communications Directive (Directive 2002/58), the judgment sets down principles regarding the interpretation of Articles 7 and 8 EUCFR which will be applicable generally within the scope of EU law. It also has possible implications for the UK’s post-Brexit relationship with the EU.

Background and Facts

The Privacy and Electronic Communications Directive requires the confidentiality of communications, including the data about communications to be ensured through national law. As an exception it permits, under Article 15, Member States to take measures for certain public interest objectives such as the fight against terrorism and crime, which include requiring public electronic communications service providers to retain data about communications activity. Member States took very different approaches, which led to the enactment of the Data Retention Directive (Directive 2006/24) within the space for Member State action envisaged by Article 15.  With that directive struck down, Article 15 remained the governing provision for exceptions to communications confidentiality within the field harmonised by the Privacy and Electronic Communications Directive.  This left questions as to what action in respect of requiring the retention of data could be permissible under Article 15, as understood in the light of the EUCFR.

The cases in today’s judgment derive from two separate national regimes. The first, concerning Tele2, arose when – following the DRI judgment – Tele2 proposed to stop retaining the data specified under Swedish implementing legislation in relation to the Data Retention Directive. The second arose from a challenge to the Data Retention and Investigatory Powers Act 2014 (DRIPA) which had been enacted to provide a legal basis in the UK for data retention when the domestic regime implementing the Data Retention Directive fell as a consequence of the invalidity of that directive.  Both sets of questions referred essentially asked about the impact of the DRI reasoning on national regimes, and whether Articles 7 and 8 EUCFR constrained the States’ regimes.

The Advocate General handed down an opinion in July (noted here) in which he opined that while mass retention of data may be possible, it would only be so when adequate safeguards were in place.  In both instances, the conditions – in particular those identified in DRI – were not satisfied.

Judgment

Scope of EU Law

A preliminary question is whether the data retention, or the access to such data by police and security authorities, falls within EU law.  While the Privacy and Electronic Communications Directive regulated the behaviour of communications providers generally, Article 1(3) of that Directive specifies that matters covered by Titles V and VI of the TEU at that time (e.g. public security, defence, State security) fall outside the scope of the directive, which the Court described as relating to “activities of the State”. Further, Article 15(1) permits the State to take some measures resulting in the infringement of the principle of confidentiality found in Article 5(1), which again “concern activities characteristic of States or State authorities, and are unrelated to fields in which individuals are active” [para 72]. While there seems to be overlap between Article 1(3) and Article 15(1), this does not mean that matters permitted on the basis of Article 15(1) fall outside the scope of the directive, as “otherwise that provision would be deprived of any purpose” [para 73].

In the course of submissions to the Court, a distinction was made between the retention of data (by the communications providers) and access to the data (by police and security services).  Accepting this distinction would allow a line to be drawn between the two, with retention, as an activity of the commercial operator regulated by the Privacy and Electronic Communications Directive, falling within its scope, and access, as an activity of the State, lying outside it. The Court rejected this analysis and held that both retention and access lie within the field of the Privacy and Electronic Communications Directive [para 76]. It argued that Article 5(1) guarantees confidentiality of communications from the activities of third parties, whether they be private actors or state authorities. Moreover, the effect of the national legislation is to require the communications providers to give access to the state authorities, which in itself is an act of processing regulated by the Privacy and Electronic Communications Directive [para 78]. The Court also noted that the sole purpose of the retention is to be able to give such access.

Interpretation of Article 15(1)

The Court noted that the aim of the Privacy and Electronic Communications Directive is to ensure a high level of protection for data protection and privacy. Article 5(1) established the principle of confidentiality and that “as a general rule, any person other than the user is prohibited from storing, without the consent of the users concerned, the traffic data”, subject only to technical necessity and the terms of Article 15(1) (citing Promusicae) [para 85].  This requirement of confidentiality is backed up by the obligations in Article 6 and 9 specifically dealing with restrictions on the use of traffic and location data. Moreover, Recital 30 points to the need for data minimisation in this regard [para 87]. So, while Article 15(1) permits exceptions, they must be interpreted strictly so that the exception does not displace the rule; otherwise the rule would be “rendered largely meaningless” [para 89].

As a result of this general orientation, the Court held that Member States may only adopt measures for the purposes listed in the first sentence of Article 15(1) and those measures must comply with the requirements of the EUCFR.  The Court, citing DRI (at paras 25 and 70), noted that in addition to Articles 7 and 8 EUCFR, Article 11 EUCFR – protecting freedom of expression – was also in issue. The Court noted the need for such measures to be necessary and proportionate and highlighted that Article 15 provided further detail in the context of communications whilst Recital 11 to the Privacy and Electronic Communications Directive requires measures to be “strictly proportionate” [para 95].

The Court then considered these principles in the light of the reference in Tele2 at paras 97 et seq of its judgment. Approving expressly the approach of the Advocate General on this point, it underlined that communications “data, taken as a whole, is liable to allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained” and that such data is no less sensitive than content [para 99]. The interference in the view of the Court was serious and far-reaching in relation to Articles 7, 8 and 11.  While Article 15 identifies combatting crime as a legitimate objective, the Court – citing DRI – limited this so that only the fight against serious crime could be capable of justifying such intrusion.  Even the fight against terrorism “cannot in itself justify that national legislation providing for the general and indiscriminate retention of all traffic and location data should be considered necessary” [para 103].  The Court stressed that the regime provides for “no differentiation, limitation or exception according to objectives pursued” [para 105].  The Court did confirm that some measures would be permissible:

… Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8 and 11 and Article 52(1) of the Charter, does not prevent a Member State from adopting legislation permitting, as a preventive measure, the targeted retention of traffic and location data, for the purpose of fighting serious crime, provided that the retention of data is limited, with respect to the categories of data to be retained, the means of communication affected, the persons concerned and the retention period adopted, to what is strictly necessary. [para 108]

It then set down some relevant conditions:

Clear and precise rules “governing the scope and application of such a data retention measure and imposing minimum safeguards, so that the persons whose data has been retained have sufficient guarantees of the effective protection of their personal data against the risk of misuse” [para 109].

while “conditions may vary according to the nature of the measures taken for the purposes of prevention, investigation, detection and prosecution of serious crime, the retention of data must continue nonetheless to meet objective criteria, that establish a connection between the data to be retained and the objective pursued” [110].

The Court then emphasised that there should be objective evidence identifying a public whose data is likely to reveal a link, even an indirect one, with serious criminal offences, and thereby contribute in one way or another to fighting serious crime or to preventing a serious risk to public security. The Court accepted that geographical factors could be one such ground, on the basis “that there exists, in one or more geographical areas, a high risk of preparation for or commission of such offences” [para 111].

Conversely,

…Article 15(1) of Directive 2002/58, read in the light of Articles 7, 8 and 11 and Article 52(1) of the Charter, must be interpreted as precluding national legislation which, for the purpose of fighting crime, provides for the general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication [para 112].

Acceptability of legislation where (1) the measure is not limited to serious crime; (2) where there is no prior review; and (3) where there is no requirement that the data stays in the EU.

This next section deals with the first question referred in the Watson case, as well as the Tele2 reference.

As regards the first point, the answer following the Court’s approach at paragraphs 90 and 102 is clear: only measures justified by reference to serious crime would be justifiable.  As regards the second element, the Court noted that it is for national law to lay down the conditions of access so as to ensure that the measure does not exceed what is strictly necessary.  The conditions must be clear and legally binding. The Court argued that since general access could not be considered strictly necessary, national legislation must set out, by reference to objective criteria, the circumstances in which access would be permissible.  Referring to the European Court of Human Rights (ECtHR) judgment in Zakharov, the Court specified:

access can, as a general rule, be granted, in relation to the objective of fighting crime, only to the data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime [para 119].

It then distinguished the general fight against crime from the fight against terrorism to suggest that in the latter case:

access to the data of other persons might also be granted where there is objective evidence from which it can be deduced that that data might, in a specific case, make an effective contribution to combating such activities [para 119].

The conditions set down must be respected. The Court therefore held that, save in cases of genuine emergency, prior review by an independent body must be carried out on the basis of a reasoned request by the investigating bodies. In making this point, the Court referred to the ECtHR judgment in Szabó and Vissy v. Hungary, as well as its own previous ruling in DRI. Furthermore, once notification no longer posed a danger to the investigation, the individuals affected should be notified, so as to give them the possibility of exercising their right to a remedy as specified in Article 15(2), read with Article 22 of the Data Protection Directive (Directive 95/46).

Article 15(1) permits derogation only in relation to specified provisions of the directive; it does not permit derogation from the security obligations contained in Articles 4(1) and 4(1a). The Court noted the quantity of data, as well as its sensitivity, in suggesting that a high level of security measures would be required on the part of the electronic communications providers. Following this, the Court then stated:

…, the national legislation must make provision for the data to be retained within the European Union and for the irreversible destruction of the data at the end of the data retention period (see, by analogy, in relation to Directive 2006/24, the Digital Rights judgment, paragraphs 66 to 68) [para 122].

The Court noted, as a separate obligation from the approval of access to data, that States should ensure that review of compliance with the required regulatory framework is carried out by an independent body. In the view of the Court, this followed from Article 8(3) EUCFR. This is an essential element of individuals’ ability to make claims in respect of infringements of their data protection rights, as noted previously in DRI and Schrems.

The Court then summarised the outcome of this reasoning, that Article 15 and the EUCFR:

must be interpreted as precluding national legislation governing the protection and security of traffic and location data and, in particular, access of the competent national authorities to the retained data, where the objective pursued by that access, in the context of fighting crime, is not restricted solely to fighting serious crime, where access is not subject to prior review by a court or an independent administrative authority, and where there is no requirement that the data concerned should be retained within the European Union. [para 125]

Relationship between the EUCFR, EU law and the ECHR

The English Court of Appeal had referred a question about the impact of the ECHR on the scope of the EUCFR in the light of Article 52 EUCFR. While the Court declared the question inadmissible, it – like the Advocate General – took the time to point out that the ECHR is not part of EU law, so the key issue is the scope of the EUCFR; and in any event Article 52(3) does not preclude Union law from providing protection that is more extensive than the ECHR. As a further point, the Court added that Article 8 EUCFR, which provides a separate right to data protection, does not have an exact equivalent in the ECHR and that there is therefore a difference between the two regimes.

Comment

Given the trend of recent case law, the outcome in this case is not surprising.  There are some points that are worth emphasising.

The first relates to the scope of EU law, which is a threshold barrier to any claim based on the EUCFR.  The Advocate General seemed prepared to accept a distinction between the retention of data and the access thereto (although conditions relating to the latter could bear on the proportionality of the former).  The Court took a different approach and held that the access also fell within the scope of the Directive/EU law, because the national regime imposed an obligation on the communications service provider to provide access to the relevant authorities. Given this was an obligation on the service provider, it fell within the regulatory schema.  This approach thus avoids the slightly unconvincing reasoning which the Advocate General adopted.  It also possibly enlarges the scope of EU law.

In general terms, the Court’s reasoning looks at certain provisions of the Privacy and Electronic Communications Directive.  While the reasoning is set in that context, it does not mean that the Court’s interpretation of the requirements deriving from the EUCFR is limited only to this set of surveillance measures.  The rules of interpretation of particularly Articles 7 and 8 could apply more generally – perhaps to PNR data (another form of mass surveillance) - and beyond.  It is also worth noting that according to a leaked Commission document, it is proposed to extend the scope of the Privacy and Electronic Communications Directive to other communications service providers not currently regulated by the directive, but who may be subject to some data retention requirements already.

Whilst the Court makes the point that Articles 7 and 8 EUCFR are separate and different, and that data retention also implicates Article 11 EUCFR, in its analysis of the impact of national measures providing for retention it does not deal with Articles 7 and 8 separately (contrast DRI, where limited consideration was given to this). Having flagged Article 11 EUCFR, it takes that analysis no further.  This leaves questions as to the scope of the rights, and particularly how Article 11 issues play out.

Note that the Court does not state that data retention itself is impermissible; indeed, it specifies circumstances when data retention would be acceptable. It challenges the compatibility of mass data retention with Articles 7 and 8 EUCFR, however, even in the context of the fight against terrorism.  In this, it is arguable that the Court has taken a tougher stance than its Advocate General on this point of principle.  In this we see a mirror of the approach in DRI, when the Court took a different approach to its Advocate General.  In that case too, the Advocate General focussed on safeguards and the quality of law, as has the Advocate General here. For the Court here, differentiation – between people and between types of offences and threats – based on objective, evidenced grounds is central to showing that national measures are proportionate and no more than – in the terms of the directive – strictly necessary. This seems to come close to disagreeing with the Opinion of the Advocate General that in DRI, the Court ‘did not, however, hold that that absence of differentiation meant that such obligations, in themselves, went beyond what was strictly necessary’ (Opinion, para 199). The Advocate General used this point to argue that DRI did not suggest that mass surveillance was per se unlawful (see Opinion, para 205). Certainly, in neither case did the Court expressly hold that mass surveillance was per se unlawful, so the question still remains. What is clear, however, is that the Court supports the retention of data following justified suspicion – even perhaps generalised suspicion – rather than using the analysis of retained data to justify suspicion.

In its reasoning, the Court did not – unlike the Advocate General – specifically make a ruling on whether or not the safeguards set down in DRI, paras 60-68, should be seen as mandatory – in effect creating a six-point checklist. Nonetheless, it repeatedly cited DRI approvingly. Within this framework, it highlighted specific aspects – such as the need for prior approval; the need for security and control over data; a prohibition on transferring data outside the EU; and the need for subjects to be able to exercise their right to a remedy. Some of these points will be difficult to reconcile with the current regime in the United Kingdom regarding communications data.

It did not, however, touch on acceptable periods for retention (even though it – like its Advocate General – referred to Zakharov). More generally, the Court’s analysis – by comparison with that of the Advocate General – was less detailed and structured, particularly about the meaning of necessity and proportionality. It did not directly address the points the Advocate General made about lawfulness, with specific reference to reliance on codes (an essential feature of the UK arrangements); it did in passing note that the conditions for access to data should be binding within the domestic legal system. Is this implicit agreement with the Advocate General on this point? It certainly agreed with him that the seriousness of the interference meant that data retention of communications data should be restricted to ‘serious crime’ and not just any crime.

One final issue relates to the judicial relationship between Strasbourg and Luxembourg.  Despite emphasising that the ECHR is not part of EU law, the Court relies on two recent cases from the ECtHR, perhaps seeking to emphasise the consistency in this area between the two courts – or perhaps seeking to put pressure on Strasbourg to hold the line as it faces a number of state surveillance cases on its own docket, many against the UK. The position of Strasbourg is significant for the UK. While many assume that the UK will maintain the GDPR after Brexit in the interests of ensuring equivalence, it could be that the EUCFR will no longer be applicable in the UK post-Brexit. For UK citizens, the ECHR is then the only route to challenge state intrusion into privacy. For those in the EU, data transfers to the UK post-Brexit could be challenged on the basis that the UK’s law is not sufficiently adequate compared to EU standards. Today’s ruling – and the UK’s response to it, if any – could be a significant element in arguing that issue.

Barnard & Peers: chapter 9

Photo credit: www.cio.com.au

Wednesday, 14 December 2016

Early Christmas present or a lump of coal?: Data retention and the leaked ePrivacy Regulation





Matthew White, PhD candidate, Sheffield Hallam University


On 12 December 2016, a document containing the draft of the ePrivacy Regulation (draft Regulation) was leaked. This has resulted in some commentary (here and here) highlighting the good, the bad and even missing points. This post deals only with the data retention aspect.

Prior to the leak, earlier this year, the Article 29 Data Protection Working Party (A29DPWP) in its opinion on the evaluation and review of the ePrivacy Directive observed that:

The EC should explicitly state that it will not introduce any new European data retention requirement. Any similar retention of communications data in general must be prohibited in the revised ePrivacy instrument. (p8).     

This of course is referring to Article 15(1) of the current ePrivacy Directive, which Advocate General Saugmandsgaard Øe in Joined Cases C-203/15 and C-698/15 Watson and Tele2 opined puts general data retention obligations within the scope of the ePrivacy Directive (paras 84-95) and thus EU law. The AG further observed that Article 15(1) gave Member States a choice as to whether they should adopt national data retention regimes (para 106). Further, the AG maintained that the ePrivacy Directive did not preclude Member States from taking other measures necessary for the protection of public security etc. (para 117).

The A29DPWP’s opinion is reflected in the draft Regulation, in the last paragraph of section 1.3 (p4). It states that the draft Regulation does not include any specific provision in the field of data retention, but that Member States would remain able to establish and maintain national data retention legislation so long as it complies with the general principles of EU law and the Charter of Fundamental Rights (CFR). This falls in line with the AG’s view in Watson and Tele2, insofar as Member States can take other measures necessary, e.g. data retention, for the protection of public security etc.

This ability to adopt national data retention legislation is implied in Article 11, which stipulates that the EU and Member States may restrict (by legislative means) the obligations and rights provided for by Articles 5, 6, 7 and 8 of the draft Regulation where such restrictions respect the essence of those rights and are necessary, appropriate and proportionate in a democratic society to safeguard a list of objectives. These restrictions must be in accordance with the CFR, particularly Articles 7, 8, 10 and 52.

From Article 11, it is clear that data retention obligations can still be created at both EU and Member State level. In contrast to the current provision in Article 15(1), there is no mention of the restrictions being in conformity with the general principles of EU law or with Article 6(1) and (2) of the Treaty on European Union (TEU). More specifically, Article 6(3) of the TEU regards the rights guaranteed by the European Convention on Human Rights (ECHR) as general principles of EU law. It is not clear why this has been omitted from Article 11, but the protection of fundamental rights should not be based on the exclusive interpretation of the CFR. Although compliance with the ECHR is mentioned in Recitals 10 and 30, it should be mentioned in Article 11 itself, as the Court of Justice of the European Union (CJEU) noted in Case C-162/97 Nilsson that ‘the preamble to a Community act has no binding legal force and cannot be relied on as a ground for derogating from the actual provisions of the act in question’ (para 54). What if there is diverging jurisprudence between the ECHR and the CFR? What if the former better protects fundamental rights than the latter in a particular circumstance?

This relates to the next issue: Article 11 only allows restrictions that respect the essence of the rights in question. In Schrems the CJEU held that the transfer of data from the EU to the US (under the Safe Harbour rules) compromised the essence of the right because of the generalised access to the content of electronic communications (para 94), and therefore ruled the Safe Harbour Decision invalid (para 107). This may also be the fate, if Brexit happens, of many of the provisions of the Investigatory Powers Act 2016 (IPA 2016) when it comes into force in 2017.

I say many of the provisions, but this may not be the case for data retention (which concerns, for instance, information about whom someone called, texted or e-mailed, as distinct from the content of those communications). In Joined Cases C-293/12 and C-594/12 Digital Rights Ireland the CJEU held that general data retention obligations do not adversely affect the essence of Article 7 (right to privacy) and Article 8 (data protection) of the CFR (paras 39-40). This already gives Member States unjustified leeway when it comes to national data retention, all the more so in that the CJEU felt that a general data retention obligation ‘genuinely satisfies an objective of general interest’ (para 44). Therefore, a data retention obligation by itself, according to EU law, would actually respect the essence of the right.

I have said before that this construction of data retention is damaging to fundamental rights, and I will say it again. The AG in Watson and Tele2 acknowledged that data retention is just as serious as interception (para 254), yet did not feel this was enough to adversely affect the essence of the right. Neither the AG nor the CJEU fully appreciates just how revealing communications (or meta) data truly are; this is shown by their differential treatment of content, despite communications data and content being only thinly distinguishable (if they even can be any more). The CJEU and AG focus primarily on access mechanisms, rather than on the initial interference, and arguably destruction of the right (and this is about more than just privacy and data protection), posed by data retention itself. This creates a conflict with the ECHR, under which a violation can occur irrespective of the access mechanisms. This highlights the importance of (re)adding compliance with the ECHR into Article 11 of the draft Regulation, rather than leaving it in the preamble, because in this particular context the interpretation of the CFR does not, ironically, fully protect those fundamental rights.

The CJEU is set to hand down its judgment in Watson and Tele2 on 21 December 2016. If it follows the AG in regarding judicial or independent authorisation for access to communications data as mandatory (para 221), then Part 3 of the IPA 2016 will have to be revised. But therein lies the problem: first, if the CJEU does not change its stance, then data retention itself will remain acceptable under EU law. It may therefore also be acceptable in third countries like the US, or even Australia, where the Telecommunications (Interception and Access) Amendment (Data Retention) Act 2015 requires judicial authorisation (6DC Part 41 issuing authorities). This assumes that respect for fundamental rights rests primarily on the independence of the issuing authority, and the UK can further claim that retention notices are in fact issued by judges (see s.89 of the IPA 2016). But this is an oversimplification of the issue: a transfer of competence does not reduce the infringing capability of data retention; all it does is ensure a higher degree of independence (see forthcoming Matthew White, Protection by Judicial Oversight, or an Oversight in Protection? (2017)).

And so, while mandating judicial or independent authorisation of access to communications data would be a welcome step in safeguarding fundamental rights, this early Christmas present may in fact be a lump of coal waiting to be opened. This is because, as EU law is likely to stand, there is nothing wrong with general obligations to retain.


Cartoon credit: Royston, The New Yorker