
Thursday, 22 February 2018

Illuminating the Investigatory Powers Act


As full implementation of the Investigatory Powers Act (IPAct) draws closer we can usefully ponder some of its more ticklish points of interpretation. These will serve to delineate the IPAct's powers, crystallise the legislation's procedural requirements and determine who can be compelled to do what.

Unlike its predecessor, the Regulation of Investigatory Powers Act 2000 (RIPA), the IPAct comes with expectations of openness and transparency.  The Act itself exposes a panoply of powers to the public gaze.  But despite its 300 pages of detail, decisions will still have to be made about the meaning of some provisions and how they are to be applied.

Previously such legal interpretations have tended to come to light, if at all, as a consequence of the Snowden revelations or during litigation brought by civil liberties organisations. Examples include the meaning of ‘external’ communications under RIPA, the legal basis for thematic interception warrants under RIPA, and the use of S.94 Telecommunications Act 1984 powers to acquire bulk communications data from telecommunications companies.

In the field of surveillance, hidden legal interpretations influencing how powers are wielded are in substance as much part of the law as the statute that grants the powers.  This can be problematic when a cornerstone of the rule of law is that laws should be publicly promulgated. People should be able to know in advance the kind of circumstances in which the powers are liable to be used and understand the manner of their exercise. According to jurisprudential taste, secret law is either bad law or not law at all.

The new Investigatory Powers Commissioner has an opportunity to bring to public view legal interpretations that will mould the use of the IPAct's surveillance powers. 

Most IPAct powers require approval by a Judicial Commissioner or, as now proposed for communications data acquisition, a new Office for Communications Data Authorisations. The Judicial Commissioner or other reviewer may have to form a view about some provision of the Act when approving a warrant or notice.  Some interpretations may have significance that goes wider than a single approval.

Under the IPAct there is scope for an adopted interpretation to be published if that can be done without breaching the Commissioner's responsibilities not to act contrary to the public interest, nor prejudice national security or the prevention or detection of serious crime or the economic well-being of the UK.

What interpretations of the IPAct will have to be considered? The most heavily debated has been the level of scrutiny that Judicial Commissioners are required to apply to Ministerial decisions to issue warrants and technical capability notices. Gratefully donning my techlaw hat, I shall leave that problem to the public and administrative law experts who have been mulling over it since the draft Bill was published in November 2015.

Approval decisions will typically involve assessments of necessity and proportionality. These will by their nature be fact-sensitive and so more difficult to make public without revealing operational matters that ought to remain secret. Nevertheless some general approaches may be capable of being made public.

Among the most likely candidates for publication will be points of statutory construction: aspects of the IPAct's language that require a view to be taken of their correct interpretation.  

I have drawn up a list of provisions that present interpretative challenges of varying degrees of significance. Some of the points are old hobbyhorses, dating back to my comments on the original draft Bill. Others are new. No doubt more will emerge as the IPAct is put into practice.

BULK INTERCEPTION

Selection for examination

What is the issue?

Under a bulk interception warrant what kinds of activities count as selection for examination of intercepted content or secondary data? While the question can be simply put, the answer is not so easy.

Why is it significant?

Selection for examination underpins three provisions of the IPAct.

First, a separate targeted examination warrant must be obtained before selecting intercepted content for examination by use of criteria (such as an e-mail address) referable to an individual known to be in the British Islands, if the purpose is to identify the content of communications sent by or intended for that individual. (S.152(4)) (However, a targeted examination warrant is not required for secondary data. As to what is meant by secondary data, see below.)

Second, it is an offence (subject to applicable knowledge and intent thresholds) to select intercepted content or secondary data for examination in breach of the Act's safeguards. (S.155)

Third, a bulk interception warrant authorising selection for examination must describe the manner in which intercepted content or secondary data will be selected for examination and the conduct by which that activity will be secured (S.136(4)(c)).

The S.136(4)(c) requirement is new compared with the equivalent provisions of RIPA. Curiously, it is not referred to in the draft Interception Code of Practice.

It is important to know what activities amount to selection for examination.  This is a particular issue with automated processing.

Possible interpretations?

Examination means being read, looked at or listened to (S.263). But what activities are caught by selection for examination? How close a nexus does there have to be between the selection and any subsequent examination?  Does there have to be a specific intention to examine the selected item (for instance when an analyst makes a search request on a database)? Does selection for possible examination suffice?  (It is perhaps of interest that David Anderson Q.C.'s Bulk Powers Review at para 2.17 discusses under the heading of ‘Selection for Examination’ the use of strong and weak selectors to select material for “possible examination” by analysts.)

The Draft Interception Code of Practice describes a sequence of steps from obtaining the data through to examination by an analyst. It uses the term 'selection for examination' in ways that could refer to both selection by the analyst and intermediate processing steps:
"In practice, several different processing systems may be used to effect the interception and/or the obtaining of secondary data, and the selection for examination of the data so obtained. 
These processing systems process data from the communications links or signals that the intercepting authority has chosen to intercept. A degree of filtering is then applied to the traffic on those links and signals, designed to select types of communications of potential intelligence value whilst discarding those least likely to be of intelligence value. As a result of this filtering, which will vary between processing systems, a significant proportion of the communications on these links and signals will be automatically discarded. Further complex searches may then take place to draw out further communications most likely to be of greatest intelligence value, which relate to the agency’s statutory functions. These communications may then be selected for examination for one or more of the operational purposes specified in the warrant where the conditions of necessity and proportionality are met. Only items which have not been filtered out can potentially be selected for examination by authorised persons." (emphasis added)
If selection for examination encompasses only the action of an analyst querying a database then S.136(4)(c) would still require the warrant to describe the manner in which an analyst could select content or secondary data for examination. That could include describing how analysts can go about searching databases. It might also cover the operation of Query Focused Datasets (databases in which the data is organised so as to optimise particular kinds of queries by analysts).

But does selection for examination exclude all the automated processing that takes place between bulk capture and storage? There appears to be no reason in principle why automated selection should be excluded, if the selection is 'for examination'.  

Details of the kinds of automated processing applied between capture and storage are mainly kept secret.  However some clues beyond the draft Code of Practice can be obtained from the Intelligence and Security Committee Report of March 2015 and from the Bulk Powers Review.  The Bulk Powers Review describes a process that uses ‘strong selectors’ (telephone number or email address) to select items in near real time as they are intercepted:

“As the internet traffic flows along those chosen bearers, the system compares the communications against a list of strong selectors in near real-time. Any communications which match the selectors are automatically collected and all other communications are automatically discarded.”

Such selection against a list of e-mail addresses or telephone numbers of interest is not made for any purpose other than examination, or at least possible examination. But does it count as selection for examination if (as described in the Bulk Powers Review) a further triage process may be applied?

“Even where communications are known to relate to specific targets, GCHQ does not have the resources to examine them all. Analysts use their experience and judgement to decide which of the results returned by their queries are most likely to be of intelligence value and will examine only these.”

Weaker selectors may relate to subject-matter and be combined to create complex non-real time queries which determine what material is retained for possible examination after triage. Pattern matching algorithms could perhaps be used to flag up persons exhibiting suspicious behavioural traits as candidates for further investigation.
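The stages described above can be made concrete with a toy model. This is purely illustrative: the real systems are classified, and every selector, field name and data item here is hypothetical. The sketch shows why the legal question bites, since the automated retention step looks very much like selection made with a view to examination.

```python
# Toy model of 'strong selector' filtering as described in the Bulk Powers
# Review. Illustrative only: all selectors and intercepted items are invented.

STRONG_SELECTORS = {"target@example.com", "+441234567890"}

def matches_strong_selector(item: dict) -> bool:
    """Return True if any party to the communication is on the selector list."""
    return bool({item["sender"], item["recipient"]} & STRONG_SELECTORS)

def filter_stream(intercepted_items: list) -> list:
    """Near-real-time filter: retain matching items, discard everything else."""
    return [item for item in intercepted_items if matches_strong_selector(item)]

stream = [
    {"sender": "target@example.com", "recipient": "a@example.org"},
    {"sender": "b@example.org", "recipient": "c@example.org"},
]
retained = filter_stream(stream)
# Only the first item is retained. The legal question is whether this
# automated step already amounts to 'selection for examination', or whether
# that occurs only when an analyst later queries the retained material.
```
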

The question of which, if any, of these processes amount to selection for examination is of considerable significance to the operation of the processes mandated by the IPAct.

Secondary data

What is the issue?

'Secondary data' under the IP Act has been extended, compared with RIPA's equivalent ‘related communications data’, so as to include some elements of the content of a communication. However the definition is difficult to apply and in some respects verges on the metaphysical.  

Why is it significant?

Secondary data, despite its name, is perhaps the most important category of data within the IP Act. It is, roughly speaking, metadata acquired under a targeted, thematic or bulk interception warrant. As such it is not subject to all the usage restrictions that apply to intercepted content.

In particular, unlike for content, there is no requirement to obtain a targeted examination warrant in order to select metadata for examination by use of a selector (such as an e-mail address) referable to someone known to be in the British Islands.

The broader the scope of secondary data, therefore, the more data can be accessed without a targeted examination warrant and the more of what would normally be regarded as content will be included.

Possible interpretations?

Under S.137 of the IPAct secondary data includes:

“identifying data which -

(a) is comprised in, included as part of, attached to or logically associated with the communication (whether by the sender or otherwise),
(b) is capable of being logically separated from the remainder of the communication, and
(c) if it were so separated, would not reveal anything of what might reasonably be considered to be the meaning (if any) of the communication, disregarding any meaning arising from the fact of the communication or from any data relating to the transmission of the communication.”

Identifying data is data which may be used to identify, or assist in identifying, any person, apparatus, system or service, any event, or the location of any person, event or thing.

Identifying data is itself broadly defined. It includes offline as well as online events, such as date or location data on a photograph. However the real challenge is in understanding (c). How does one evaluate the ‘meaning’ of the communication for these purposes? If a name, or a location, or an e-mail address, or a time is extracted from the communication does that on its own reveal anything of its meaning? Is each item extracted to be considered on its own, or are the extracted items of data to be considered together?  How is the ‘meaning’ of a machine to machine communication to be evaluated? Is the test what the communication might mean to a computer or to a human being?

A list of the specific types of data that do and do not fall either side of the line can be a useful aid to understanding abstract data-related definitions such as this. Among the Snowden documents was a GCHQ internal reference list distinguishing between content and related communications data under RIPA.

TECHNICAL CAPABILITY NOTICES

Applied by or on behalf of

What is the issue?

A technical capability notice (TCN) can require a telecommunications operator to install a specified capability to assist with any interception, equipment interference or bulk acquisition warrant, or communications data acquisition notice, that it might receive in the future.

In particular a TCN can require a telecommunications operator to have the capability to remove electronic protection applied by or on behalf of that operator to any communications or data. This includes encryption. But when is encryption applied "by or on behalf of" that operator?

Why is it significant?

During the passage of the Bill through Parliament there was considerable debate about whether a TCN could be used to stop a telecommunications operator providing end-to-end encryption facilities to its users. The question was never fully resolved. One issue that would arise, if an attempt were made to use TCNs in that way, is whether the E2E encryption was applied by or on behalf of the operator. If not, then there would be no jurisdiction to issue a TCN in relation to that encryption facility.

Possible interpretations?

In principle, encryption could be applied by the operator, by the user, or by both. An operator would no doubt argue that under the E2E model it is providing the user only with the facility to apply encryption and that any encryption is applied by the user, not the operator.  The strength of that argument could vary depending on the precise technical arrangements in a particular case.
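The operator's argument can be made concrete with a toy sketch. A trivial XOR cipher stands in for real cryptography, and the key and messages are invented; the point is only that in an E2E model the key lives on the users' devices and the operator relays opaque bytes it cannot decrypt.

```python
# Toy illustration of the E2E argument: encryption happens on the user's
# device, so the operator arguably never 'applies' the protection at all.
# Uses a trivial XOR cipher purely for illustration - not real cryptography.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: the same operation encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

user_key = b"secret"                          # held only on users' devices
plaintext = b"hello"
ciphertext = xor_cipher(plaintext, user_key)  # applied client-side, by the user

def operator_relay(message: bytes) -> bytes:
    """The operator forwards opaque bytes; without the key it has nothing
    to 'remove' within the meaning of a technical capability notice."""
    return message

delivered = operator_relay(ciphertext)
recovered = xor_cipher(delivered, user_key)   # decryption, again client-side
```

The counter-argument would focus on the operator designing, distributing and operating the software that performs the encryption, which is why the answer may turn on the precise technical arrangements in each case.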

MANDATORY DATA RETENTION

Obtaining data by generation

What is the issue?

The IP Act empowers the Secretary of State, with the approval of a Judicial Commissioner, to give a communications data retention notice to a telecommunications operator. A notice can require the operator to retain specified communications data for up to 12 months.

A data retention notice may, in particular, include:

“requirements or restrictions in relation to the obtaining (whether by collection, generation or otherwise), generation or processing of (i) data for retention, or (ii) retained data.”

This provision makes clear that a requirement to retain data can include obtaining or generating data for retention. But what exactly does that mean? In particular, why does ‘obtaining’ data for retention include ‘generation’?

Why is it significant?

Mandatory communications data retention is one of the most controversial aspects of the IP Act. It is under challenge in the courts and, as a result of previous legal challenges, the government is already having to consult on amendments to the Act.

The powers to require data retention are broader in every respect than those in the predecessor legislation, the Data Retention and Investigatory Powers Act 2014. They can be used against private, not just public, telecommunications operators. They cover a far wider range of data. And they can require data be obtained and generated, not just retained.

So the width of these new powers is significant, especially as telecommunications operators are required not to disclose the existence of data retention notices to which they are subject.

Possible interpretations?

What does it mean to ‘obtain’ data by ‘generation’? It apparently means something different from just generating data for retention, since that is spelt out separately. The most far reaching interpretation would be if the notice could require the operator to require a third party to generate and hand over communications data to the operator. Could that be used to compel, say, a wi-fi operator to obtain and retain a user's identity details?

There was no suggestion during the Parliamentary debates that it could be used in that way, but then the curious drafting of this provision received no attention at all.

INTERNET CONNECTION RECORDS

‘Internet service’ and ‘internet communications service’

What is the issue?

The IPAct uses both ‘internet service’ and ‘internet communications service’ in its provisions that set out the limits on public authority access to internet connection records (ICRs). However it provides no definitions. Nor are these well understood industry or technical terms.

Why is it significant?

ICRs are logs of visited internet destinations such as websites. ICRs are particularly sensitive since they can be a rich source of information about someone’s lifestyle, health, politics, reading habits and so on. The IP Act therefore places more stringent limits, compared with ordinary communications data, on the authorities that may access ICRs and for what purposes.

The Act stipulates several purposes for which, in various different circumstances, a public authority can access ICRs. They include:
  • to identify which person or apparatus is using an internet service where the service and time of use are already known. (S.62(3))
  • to identify which internet communications service is being used, and when and how it is being used, by a person or apparatus whose identity is already known. (S.62(4)(b)(i) and S.62(5)(c)(i))
  • to identify which internet service is being used, and when and how it is being used, by a person or apparatus whose identity is already known. (S.62(4)(b)(iii) and S.62(5)(c)(iii))

The second and third purposes apply identically to internet services and internet communications services. The first purpose applies only to internet services.

The purposes for which the powers can be used may therefore differ, depending on whether we are dealing with an internet service or an internet communications service. But as already noted, the Act does not tell us what either of these terms means.

Possible interpretations?

We can find clues to interpretation in the footnotes to the draft Communications Data Code of Practice. 

Footnote 49 says that an ‘internet service’ is a service provided over the internet. On the face of it this would seem to exclude a service consisting of providing access to the internet. However the example illustrating S.62(3) in paragraph 9.6 of the draft Code suggests differently.

Footnote 49 goes on to say that 'internet service' includes ‘internet communication services, websites and applications.’ It also suggests examples of online travel booking or mapping services.

This explanation presents some problems.

First is the suggestion that internet communication services are a subset of internet services. If that is right then subsections 62(4)(b)(i) and 62(5)(c)(i) of the Act (above, internet communication services) are redundant, since the respective subsections (iii) already cover internet services in identical terms.

If ‘internet communication service’ is redundant, then the uncertainties with its definition may not signify since S.62 can simply be applied to any 'internet service'.

Elsewhere the draft Code suggests that the subsections (iii) relate to ‘other’ internet services (i.e. additional to internet communications services covered by subsections (i)). However that language does not appear in the Act.

Second is the suggestion that websites and applications are different from internet communications services.  On the face of it an internet communication service could mean just e-mail or a messaging service. But if so, what are we to make of ‘applications’ as something different, since many messaging services are app-based?

Last, to add to the confusion, footnote 48 of the Draft Code of Practice says that an internet communication service is a service which provides for the communication between one or more persons over the internet and ‘may include’ email services, instant messaging services, internet telephony services, social networking and web forums.

This goes wider than just e-mail and messaging services. Does it, for instance, include online games with the ability to chat to other players?  In context does ‘person’ refer only to a human being, or does it include machine communications?

Those involved in authorising and approving applications for access to ICRs will have to take a view on what these terms mean and how they fit together within the scheme of the Act. 

Material whose possession is a crime

What is the issue?

Another ground on which access to ICRs may be obtained is to identify where or when a known person is accessing or running a file or program which “wholly or mainly involves making available, or acquiring, material whose possession is a crime”. There are relatively few offences that are committed by mere possession of material. Illicit drugs and indecent images of children are two mentioned in the draft Code of Practice.

Why is it significant?

The width of the definition affects what kinds of criminal activity can be the subject of applications to access ICRs under this head.

Possible interpretations?

Does the section apply more widely than mere possession, for instance where possession is an offence only if it is with a view to some other activity? What about possession offences where possession is not an offence if it is for personal use?

COMMUNICATIONS DATA

URLs up to the first slash

What is the issue?

It has long been understood that under RIPA the portion of a web address to the right of the first slash is content, but otherwise the URL is communications data. RIPA contained a convoluted definition designed to achieve that result. Although the Home Office says that the IPAct achieves the same result, exactly how the definitions achieve that is not always obvious.

Why is it significant?

Communications data retention and acquisition powers can be deployed only against communication data, not content. So it is important to know what is and is not content.  It is especially important for Internet Connection Records, which the Home Office has repeatedly said include top-level web addresses but not page URLs.

In June 2015, in A Question of Trust at paragraph 9.53, David Anderson Q.C. said that the Home Office had provided him with this definition of 'weblogs' (now known as ICRs):

“Weblogs are a record of the interaction that a user of the internet has with other computers connected to the internet. This will include websites visited up to the first ‘/’ of its [url], but not a detailed record of all web pages that a user has accessed. This record will contain times of contacts and the addresses of the other computers or services with which contact occurred.”

He went on:

"Under this definition a web log would reveal that a user has visited e.g. www.google.com or www.bbc.co.uk, but not the specific page."

He also noted  that:

"Under the current accepted distinction between content and CD, www.bbc.co.uk would be communications data while www.bbc.co.uk/sport would be content; and this is set out in the Acquisition Code. However there are arbitrary elements to that definition – for example sport.bbc.co.uk (no ‘www.’) takes you to the same place as www.bbc.co.uk/sport.”
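The rule Anderson describes, and the arbitrariness he notes, can be sketched in a few lines (the function and examples are illustrative, not a statement of the statutory test):

```python
# Illustrative sketch of the 'first slash' rule described in A Question of
# Trust: everything up to the first '/' of the address is treated as
# communications data, everything after it as content.

def split_at_first_slash(address: str) -> tuple:
    """Split a web address into (communications data, content) per the rule."""
    host, sep, path = address.partition("/")
    return host, sep + path

cd, content = split_at_first_slash("www.bbc.co.uk/sport")
# cd == "www.bbc.co.uk" (communications data); content == "/sport" (content)

# The arbitrariness Anderson noted: a subdomain reaches the same material
# but falls wholly on the communications-data side of the line.
cd2, content2 = split_at_first_slash("sport.bbc.co.uk")
# cd2 == "sport.bbc.co.uk" (all communications data); content2 == "" (no content)
```
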

Possible interpretations?

The House of Commons Science and Technology Committee criticised the data definitions in the draft Bill.  They remain complex and abstract in the final legislation.

Towards the end of the pre-Bill scrutiny the Home Office submitted evidence to the Joint Committee that gave more information about what kinds of data would constitute communications data and ICRs. 

In the table at Annex A para 20 of its written evidence the Home Office classified as ‘content’ the following:

“The url of a webpage in a browsing session (e.g. www.bbc.co.uk/news/story or news.bbc.co.uk or friend’sname.facebook.com)”

The first example reflected the prior understanding that a full URL is content. The second and third examples (subdomains) depart from the previous understanding set out in the above extract from ‘A Question of Trust’ by classifying the material to the left of the first slash as content.

Whatever the merits of this approach in removing some of the arbitrariness noted by David Anderson, it is difficult to find anything in the legislation that draws the line at the point suggested. The Home Office evidence gave no explanation of why it drew the line where it did. 

The draft Communications Data Code of Practice does not address the point specifically, but its explanation of fully qualified domain names at page 17 might perhaps suggest that the Home Office has now reverted to the original position described in A Question of Trust.

Given the sensitivity of ICRs this is an area in which clarity is important, not just for ISPs who are subject to the IPAct's requirements but also so that the general public can know what kinds of data are potentially subject to retention and access. 

This is another example pointing to the desirability of publishing a comprehensive list of datatypes illustrating what kinds of data fall into which categories and, by reference to the definitions in the IPAct itself, why they do so.



Monday, 2 January 2017

Cyberleagle on Surveillance

For over two years I have been blogging on surveillance, a topic that cuckoo-like has grown to crowd out most other IT and internet law topics on this blog. 

With the Investigatory Powers Act now on the UK statute book, this seems like a good moment to catalogue the 43 posts that this legislation and its preceding events have inspired.

20 August 2013: Everyman encounters Government. Prompted by reactions to Snowden. It's all about trust.


{8 April 2014: CJEU invalidates EU Data Retention Directive in Digital Rights Ireland. Validity of UK implementation by secondary legislation questionable.}

12 July 2014: Dissecting DRIP - the emergency Data Retention and Investigatory Powers Bill. Posted the day after the coalition government’s Friday publication of the DRIP Bill for introduction into Parliament on the Monday morning, on an emergency four day timetable. Still by far the most page views of any post on this blog. 

20 July 2014: The other side of communications data. Statistics on communications data acquisition errors with serious consequences: wrong accusations, search warrants, arrests. Updated since then with data from subsequent IOCCO Annual Reports. 

10 October 2014: Submissions to the Investigatory Powers Review. Various submissions (including mine) to David Anderson QC’s Investigatory Powers Review.

15 November 2014: Of straws and haystacks. Tracing the history of RIPA’s S.8(4) bulk interception power via the 1960s cable vetting scandal to S.4 of the Official Secrets Act 1920.

3 December 2014: Another round of data retention. The IP address resolution provisions of the Counter-Terrorism and Security Bill, amending DRIPA.

21 December 2014: A Cheltenham Carol. Five Ba-a-aack Doors.

2 January 2015: The tangled net of GCHQ’s fishing warrant. Detailed analysis of the S.8(4) RIPA bulk interception warrant.

2 February 2015: IP address resolution - a conundrum still unresolved? A short rant about the Counter-Terrorism and Security Bill.

{11 June 2015: "A Question of Trust" published.}

13 July 2015: Red lines and no-go zones - the coming surveillance debate. Discussion of  David Anderson Q.C.'s Investigatory Powers Review report "A Question of Trust".

12 August 2015: The coming surveillance debate. A 13 part series of posts analysing specific topics likely to feature in the forthcoming Investigatory Powers Bill.

5 September 2015: Predicting the UK’s new surveillance law. Nine predictions for the contents of the Bill covering bulk interception, broad Ministerial powers, browsing histories, digital footprints, data generation by decree, communications data/content boundary, third party data collection, request filter and judicial authorisation.

{4 November 2015: Draft Investigatory Powers Bill published.}

4 November 2015: Prediction and Verdict - the draft Investigatory Powers Bill. Contents of the draft Bill versus my 5 September predictions.

9 November 2015: From Oversight to Insight - Hidden Surveillance Law Interpretations. Arguing that the oversight body should proactively seek out and make public material legal interpretations on the basis of which powers are exercised or asserted.

23 December 2015: #IPBill Christmas Quiz. A bit of seasonal fun with the draft Bill, including the never to be forgotten definition “Data includes any information which is not data”. Five out of the ten points highlighted, including that one, have changed in the final legislation.

16 January 2016: An itemised phone bill like none ever seen. Adapted from my evidence to the pre-legislative scrutiny Joint Committee, analysing how internet connection records are richer, more far reaching and different in nature from the traditional itemised phone bill with which the government was at that stage inclined to compare them. 

7 February 2016: No Content: Metadata and the draft Investigatory Powers Bill. Highlighting the significance of communications data powers in the draft Bill.

16 February 2016: The draft Investigatory Powers Bill - start all over again? Discussion of the Joint Committee and ISC Reports on the draft Bill.

{1 March 2016: Investigatory Powers Bill introduced into Parliament.}

15 March 2016: Relevant Communications Data revisited. Parsing and visualising one of the most complex and critical definitions in the Bill.

19 March 2016: 20 points on the Investigatory Powers Bill, from future proofing to triple negatives. Storified 20 points tweeted immediately before publication of the Bill, with subsequent comments in the light of the Bill.

24 March 2016: All about the metadata. More visualisations of the Bill’s complex web of metadata definitions.

29 March 2016: Woe unto you, cryptographers! This little collection of Biblical quotations adapted to cryptography fell flat as a pancake…

1 April 2016: An official announcement. …but not as flat as this leaden attempt at an April Fool.

15 April 2016: Future-proofing the Investigatory Powers Bill. Arguing that the Bill’s attempt to future-proof powers by adopting a technologically neutral drafting approach repeats the error of RIPA. A better approach would be to future-proof the privacy-intrusion balance.

26 May 2016: The content v metadata contest at the heart of the Investigatory Powers Bill. A deep dive into the Bill’s dividing lines between content and metadata, including the new power of the intelligence agencies to extract some content and treat it as metadata. 

12 June 2016: The List. Dystopia looms, holding a clipboard.

19 July 2016: Data retention - the Advocate General opines. Summary of the Advocate General’s Opinion in the Watson/Tele2 case challenging DRIPA and the equivalent Swedish legislation.

11 August 2016: How secondary data got its name. An imagined Bill drafting committee meeting in Whitehall.

{19 August 2016: Bulk Powers Review published.}

7 September 2016: A trim for bulk powers? What might have been if the Bulk Powers Review had been commissioned and published at the start of the Parliamentary process.

{29 November 2016: Investigatory Powers Act gains Royal Assent.}

10 December 2016: Investigatory Powers Act 2016 Christmas Quiz. 20 questions to test your knowledge of the #IPAct. 

31 December 2016: The Investigatory Powers Act - swan or turkey? A post-legislative reflection on the Act.  

This marks the end of the beginning. Pending legal challenges, new legal challenges and Brexit will provide a rich seam of material for future blogging.

8 May 2017: Back doors, black boxes and #IPAct technical capability regulations. Commentary on proposed technical capability notice regulations.

22 February 2018: Illuminating the Investigatory Powers Act. Tricky points of legal interpretation of the Act.

27 April 2018: The IPAct data retention regime lives on (but will have to change before long). Report on the judgment in Liberty v Home Office on compliance with EU law of the mandatory data retention regime.

13 September 2018: Big Brother Watch v UK – implications for the Investigatory Powers Act? Commentary on the European Court of Human Rights First Section judgment.

30 October 2018: What will be in Investigatory Powers Act Version 1.2? Discussion of how the Act might have to be amended in the light of the Big Brother Watch First Section decision.

15 October 2020: Hard questions about soft limits. Implications of the CJEU judgments in Privacy International/La Quadrature du Net.

8 June 2021: Big Brother Watch/Rättvisa – a multifactorial puzzle. Analysis of the Grand Chamber Big Brother Watch judgment.

[Amended 21.25 2 Jan 2017 to add some {contextual events} and stylistic edits; and 6 March 2023 to add subsequent posts.]

Saturday, 31 December 2016

The Investigatory Powers Act - swan or turkey?

The Investigatory Powers Bill, now the newly minted Investigatory Powers Act, has probably undergone more scrutiny than any legislation in recent memory. Rarely, though, can the need for scrutiny have been so great.

Over 300 pages make up what then Prime Minister David Cameron described as the most important Bill of the last Parliament. When it comes into force the IP Act will replace much of RIPA (the Regulation of Investigatory Powers Act 2000), described by David Anderson Q.C.’s report A Question of Trust as ‘incomprehensible to all but a tiny band of initiates’. It will also supersede a batch of non-RIPA powers that had been exercised in secret over many years - some, so the Investigatory Powers Tribunal has found, on the basis of an insufficiently clear legal framework. 
None of this would have occurred but for the 2013 Snowden revelations of the scale of GCHQ’s use of bulk interception powers. Two years post-Snowden the government was still acknowledging previously unknown (except to those in the know) uses of opaque statutory powers. 
Three Reviews and several Parliamentary Committees later, it remains a matter of opinion whether the thousands of hours of labour that went into the Act have brought forth a swan or a turkey. If the lengthy incubation has produced a swan, it is one whose feathers are already looking distinctly ruffled following the CJEU judgment in Watson/Tele2, issued three weeks after Royal Assent. That decision will at a minimum require the data retention aspects of the Act to be substantially amended. 
So, swan or turkey?
Judicial approval
On the swan side, warrants for interception and equipment interference, together with most types of power exercisable by notice, will be subject to prior approval by independent Judicial Commissioners. For some, doubts persist about the degree of scrutiny that will be exercised. Nevertheless, judicial approval is a significant improvement on current practice, whereby the Secretary of State alone takes the decision to issue a warrant.
Codified powers
Also swan-like is the impressive 300 page codification of the numerous powers granted to law enforcement and intelligence agencies. A Part entitled ‘Bulk warrants’ is a welcome change from RIPA’s certificated warrants, which forced the reader to play hopscotch around a mosaic of convoluted provisions before the legislation would give up its secrets.
Granted, the IP Act also ties itself in a few impenetrable knots. Parts are built on shaky or even non-existent definitional foundations. But it would be churlish not to acknowledge the IP Act’s overall improvement over its predecessors. 
Parliamentary scrutiny
When we move to consider the Parliamentary scrutiny of bulk powers things become less elegant.
The pre-legislative Joint Committee acknowledged that witnesses were giving evidence on the basis of incomplete information. In response to the Joint Committee’s recommendation the government produced an Operational Case for Bulk Powers alongside the Bill’s introduction into Parliament. That added a little to the light which A Question of Trust had previously shed on the use of bulk powers.
But it was only with the publication of David Anderson’s Bulk Powers Review, towards the end of the Parliamentary process, that an uncontroversial source provided greater insight into the full range of ways in which bulk powers are used. (By way of example, ‘selector’ - the most basic of bulk interception terms - appears 27 times in the Bulk Powers Review, five times in A Question of Trust and twice in the Operational Case, but not at all in either the Joint Parliamentary Scrutiny Committee Report or the Intelligence and Security Committee Report.)
By the time the Bulk Powers Review was published it was too late for the detailed information within it to fuel a useful Parliamentary debate on how any bulk powers within the Act should be framed. David Anderson touched on the timing when he declined to enter into a discussion of whether bulk powers might be trimmed:
“I have reflected on whether there might be scope for recommending the “trimming” of some of the bulk powers, for example by describing types of conduct that should never be authorised, or by seeking to limit the downstream use that may be made of collected material. But particularly at this late stage of the parliamentary process, I have not thought it appropriate to start down that path. Technology and terminology will inevitably change faster than the ability of legislators to keep up. The scheme of the Bill, which it is not my business to disrupt, is of broad future-proofed powers, detailed codes of practice and strong and vigorous safeguards. If the new law is to have any hope of accommodating the evolution of technology over the next 10 or 15 years, it needs to avoid the trap of an excessively prescriptive and technically-defined approach.”
In the event the legislation was waved through on the strength of the Bulk Powers Review’s finding that the powers have a clear operational purpose and that the bulk interception power is of vital utility.
Fully equipped scrutiny at an early stage of the Parliamentary process could have resulted in more closely tailored bulk powers. As discussed below (“Vulnerability to legal challenge”) breadth of powers may come back to haunt the government in the courts.
Mandatory data retention
Views on expanded powers to compel communications data retention are highly polarised. But swan or turkey, data retention will become an issue in the courts. The CJEU judgment in Watson/Tele2, although about the existing DRIPA legislation, will require changes to the IP Act. How extensive those changes need to be will no doubt be controversial and may lead to new legal challenges. So, most likely, will the extension of mandatory data retention to include generation and obtaining of so-called internet connection records: site-level web browsing histories.  
Many would say that officially mandated lists of what we have been reading, be that paper books or websites, cross a red line. In human rights terms that could amount to failure to respect the essence of privacy and freedom of expression: a power that no amount of necessity, proportionality, oversight or safeguarding can legitimise.
Limits on powers v safeguards
The Act is underpinned by the assumption that breadth of powers can be counterbalanced by safeguards (independent prior approval, access restrictions, oversight) and soft limits on their exercise (necessity and proportionality). 
Those may protect against abuse. But that is of little comfort if the objection is to a kind of intended use: for instance, mining the communications data of millions in order to form suspicions, rather than starting with grounds for specific suspicion.
The broader and less specific the power, the more likely it is that some intended but unforeseen or unappreciated use of it will be authorised without prior public awareness and consent. That happened with S.94 of the Telecommunications Act 1984 and, arguably, with bulk interception under RIPA. Certainly, the coming together of the internet and mobile phones resulted in a shift in the intrusion and privacy balance embodied in the RIPA powers. This was facilitated by the deliberate future-proofing of RIPA powers to allow for technological change, an approach repeated (not to its benefit, I would argue) in the IP Act.
In A Question of Trust David Anderson speculated on a future Panopticon of high tech intrusive surveillance powers:
“Much of this is technically possible, or plausible. The impact of such powers on the innocent could be mitigated by the usual apparatus of safeguards, regulators and Codes of Practice. But a country constructed on such a basis would surely be intolerable to many of its inhabitants. A state that enjoyed all those powers would be truly totalitarian, even if the authorities had the best interests of its people at heart.”
He went on to say, in relation to controlling the exercise of powers by reference to fundamental rights principles of necessity and proportionality:
“Because those concepts as developed by the courts are adaptable, nuanced and context-specific, they are well adapted to balancing the competing imperatives of privacy and security. But for the same reasons, they can appear flexible, and capable of subjective application. As a means of imposing strict limits on state power (my second principle, above) they are less certain, and more contestable, than hard-edged rules of a more absolute nature would be.”
The IP Act abjures hard-edged rules. Instead it grants broad powers mitigated by safeguards and by the day-to-day application of soft limits: necessity and proportionality.
The philosophy of granting broad powers counterbalanced by safeguards and soft limits reflects a belief that, because the UK has a long tradition of respect for liberty, we can and should trust our authorities, suitably overseen, with powers that we would not wish to see in less scrupulous hands. 
Another view is that the mark of a society with a long tradition of respect for liberty is that it draws clear red lines. It does not grant overly broad or far-reaching powers to state authorities, however much we may believe we can trust them (and their supervisors) and however many safeguards against abuse we may install. 
Both approaches are rooted in a belief (however optimistic that may sometimes seem) that our society is founded on deeply embedded principles of liberty. Yet they lead to markedly different rhetoric and results.
Be that as it may, the IP Act grants broad general powers. Will the Act foster trust in the system that it sets up? 
The question of trust
David Anderson’s original Review was framed as “A Question of Trust”. Although we may believe a system to be operated by dedicated public servants of goodwill and integrity, nevertheless for the sceptic the answer to the question of trust posed by intrusive state powers is found in a version of the precautionary principle: the price of liberty is eternal vigilance.
Whoever may have coined that phrase, the slavery abolitionist Wendell Phillips in 1852 emphasised that it concerns the people at large as well as institutions:
“Eternal vigilance is the price of liberty; … Only by continued oversight can the democrat in office be prevented from hardening into a despot; only by unintermitted agitation can a people be sufficiently awake to principle not to let liberty be smothered in material prosperity.”
Even those less inclined to scepticism may think that a system of broad, general powers and soft limits merits a less generous presumption of trust than specifically limited, concretely defined powers. 
Either way a heavy burden is placed on oversight bodies to ensure openness and transparency. To quote A Question of Trust: “…trust depends on verification rather than reputation, …”. 
One specific point deserves highlighting: the effectiveness of the five-year review provided for by the IP Act will depend upon sufficient information about the operation of the Act being available for evaluation.
Hidden legal interpretations
Transparency brings us to the question of hidden legal interpretations. The Act leaves it up to the new oversight body whether or not proactively to seek out and publish material legal interpretations on the basis of which powers are exercised or asserted.
That this can be done is evident from the 2014 Report of Sir Mark Waller, the Intelligence Services Commissioner, in which he discusses whether there is a legal basis for thematic property interference warrants. That, however, is a beacon in the darkness. Several controversial legal interpretations were hidden until the aftermath of Snowden forced them into public light. 
David Anderson QC in his post-Act reflections has highlighted this as a “jury is out” point, emphasising that “the government must publicise (or the new Commission must prise out of it)” its internal interpretations of technical or controversial concepts in the new legislation. In A Question of Trust he had recommended that public authorities should consider how they could better inform Parliament and the public about how they interpret powers.
Realistically, we cannot safely rely on government to do it. The Act includes a raft of new secrecy provisions behind which legal interpretations of matters such as who applies end-to-end encryption (the service provider or the user), the meaning of ‘internet communications service’, the dividing line between content and secondary data and other contentious points could remain hidden from public view. It will be interesting to see whether the future Investigatory Powers Commission will make a public commitment to implement the proposal.
Vulnerability to legal challenge
In the result the Act is long on safeguards but short on limits to powers. This structure looks increasingly likely to run into legal problems. 
Take the bulk interception warrant-issuing power. It encompasses a variety of differing techniques. They range from real-time application of 'strong selectors' at the point of interception (akin to multiple simultaneous targeted interception), through to pure ‘target discovery’: pattern analysis and anomaly detection designed to detect suspicious behaviour, perhaps in the future using machine learning and predictive analytics. Between the two ends of the spectrum are seeded analysis techniques, applied to current and historic bulk data, where the starting point for the investigation is an item of information associated with known or suspected wrongdoing.
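By way of a purely illustrative toy sketch (the identifiers, data and thresholds below are invented and bear no relation to any real system), the two ends of that spectrum can be caricatured in a few lines of code:

```python
# Toy contrast between the two ends of the bulk interception spectrum.
# All names and figures are hypothetical.

records = [
    {"sender": "alice@example.org",   "bytes": 1200},
    {"sender": "bob@example.org",     "bytes": 950},
    {"sender": "carol@example.org",   "bytes": 800},
    {"sender": "dave@example.org",    "bytes": 1100},
    {"sender": "mallory@example.org", "bytes": 98000},
]

# 1. 'Strong selector' applied at the point of interception: in effect
#    targeted interception run in parallel - keep only traffic matching
#    identifiers already associated with a known target.
selectors = {"mallory@example.org"}
selected = [r for r in records if r["sender"] in selectors]

# 2. Unseeded 'target discovery': no prior suspect - scan the whole
#    population and flag whatever deviates from the statistical norm.
mean_bytes = sum(r["bytes"] for r in records) / len(records)
anomalies = [r for r in records if r["bytes"] > 3 * mean_bytes]
```

The legal point is that both techniques, despite their very different privacy implications, fall within the same warrant-issuing power.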
The Act makes no differentiation between these different techniques. It is framed at an altogether higher level: necessity for general purposes (national security, alone or in conjunction with serious crime or UK economic well-being), proportionality and the like.
Statutory bulk powers could be differentiated and limited. For instance distinctions could be made between seeded and unseeded data mining. If pattern recognition and anomaly detection are valuable for detecting computerised cyber attacks, legislation could specify their use for that purpose and restrict other uses. Such limitations could prevent them being used for attempting to detect and predict suspicious behaviour in the general population, Minority Report-style. 
  
The lack of any such differentiation or limitation in relation to specific kinds of bulk technique renders the Act potentially vulnerable to future human rights challenges. Human rights courts are already suggesting that if bulk collection is not inherently repugnant, then at least the powers that enable it must be limited and differentiated.
Thus in Schrems the CJEU (echoing similar comments in Digital Rights Ireland at [57]) said:
“…legislation is not limited to what is strictly necessary where it authorises, on a generalised basis, storage … without any differentiation, limitation or exception being made in the light of the objective pursued.” (emphasis added)
The same principles are elaborated in the CJEU’s recent Watson/Tele2 judgment, criticising mandatory bulk communication data retention:
“It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings. It therefore applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences. Further, it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy ….
106 Such legislation does not require there to be any relationship between the data which must be retained and a threat to public security. In particular, it is not restricted to retention in relation to (i) data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or (ii) persons who could, for other reasons, contribute, through their data being retained, to fighting crime …” (emphasis added)
The CJEU is also due to rule on the proposed agreement between the EU and Canada on the sharing of Passenger Name Record (PNR) data. The particular interest of the PNR case is that the techniques intended to be applied to bulk PNR data are similar to the kind of generalised target discovery techniques that could be applied to bulk data obtained under the IP Act powers. As described by Advocate General Mengozzi in his Opinion of 8 September 2016, this involves cross-checking PNR data with scenarios or profile types of persons at risk:
“… the actual interest of PNR schemes … is specifically to guarantee the bulk transfer of data that will allow the competent authorities to identify, with the assistance of automated processing and scenario tools or predetermined assessment criteria, individuals not known to the law enforcement services who may nonetheless present an ‘interest’ or a risk to public security and who are therefore liable to be subjected subsequently to more thorough individual checks.”
AG Mengozzi recommends that the Agreement must (among other things):
- set out clear and precise categories of data to be collected (and exclude sensitive data)
- include an exhaustive list of offences that would entitle the authorities to process PNR data
- in order to minimise ‘false positives’ generated by automated processing, contain principles and explicit rules concerning the scenarios, predetermined assessment criteria and databases with which PNR data would be compared, which must:
  • to a large extent make it possible to arrive at results targeting individuals who might be under a reasonable suspicion of participating in terrorism or serious transnational crime; and
  • not be based on an individual’s racial or ethnic origin, his political opinions, his religion or philosophical beliefs, his membership of a trade union, his health or his sexual orientation.
As bulk powers come under greater scrutiny it seems likely that questions of limitation and differentiation of powers will come more strongly to the fore. The IP Act’s philosophy of broad powers counterbalanced with safeguards and soft limits may have produced legislation too generalised in scope and reach to pass muster.

Success in getting broad, generally framed powers onto the statute book, though it may please the government in the short term, may be storing up future problems in the courts. One wonders whether, in a few years’ time, the government will come to regret not having fashioned a more specifically limited and differentiated set of powers.

[Amended 31 December 2016 to make clear that not all of RIPA is replaced.]