Sunday, 21 May 2017

Time to speak up for Article 15

Article 15 of the ECommerce Directive lays down the basic principle that EU Member States cannot impose a general obligation on internet intermediaries to monitor what people say online. We in the UK may have to start worrying about Article 15. It could easily be overlooked, or even deliberately left behind, when we start the process of converting EU law into domestic UK law in preparation for leaving the EU.

Article 15 is a strong candidate for the most significant piece of internet law in the UK and continental Europe. It is the stent that keeps the arteries of the internet open. It prevents the state from turning internet gateways into checkpoints at which the flow of information could be filtered, controlled and blocked.

The principle embodied in Article 15 is currently under pressure: from policymakers within and outside Brussels, from antagonistic business sectors, from the security establishment and potentially from all manner of speech prohibitionists. The common theme is that online intermediaries – ISPs, telecommunications operators, social media platforms - are gatekeepers who can and should be pressed into active service of the protagonists’ various causes.

Article 15 stands in the way of the blunt instrument of compulsory general monitoring and filtering. It does so not for the benefit of commercial platforms and ISPs, but to fulfil the policy aim of protecting the free flow of information and ultimately the freedom of speech of internet users.

Freedom of expression is not just any old policy aim, but a universal value at the heart of human rights – whether we look at Article 19 of the Universal Declaration of Human Rights, Article 10 of the European Convention, Article 11 of the EU Charter, the US First Amendment or the unwritten British historical attachment to freedom of the press. It is particularly precious because, for better or worse, speech reflects our very selves. “Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties.” (John Milton)

Conversely, freedom of expression has always been threatened by governments whose first instinct is to control. That is one reason why, perhaps more so than for any other human right, defenders of free speech find themselves taking principled stands on the most unattractive ground. “Because if you don't stand up for the stuff you don't like, when they come for the stuff you do like, you've already lost.” (Neil Gaiman)

The peculiar vice of compelled general monitoring, however, is that we never get to that point. If the filtered and blocked speech doesn’t see the light of day it never gets to be debated, prosecuted, tested, criticised or defended. To some, that may be a virtue not a vice.

Where freedom of speech is concerned, if principle is allowed to take second place to the exigencies of the moment we find ourselves not so much on a slippery slope as in a headlong rush down the Cresta Run. So it is with Article 15. The queue of noble causes on whose behalf we are urged to compel gateways to be gatekeepers - countering copyright infringement, trolling, hate speech, terrorism, pornography, fake news and the rest - stretches round the block.

We defend the right to bad speech for the sake of the good. We understand the impossibility of drawing a bright line between bad and good speech. We regulate bad speech only at the peril of the good. The peril is greater when the regulatory implement of choice is a tool as blunt as general monitoring.

Article 15 lays down a principle that applies across the board, from copyright to terrorism. EU Member States must not impose on internet intermediaries (conduits, hosts and network caches) a general obligation to monitor or actively to seek facts or circumstances indicating illegal activity. Intermediaries cannot be made to snuffle around their systems looking for unlawful activities. Article 15 goes hand in hand with the Directive’s liability shields under which conduits, hosts and network caches have various degrees of protection from criminal and civil liability for the activities of their users.

It is only too easy for policymakers to point the finger at intermediaries and demand that they do more to control the unpleasant and sometimes illegal things that people do on their systems. Policymakers see intermediaries as points of least cost enforcement: it is more efficient to enforce at a chokepoint than to chase tens of thousands of individual wrongdoers. The theory is made explicit in Recital (59) of the EU Copyright in the Information Society Directive:

“In the digital environment, in particular, the services of intermediaries may increasingly be used by third parties for infringing activities. In many cases such intermediaries are best placed to bring such infringing activities to an end.”
Mr Justice Arnold in Cartier explained the policy that underlies Recital (59):
“As can be seen from recital (59) to the Information Society Directive, the economic logic of granting injunctions against intermediaries such as ISPs is that they are the "lowest cost avoiders" of infringement. That is to say, it is economically more efficient to require intermediaries to take action to prevent infringement occurring via their services than it is to require rightholders to take action directly against infringers. Whether that is correct as a matter of economics is not for me to judge. Nor is it for me to judge whether it is good policy in other ways. That judgement has already been made by the legislators …”
At the same time, Article 15 of the ECommerce Directive constrains the breadth of injunctions that courts can grant against intermediaries under the Copyright and Enforcement Directives. The effect of Article 15 can be seen in the ECJ decisions in Scarlet v SABAM and SABAM v Netlog prohibiting content filtering injunctions, and in Arnold J’s Cartier judgment itself:
“If ISPs could be required to block websites without having actual knowledge of infringing activity, that would be tantamount to a general obligation to monitor.”
But if intermediaries are best placed to stop infringement, why should Article 15 constrain what can be imposed on them? Why shouldn’t the intermediaries be required to monitor?

The only sense in which intermediaries could be seen as best placed is that, since users’ communications flow through their systems, they have the potential to be technical chokepoints. In every other respect intermediaries are poorly placed to make decisions on legality of content and thus on what to block.

The case for intermediary enforcement tends to exaggerate the ease with which unlawful behaviour can be identified, often assuming that illegal content can be recognised simply by looking at it. In relatively few categories is illegality manifest. Legality is predominantly a matter of factual investigation and judgement. That is why it is preferable to have independent courts rule on illegality rather than to compel private platforms to attempt it and have them overblock out of fear of liability or sanctions.

A too narrowly focused cost analysis tends to underplay or even ignore the negative externalities and unintended consequences of compelling gateways to act as gatekeepers. It excludes the broader implications of reinforcing chokepoints and the creation of a climate in which playing gatekeeper on behalf of the state and its proxies becomes the norm. In a broader context the least cost enforcer may turn out to be the highest cost enforcer.

Notice-based intermediary liability systems result in material being removed before a court determines whether it is unlawful. That already carries a risk of overcautious blocking or removal. Compelled proactive monitoring and filtering, since it blocks information about which no complaint has been made, moves the scale of risk to another level. It is akin to prior restraint on a grand scale, conducted not by courts after hearing evidence but by private entities made to act as investigator, prosecutor, judge, jury and executioner.

Our aversion to prior restraint reflects also that the public are sometimes well served by the airing of something that at first blush might appear to be against the strict letter of the law. Speech may be rendered lawful by a public interest defence, or by fundamental freedom of speech considerations. Or a court might decide that even though unlawful the appropriate remedy is damages but not removal. Legality of speech, even in areas such as copyright, can be a heavily nuanced matter. Proactive general monitoring obligations allow for no such subtlety.

Some may argue that in modern times the quid pro quo for living with freedom of speech has been that speech is generally mediated through professional, responsible editors. And that we need to put that genie back in the bottle by converting online intermediaries into editors and publishers, responsible for what other people say on their platforms.

Never mind whether that could be achieved, the argument misunderstands the nature of freedom of expression. The great advance of the internet has been to bring about something akin to the golden age of pamphleteering, freeing mass individual speech from the grip of the mass media. District Judge Dalzell was right when, in ACLU v Reno, he said:

“As the most participatory form of mass speech yet developed, the internet deserves the highest protection from governmental intrusion.”
The US Supreme Court in the same case said:
“Through the use of chat rooms, any person with a phone line can become a town crier with a voice that resonates farther than it could from any soapbox. Through the use of Web pages, mail exploders, and newsgroups, the same individual can become a pamphleteer.”
Those quotations were from 1996 and 1997. They are, if anything, more relevant now. Individual, unmediated speech deserves more, not less, protection than the traditional press.

It may be discomfiting that the kind of vitriol that used to be confined to Speaker's Corner can now reach an audience of millions. But freedom of individual speech was never something only to be tolerated as a tourist curiosity, or indulged as long as it was hidden away in a pub saloon bar. Nor, as we know from the ECtHR decision in Handyside, is freedom of expression confined to that which would not offend in a genteel drawing room.

Article 19 of the 1948 Universal Declaration of Human Rights is not predicated on the assumption of mediated speech. It articulates an individual, personal right that transcends place, time and medium and could have been written with the internet in mind:

“Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”
Article 15 stands squarely in the path of compelling mediated speech through the instrument of general monitoring. So why might it be vulnerable to being overlooked in the Brexit process?

Caveat: what follows is based on the existing Conservative government’s plans for a Great Repeal Bill and is subject to the outcome of the General Election.

Article 15 and Brexit

Article 15 operates at several levels. If a Member State were to legislate in breach of Article 15, a national court would be obliged to disapply the legislation. So it acts as a powerful constraint on Member States at the policy and legislative level. As we have seen it also constrains Member States’ courts. They cannot issue an order that would impose a general monitoring obligation. They cannot interpret a domestic statute or develop a common law obligation in a way that is contrary to Article 15. Regulatory and enforcement bodies are similarly constrained.

The Brexit starting point is that the incumbent government has committed to continuing existing EU law through the Great Repeal Bill. Theresa May says in the introduction to the Great Repeal Bill White Paper:

“Our decision to convert the ‘acquis’ – the body of European legislation – into UK law at the moment we repeal the European Communities Act is an essential part of this plan.
This approach will provide maximum certainty as we leave the EU. The same rules and laws will apply on the day after exit as on the day before. It will then be for democratically elected representatives in the UK to decide on any changes to that law, after full scrutiny and proper debate.
… Th[e Great Repeal] Bill will, wherever practical and appropriate, convert EU law into UK law from the day we leave so that we can make the right decisions in the national interest at a time that we choose.”
On that basis Article 15 ought to be continued post-Brexit. However there is a technical problem. Although it is in a Directive, and so was required to be implemented in UK law, the text of Article 15 appears nowhere in UK domestic legislation. Depending on how the proposed Great Repeal Bill is drafted, Article 15 may have to be specifically written into UK legislation in order to continue post-Brexit.

The White Paper recognises the need to write EU Regulations into domestic law, but appears to assume that since a Directive will already have been implemented in UK domestic law it just needs to be preserved post-Brexit:

“• the Bill will convert directly-applicable EU law (EU regulations) into UK law;

• it will preserve all the laws we have made in the UK to implement our EU obligations”

Article 15 could run the risk of falling between the cracks.

In any event the desirability of continuing Article 15 may not be universally accepted. UK Music, in its ‘Music 2017 Manifesto’, has noted the opportunity that Brexit presents to ‘place responsibility on internet service providers and require them to have a duty of care for copyright protected music’. If that implies proactive monitoring it would put Article 15 in question. Where one industry leads others may follow. A government interested for its own purposes in turning the screw on intermediaries might not welcome the impediment of Article 15. It might be tempted to invoke the ‘wherever practical and appropriate’ White Paper qualification on continuation of existing EU law.

“Freedom of expression is not self-perpetuating, but rather has to be maintained through the constant vigilance of those who care about it.” So said Index on Censorship in 1972. The run-up to Brexit may be a time for especial vigilance.

Monday, 8 May 2017

Back doors, black boxes and #IPAct technical capability regulations

The Home Office has launched an under-the-radar consultation on a critical step in the implementation of the Investigatory Powers Act (IPAct): the regulations on technical capability notices. The Open Rights Group has recently revealed details of the proposed regulations.

Under the IPAct a technical capability notice can be issued to a telecommunications operator by the Secretary of State, with the approval of a Judicial Commissioner. A notice would require the operator to install specified technical facilities. The objective is to ensure that if the operator subsequently receives, say, an interception warrant it has the technical ability to comply with it. A technical capability notice does not itself require an operator to conduct an interception. It prepares the ground in advance by ensuring the operator has equipment in place.

The proposed regulations will spell out what kind of facilities a technical capability notice can require a telecommunications operator to install. For example, the consultation touches on one of the many controversial topics in the IPAct: the possible use of technical capability notices in effect to prevent telecommunications operators from providing users with end to end encryption facilities.

Telecommunications operators are widely defined in the IPAct to include not only telcos, ISPs and the like, but also webmail, social media platforms, cloud hosts and over-the-top communications providers.

Technical capability notices already exist, but in a much more limited form, under the Regulation of Investigatory Powers Act 2000 (RIPA). S.12 of RIPA enacted a three-layer scheme similar to that under the new IPAct:

  • first the statute, laying out in broad terms the Home Office’s powers to require an operator to install an interception capability;
  • second, regulations made under the Act. These put more flesh on the obligations and potentially narrow the categories of provider who could be made subject to a notice;
  • third, technical capability notices themselves, issued by the Secretary of State to individual service providers (but not necessarily to all of those within scope of the Act or the regulations).
These pave the way for actual interception warrants, requiring operators to carry out particular interceptions.

The main change with the IPAct is that technical capability notices are no longer limited to interception. They apply to three of the powers under the Act: interception (targeted, thematic and bulk), communications data acquisition (ordinary and bulk) and equipment interference (targeted, thematic and bulk).

Another high level change is that the IPAct allows technical capability notices to be given to private as well as to public telecommunications providers. The draft regulations reflect this expansion.

Also, unlike under RIPA, IPAct technical capability notices have to be approved by a Judicial Commissioner.

The proposed IPAct regulations are in many respects similar to the existing 2002 regulations made under RIPA. However there are some significant differences.

Communications data acquisition capability not subject to 10,000 person threshold

The existing RIPA interception capability regulations set a 10,000 person threshold below which an interception capability cannot be required. (It has never been very clear whether this referred to customers or end-users.) The proposed new regulations repeat this threshold for interception and equipment interference, albeit removing the existing limitation that the 10,000 persons be within the UK.

For communications data acquisition, however, the new draft IPAct regulations set no minimum threshold. Combine this with the IPAct’s enlarged scope, covering private and public telecommunications operators, and we have the startling prospect that any kind of organisation, business (other than excluded financial services businesses), institution, university, school, hospital, library, political party and so on could potentially be required to install a communications data acquisition capability. In theory this could even apply to private households, although it is difficult to imagine this ever being thought appropriate.

Communications data acquisition ‘black box’

The communications data acquisition aspects of the draft regulations differ from interception and equipment interference in another significant respect. The existing RIPA interception regulations are framed as obligations on operators to provide the capability themselves. The same is true of the new IPAct interception and equipment interference obligations. This approach allows operators to design or procure their own interception equipment, so long as it complies with the technical capability notice. 

The new IPAct communications data requirements, however, include a paragraph under which a technical capability notice could require a provider to install a government-provided ‘black box’:

“10. To install and maintain any apparatus provided to the operator by or on behalf of the Secretary of State for the purpose of enabling the operator to obtain or disclose communications data, including by providing and maintaining any apparatus, systems or other facilities or services necessary to install and maintain any apparatus so provided.”
This paragraph, unheralded during the Bill’s passage through Parliament, applies to both ordinary and bulk communications data acquisition capabilities. It is a substantial departure in kind from previous RIPA obligations.

New services

Unsurprisingly, since this was heavily trailed during the passage of the Bill, all three sets of provisions allow the imposition of obligations to notify the Home Office in advance of new and changed services. A technical capability notice would also be able to require the operator to “consider” the obligations and requirements imposed by any technical capability notice when designing or developing new telecommunications services or telecommunications systems.

The 2002 regulations contained no obligations of this kind.

End to end encryption

The most controversial aspect of technical capability notices throughout the passage of the Bill was whether the obligation to remove encryption could be used to prevent use of end to end encryption. On this topic the IP Act and the draft regulations in fact mirror quite closely an obligation that was always in the existing 2002 RIPA regulations:

“10. To ensure that the person on whose application the interception warrant was issued is able to remove any electronic protection applied by the service provider to the intercepted communication and the related communications data.”
The proposed IP Act regulations say (for interception):
“8. To provide and maintain the capability to disclose, where practicable, the content of communications or secondary data in an intelligible form and to remove electronic protection applied by or on behalf of the telecommunications operator to the communications or data, or to permit the person to whom the warrant is addressed to remove such electronic protection.”
However, while standalone end to end encryption software existed in 2002, it would not have been touched by the 2002 regulations, since the encryption was not applied by a communications service provider. Only comparatively recently have communications service providers offered their customers the ability to use end to end encryption, where the service provider does not have and never has had an encryption key.

This development has given rise to questions about whether a technical capability notice under the IP Act could be used to require a telecommunications operator to have a means of decrypting messages, effectively preventing it from providing end to end encryption facilities to its customers.

In Parliament the issue surfaced repeatedly during the passage of the Bill, culminating in a House of Lords debate on 19 October 2016 in which Home Office Minister Earl Howe was subjected to tenacious questioning from Lord Harris of Haringey.

The question of whether technical capability notices could be used in this way has never been satisfactorily resolved. The Home Office has repeatedly (and correctly) emphasised that the obligation can only apply to encryption ‘applied by or on behalf of’ the service provider. But it has never clarified when encryption would be regarded as applied by the provider and when by the user. Perhaps the closest it came was in the House of Lords debate when Earl Howe said:

“Any decision will have regard to the particular circumstances of the case, recognising that there are many different models of encryption, including many different models of end-to-end encryption, and that what is reasonably practicable for one telecommunications operator may not be for another.”
In that passage and elsewhere the Home Office has stressed that a service provider cannot be made to do anything that is not ‘reasonably practicable’. Thus Earl Howe, again in the House of Lords debate, said:
“… the company on whom the warrant is served will not be required to take any steps, such as the removal of encryption, if they are not reasonably practicable steps for that company to take. So a technical capability notice could not, in itself, authorise an interference with privacy. It would simply require a capability to be maintained that would allow a telecommunications operator to give effect to a warrant quickly and securely including, where applicable, the ability to remove encryption.”
He added:
“These safeguards ensure that an obligation to remove encryption under Clause 229 of the Bill will be subject to very strict controls and may be imposed only where it is necessary and proportionate, technically feasible and reasonably practicable for the relevant operator to comply.”
Later on he said:
“The Bill ensures that the Secretary of State must specifically consider the cost and technical feasibility of complying with an obligation to remove encryption as well as whether it is reasonably practicable.”
However it is important not to conflate the technical capability notice and a subsequent warrant. The raison d’être of a technical capability notice is to achieve a situation in which it is practicable for a service provider to assist with a warrant (see IPAct s.253(4)). The obligations in the draft regulations are those that the Secretary of State considers reasonable to impose for that purpose. When issuing a technical capability notice the Secretary of State has to consider, among other things, technical feasibility and cost.

The Act does provide that a warrant cannot require a service provider to do something that is not reasonably practicable. But a warrant is not a technical capability notice. Crucially, the Act lays down that where a technical capability notice is in place, reasonable practicability of assisting with a warrant is to be judged on the assumption that the technical capability notice has been complied with.

Thus for ordinary (non-bulk) interception S. 43(4) and (6) provide:

“(4) The relevant operator is not required to take any steps which it is not reasonably practicable for the relevant operator to take.” 
“(6) Where obligations have been imposed on a relevant operator (“P”) under section 253 (technical capability notices), for the purposes of subsection (4) the steps which it is reasonably practicable for P to take include every step which it would have been reasonably practicable for P to take if P had complied with all of those obligations.” 
For a technical capability notice the central concept is technical feasibility.

Clearly it is not technically feasible for an operator who provides its users with true end-to-end encryption facilities to remove the encryption, since it has no decryption key.
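The structural point can be illustrated with a toy Python sketch. Everything here is an illustrative assumption, not any real protocol: the Diffie-Hellman group (a Mersenne prime), the hash-based keystream cipher and the participants are all simplified stand-ins, deliberately insecure in practice. What the sketch shows is the essential property of end-to-end encryption: keys are derived at the endpoints, and the operator in the middle relays only public values and ciphertext, so it holds no material from which to decrypt.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters (illustrative only -- real systems use
# vetted groups or elliptic curves, never an ad hoc prime like this).
P = 2**127 - 1  # a Mersenne prime
G = 5

def keypair():
    # Private value stays on the user's device; only the public half is sent.
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # Both endpoints derive the same key; it never crosses the network.
    s = pow(other_pub, priv, P)
    return hashlib.sha256(s.to_bytes(16, "big")).digest()

def xor_stream(key, data):
    # Toy keystream cipher for illustration -- not secure in practice.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Two users each generate a keypair; only public halves cross the wire.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_sender = shared_key(a_priv, b_pub)
k_recipient = shared_key(b_priv, a_pub)
assert k_sender == k_recipient  # identical key at both ends

ciphertext = xor_stream(k_sender, b"meet at noon")

# The operator relays a_pub, b_pub and ciphertext only. Without a private
# value it cannot reconstruct the key, so there is nothing to hand over.
print(xor_stream(k_recipient, ciphertext))  # -> b'meet at noon'
```

The point of the sketch is that the operator's inability to decrypt is not a policy choice layered on top of the service; it is a mathematical consequence of where the private values live. A capability notice demanding removal of this encryption is in substance a demand to redesign the service so that the operator does hold a key.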

But what if the Home Office were to argue that it was technically feasible for the operator to adopt a different encryption model under which it had a key? If that argument held up then the service provider would (subject to the ‘applied by or on behalf of’ point) have to stop offering true end to end encryption facilities in order to comply with a notice. If it did not cease, then if it received a warrant it would be of no avail to say that it was not reasonably practicable to remove the encryption, since the Act would deem it to have complied with the technical capability notice.

Whether a technical capability notice could be used to require a provider to change the nature of a service that it was offering in this way is one of the great imponderables of this part of the legislation. The draft regulations shed no more light on the matter.

This is an area in which the interpretation that the Home Office places on the Act and the final regulations could be critical. The new oversight body could have an important role in proactively seeking out such interpretations and bringing them to public notice.

Equipment interference

A major change compared with the 2002 regulations is the extension of technical capability notices beyond the existing area of interception. The proposed regulations cover, as well as communications data acquisition already discussed, equipment interference aimed at obtaining communications, equipment data and other information. This is no surprise, since that is one of the changes introduced by the IPAct itself.

Nevertheless the idea that a telecommunications operator can be compelled to put in place technical facilities specifically to enable authorities to hack any equipment under a warrant remains surprising. This equipment interference obligation, perhaps more so than removal of encryption, deserves the epithet ‘back door’.

Notably, given the security concerns that would no doubt accompany the provision of a hacking gateway for the authorities, as with interception and communications data acquisition the draft regulations provide that an equipment interference capability notice can include a requirement to comply with security standards specified in the notice and any guidance issued by the Secretary of State. 

Under S.2(2)(c) of the IPAct the Secretary of State has a duty to have regard to the public interest in the integrity and security of telecommunication systems.

Consultation process

Under S.253(6) of the IPAct the Home Secretary must consult on the draft regulations. She is required to consult the Technical Advisory Board set up under the Act, operators 'appearing to the Secretary of State to be likely to be subject to any obligations specified in the regulations' and their representatives, and persons with relevant statutory functions (an example would presumably be the new Investigatory Powers Commissioner).

Notably absent from the must-consult list are the general public (who most of all stand to be affected by the Act) or any organisations representing the public in areas such as privacy and civil liberties. However, now that the proposed regulations have reached a wider audience than the must-consult list, more broadly based comment can be expected.

One point of interest is how far the Home Office’s statutory ‘must-consult’ obligation reaches. This is especially pertinent when, as already highlighted, the part of the draft regulations that deals with acquisition of communications data does not contain a 10,000 person minimum threshold.

So unlike for equipment interference and interception, which do specify a minimum 10,000 person limit, the communications data acquisition capability provisions (including the ability to require installation of a government-supplied 'black box') can be applied however few customers or users an operator may have. Moreover the obligations are not restricted to public operators. Private telecommunications operators can be included. As we have seen, thanks to the Act's wide definition of telecommunications operator that could cover many kinds of organisations.

This may create a conundrum. If it does not appear to the Home Secretary that private or small operators are likely to be subject to any obligations specified in the regulations, then she does not have to consult them or their representatives. But in that event, what would be the purpose of extending the scope of the regulations, specifically for communications data acquisition, to include all operators large or small, private or public, and apparently including organisations outside the traditional telco and ISP sectors? That could affect the scope of the consultation that the Secretary of State is obliged to undertake.

Wednesday, 18 January 2017

Internet legal developments to look out for in 2017

A preview of some of the UK internet legal developments that we can expect in 2017. Any proposed EU legislation will be subject to Brexit considerations and so may never happen in the UK.

EU copyright reform In 2016 the European Commission published proposals for a Directive on Copyright in the Digital Single Market (widely viewed as being in the main internet-unfriendly), for a Regulation extending the country of origin provisions of the Satellite and Cable Broadcasting Directive to broadcasters' ancillary online transmissions, and for a Regulation mandating a degree of online content portability within the EU. The legislative processes will continue through 2017.

EU online business As part of its Digital Single Market proposals the European Commission has published a proposal for a Regulation on "Geo-blocking and other forms of discrimination". It aims to prevent online retailers from discriminating, technically or commercially, on the basis of nationality, residence or location of a customer. 

UK criminal copyright infringement The Digital Economy Bill is about to start its Lords Committee stage. Among other things the Bill implements the government’s decision to seek an increase in the maximum sentence for criminal copyright infringement by communication to the public from two years to ten years. The Bill also redefines the offence in a way that, although intended to exclude minor infringements, has raised concerns that it in fact expands the scope of the offence.

Pending CJEU copyright cases Several copyright references are pending in the EU Court of Justice. Issues under consideration include communication to the public and magnet links (BREIN/Pirate Bay C-610/15 [AG Opinion delivered 8 February 2017]), links to infringing movies in an add-on media player (BREIN/Filmspeler C-527/15 [AG Opinion delivered 8 December 2016]), site blocking injunctions (BREIN/Pirate Bay), applicability of the temporary copies exception to viewing infringing movies (BREIN/Filmspeler) and cloud-based remote PVR (VCAST C-265/16).

Online pornography The Digital Economy Bill would grant powers to a regulator (intended to be the British Board of Film Classification) to determine age control mechanisms for internet sites that make ‘R18’ pornography available; and to direct ISPs to block such sites that either do not comply with age verification or contain material that would not be granted an R18 certificate. These aspects of the Bill have been criticised by the UN Special Rapporteur on freedom of expression, by the House of Lords Delegated Powers and Regulatory Reform Committee and by the House of Lords Constitution Committee.

Net neutrality and parental controls The net neutrality provisions of the EU Open Internet Access and Roaming Regulation potentially affect the ability of operators to choose to provide network-based parental control filtering to their customers. A transitional period for existing self-regulatory schemes expired on 31 December 2016. The government has said that although it does not regard the Regulation as outlawing the existing UK voluntary parental controls regime, to put the matter beyond doubt it will introduce an amendment to the Digital Economy Bill to put the parental controls scheme on a statutory basis. [Now Clause 91 'Internet filters'.]

TV-like regulation of the internet The review of the EU Audio Visual Media Services Directive continues. The Commission proposal adopted on 25 May 2016 would further extend the Directive's applicability to on-demand providers and internet platforms.

Cross-border liability and jurisdiction Ilsjan (Case C-194/16) is another CJEU reference on the Article 7(2) (ex-Art 5(3)) tort jurisdiction provisions of the EU Jurisdiction Regulation. The case concerns a claim for correction and removal of harmful comments. It asks questions around mere accessibility as a threshold for jurisdiction (as found in Pez Hejduk) and the eDate/Martinez ‘centre of interests’ criterion for recovery in respect of the entire harm suffered throughout the EU. Meanwhile significant decisions on extraterritoriality are likely to be delivered in the French Conseil d'Etat (CNIL/Google) and Canadian Supreme Court (Equustek/Google).

Online state surveillance The UK’s Investigatory Powers Act 2016 is expected to be implemented in stages throughout 2017. The Watson/Tele2 decision of the CJEU has already cast a shadow over the data retention provisions of the Act, which will almost certainly now have to be amended. The Watson case, which directly concerns the now expired data retention provisions of DRIPA, will shortly return to the Court of Appeal for further consideration in the light of the CJEU judgment. The IP Act (in particular the bulk powers provisions) may also be indirectly affected by pending cases in the CJEU (challenges to the EU-US Privacy Shield), in the European Court of Human Rights (ten NGOs challenging the existing RIPA bulk interception regime) and by a judicial review by Privacy International of an Investigatory Powers Tribunal decision on equipment interference powers. [Judicial review application dismissed 2 February 2017; under appeal to Court of Appeal.] Finally, Liberty has announced that it is launching a direct challenge in the UK courts against the IP Act bulk powers. [New revised and simplified mindmap of legal challenges (previous version here).]

 [Updated 3 March 2017 with various developments and new mindmap.]

Monday, 2 January 2017

Cyberleagle on Surveillance

For over two years I have been blogging on surveillance, a topic that, cuckoo-like, has grown to crowd out most other IT and internet law topics on this blog.

With the Investigatory Powers Act now on the UK statute book, this seems like a good moment to catalogue the 43 posts that this legislation and its preceding events have inspired.

20 August 2013: Everyman encounters Government. Prompted by reactions to Snowden. It's all about trust.

{8 April 2014: CJEU invalidates EU Data Retention Directive in Digital Rights Ireland. Validity of UK implementation by secondary legislation questionable.}

12 July 2014: Dissecting DRIP - the emergency Data Retention and Investigatory Powers Bill. Posted the day after the coalition government’s Friday publication of the DRIP Bill for introduction into Parliament on the Monday morning, on an emergency four day timetable. Still by far the most page views of any post on this blog. 

20 July 2014: The other side of communications data. Statistics on communications data acquisition errors with serious consequences: wrong accusations, search warrants, arrests. Updated since then with data from subsequent IOCCO Annual Reports. 

10 October 2014: Submissions to the Investigatory Powers Review. Various submissions (including mine) to David Anderson QC’s Investigatory Powers Review.

15 November 2014: Of straws and haystacks. Tracing the history of RIPA’s S.8(4) bulk interception power via the 1960s cable vetting scandal to S.4 of the Official Secrets Act 1920.

3 December 2014: Another round of data retention. The IP address resolution provisions of the Counter-Terrorism and Security Bill, amending DRIPA.

21 December 2014: A Cheltenham Carol. Five Ba-a-aack Doors.

2 January 2015: The tangled net of GCHQ’s fishing warrant. Detailed analysis of the S.8(4) RIPA bulk interception warrant.

2 February 2015: IP address resolution - a conundrum still unresolved? A short rant about the Counter-Terrorism and Security Bill.

{11 June 2015: "A Question of Trust" published.}

13 July 2015: Red lines and no-go zones - the coming surveillance debate. Discussion of  David Anderson Q.C.'s Investigatory Powers Review report "A Question of Trust".

12 August 2015: The coming surveillance debate. A 13 part series of posts analysing specific topics likely to feature in the forthcoming Investigatory Powers Bill.

5 September 2015: Predicting the UK’s new surveillance law. Nine predictions for the contents of the Bill covering bulk interception, broad Ministerial powers, browsing histories, digital footprints, data generation by decree, communications data/content boundary, third party data collection, request filter and judicial authorisation.

{4 November 2015: Draft Investigatory Powers Bill published.}

4 November 2015: Prediction and Verdict - the draft Investigatory Powers Bill. Contents of the draft Bill versus my 5 September predictions.

9 November 2015: From Oversight to Insight - Hidden Surveillance Law Interpretations. Arguing that the oversight body should proactively seek out and make public material legal interpretations on the basis of which powers are exercised or asserted.

23 December 2015: #IPBill Christmas Quiz. A bit of seasonal fun with the draft Bill, including the never to be forgotten definition “Data includes any information which is not data”. Five out of the ten points highlighted, including that one, have changed in the final legislation.

16 January 2016: An itemised phone bill like none ever seen. Adapted from my evidence to the pre-legislative scrutiny Joint Committee, analysing how internet connection records are richer, more far reaching and different in nature from the traditional itemised phone bill with which the government was at that stage inclined to compare them. 

7 February 2016: No Content: Metadata and the draft Investigatory Powers Bill. Highlighting the significance of communications data powers in the draft Bill.

16 February 2016: The draft Investigatory Powers Bill - start all over again? Discussion of the Joint Committee and ISC Reports on the draft Bill.

{1 March 2016: Investigatory Powers Bill introduced into Parliament.}

15 March 2016: Relevant Communications Data revisited. Parsing and visualising one of the most complex and critical definitions in the Bill.

19 March 2016: 20 points on the Investigatory Powers Bill, from future proofing to triple negatives. Storified 20 points tweeted immediately before publication of the Bill, with subsequent comments in the light of the Bill.

24 March 2016: All about the metadata. More visualisations of the Bill’s complex web of metadata definitions.

29 March 2016: Woe unto you, cryptographers! This little collection of Biblical quotations adapted to cryptography fell flat as a pancake…

1 April 2016: An official announcement. …but not as flat as this leaden attempt at an April Fool.

15 April 2016: Future-proofing the Investigatory Powers Bill. Arguing that the Bill’s attempt to future-proof powers by adopting a technologically neutral drafting approach repeats the error of RIPA. A better approach would be to future-proof the privacy-intrusion balance.

26 May 2016: The content v metadata contest at the heart of the Investigatory Powers Bill. A deep dive into the Bill’s dividing lines between content and metadata, including the new power of the intelligence agencies to extract some content and treat it as metadata. 

12 June 2016: The List. Dystopia looms, holding a clipboard.

19 July 2016: Data retention - the Advocate General opines. Summary of the Advocate General’s Opinion in the Watson/Tele2 case challenging DRIPA and the equivalent Swedish legislation.

11 August 2016: How secondary data got its name. An imagined Bill drafting committee meeting in Whitehall.

{19 August 2016: Bulk Powers Review published.}

7 September 2016: A trim for bulk powers? What might have been if the Bulk Powers Review had been commissioned and published at the start of the Parliamentary process.

{29 November 2016: Investigatory Powers Act gains Royal Assent.}

10 December 2016: Investigatory Powers Act 2016 Christmas Quiz. 20 questions to test your knowledge of the #IPAct. 

31 December 2016: The Investigatory Powers Act - swan or turkey? A post-legislative reflection on the Act.  

This marks the end of the beginning. Pending legal challenges, new legal challenges and Brexit will provide a rich seam of material for future blogging.

[Amended 21.25 2 Jan 2017 to add some {contextual events} and stylistic edits.]

Saturday, 31 December 2016

The Investigatory Powers Act - swan or turkey?

The Investigatory Powers Bill, now the newly minted Investigatory Powers Act, has probably undergone more scrutiny than any legislation in recent memory. Rarely, though, can the need for scrutiny have been so great.
Over 300 pages make up what then Prime Minister David Cameron described as the most important Bill of the last Parliament. When it comes into force the IP Act will replace much of RIPA (the Regulation of Investigatory Powers Act 2000), described by David Anderson Q.C.’s report A Question of Trust as ‘incomprehensible to all but a tiny band of initiates’. It will also supersede a batch of non-RIPA powers that had been exercised in secret over many years - some, so the Investigatory Powers Tribunal has found, on the basis of an insufficiently clear legal framework. 
None of this would have occurred but for the 2013 Snowden revelations of the scale of GCHQ’s use of bulk interception powers. Two years post-Snowden the government was still acknowledging previously unknown (except to those in the know) uses of opaque statutory powers. 
Three Reviews and several Parliamentary Committees later, it remains a matter of opinion whether the thousands of hours of labour that went into the Act have brought forth a swan or a turkey. If the lengthy incubation has produced a swan, it is one whose feathers are already looking distinctly ruffled following the CJEU judgment in Watson/Tele2, issued three weeks after Royal Assent. That decision will at a minimum require the data retention aspects of the Act to be substantially amended. 
So, swan or turkey?
Judicial approval
On the swan side warrants for interception and equipment interference, together with most types of power exercisable by notice, will be subject to prior approval by independent Judicial Commissioners. For some, doubts persist about the degree of the scrutiny that will be exercised. Nevertheless judicial approval is a significant improvement on current practice whereby the Secretary of State alone takes the decision to issue a warrant.
Codified powers
Also swan-like is the impressive 300 page codification of the numerous powers granted to law enforcement and intelligence agencies. A Part entitled ‘Bulk warrants’ is a welcome change from RIPA’s certificated warrants, which forced the reader to play hopscotch around a mosaic of convoluted provisions before the legislation would give up its secrets.
Granted, the IP Act also ties itself in a few impenetrable knots. Parts are built on shaky or even non-existent definitional foundations. But it would be churlish not to acknowledge the IP Act’s overall improvement over its predecessors. 
Parliamentary scrutiny
When we move to consider the Parliamentary scrutiny of bulk powers things become less elegant.
The pre-legislative Joint Committee acknowledged that witnesses were giving evidence on the basis of incomplete information. In response to the Joint Committee’s recommendation the government produced an Operational Case for Bulk Powers alongside the Bill’s introduction into Parliament. That added a little to the light that A Question of Trust had previously shed on the use of bulk powers.
But it was only with the publication of David Anderson’s Bulk Powers Review towards the end of the Parliamentary process that greater insight into the full range of ways in which bulk powers are used was provided from an uncontroversial source. (By way of example ‘selector’ - the most basic of bulk interception terms - appears 27 times in the Bulk Powers Review, five times in A Question of Trust and twice in the Operational Case, but not at all in either the Joint Parliamentary Scrutiny Committee Report or the Intelligence and Security Committee Report.)
By the time the Bulk Powers Review was published it was too late for the detailed information within it to fuel a useful Parliamentary debate on how any bulk powers within the Act should be framed. David Anderson touched on the timing when he declined to enter into a discussion of whether bulk powers might be trimmed:
“I have reflected on whether there might be scope for recommending the “trimming” of some of the bulk powers, for example by describing types of conduct that should never be authorised, or by seeking to limit the downstream use that may be made of collected material. But particularly at this late stage of the parliamentary process, I have not thought it appropriate to start down that path. Technology and terminology will inevitably change faster than the ability of legislators to keep up. The scheme of the Bill, which it is not my business to disrupt, is of broad future-proofed powers, detailed codes of practice and strong and vigorous safeguards. If the new law is to have any hope of accommodating the evolution of technology over the next 10 or 15 years, it needs to avoid the trap of an excessively prescriptive and technically-defined approach.”
In the event the legislation was waved through on the strength of the Bulk Powers Review’s finding that the powers have a clear operational purpose and that the bulk interception power is of vital utility.
Fully equipped scrutiny at an early stage of the Parliamentary process could have resulted in more closely tailored bulk powers. As discussed below (“Vulnerability to legal challenge”) breadth of powers may come back to haunt the government in the courts.
Mandatory data retention
Views on expanded powers to compel communications data retention are highly polarised. But swan or turkey, data retention will become an issue in the courts. The CJEU judgment in Watson/Tele2, although about the existing DRIPA legislation, will require changes to the IP Act. How extensive those changes need to be will no doubt be controversial and may lead to new legal challenges. So, most likely, will the extension of mandatory data retention to include generation and obtaining of so-called internet connection records: site-level web browsing histories.  
Many would say that officially mandated lists of what we have been reading, be that paper books or websites, cross a red line. In human rights terms that could amount to failure to respect the essence of privacy and freedom of expression: a power that no amount of necessity, proportionality, oversight or safeguarding can legitimise.
Limits on powers v safeguards
The Act is underpinned by the assumption that breadth of powers can be counterbalanced by safeguards (independent prior approval, access restrictions, oversight) and soft limits on their exercise (necessity and proportionality). 
Those may provide protection against abuse. That is of little comfort if the objection is to a kind of intended use: for instance mining the communications data of millions in order to form suspicions, rather than starting with grounds for specific suspicion.
The broader and less specific the power, the more likely it is that some intended but unforeseen or unappreciated use of it will be authorised without prior public awareness and consent. That happened with S.94 of the Telecommunications Act 1984 and, arguably, with bulk interception under RIPA. Certainly, the coming together of the internet and mobile phones resulted in a shift in the intrusion and privacy balance embodied in the RIPA powers. This was facilitated by the deliberate future-proofing of RIPA powers to allow for technological change, an approach repeated (not to its benefit, I would argue) in the IP Act.
In A Question of Trust David Anderson speculated on a future Panopticon of high tech intrusive surveillance powers:
“Much of this is technically possible, or plausible. The impact of such powers on the innocent could be mitigated by the usual apparatus of safeguards, regulators and Codes of Practice. But a country constructed on such a basis would surely be intolerable to many of its inhabitants. A state that enjoyed all those powers would be truly totalitarian, even if the authorities had the best interests of its people at heart.”
He went on to say, in relation to controlling the exercise of powers by reference to fundamental rights principles of necessity and proportionality:
“Because those concepts as developed by the courts are adaptable, nuanced and context-specific, they are well adapted to balancing the competing imperatives of privacy and security. But for the same reasons, they can appear flexible, and capable of subjective application. As a means of imposing strict limits on state power (my second principle, above) they are less certain, and more contestable, than hard-edged rules of a more absolute nature would be.”
The IP Act abjures hard-edged rules. Instead it grants broad powers mitigated by safeguards and by the day to day application of soft limits: necessity and proportionality.
The philosophy of granting broad powers counterbalanced by safeguards and soft limits reflects a belief that, because the UK has a long tradition of respect for liberty, we can and should trust our authorities, suitably overseen, with powers that we would not wish to see in less scrupulous hands. 
Another view is that the mark of a society with a long tradition of respect for liberty is that it draws clear red lines. It does not grant overly broad or far-reaching powers to state authorities, however much we may believe we can trust them (and their supervisors) and however many safeguards against abuse we may install. 
Both approaches are rooted in a belief (however optimistic that may sometimes seem) that our society is founded on deeply embedded principles of liberty. Yet they lead to markedly different rhetoric and results.
Be that as it may, the IP Act grants broad general powers. Will the Act foster trust in the system that it sets up? 
The question of trust
David Anderson’s original Review was framed as “A Question of Trust”. Although we may believe a system to be operated by dedicated public servants of goodwill and integrity, nevertheless for the sceptic the answer to the question of trust posed by intrusive state powers is found in a version of the precautionary principle: the price of liberty is eternal vigilance.
Whoever may have coined that phrase, the slavery abolitionist Wendell Phillips in 1852 emphasised that it concerns the people at large as well as institutions:
“Eternal vigilance is the price of liberty; … Only by continued oversight can the democrat in office be prevented from hardening into a despot; only by unintermitted agitation can a people be sufficiently awake to principle not to let liberty be smothered in material prosperity.”
Even those less inclined to scepticism may think that a system of broad, general powers and soft limits merits a less generous presumption of trust than specifically limited, concretely defined powers. 
Either way a heavy burden is placed on oversight bodies to ensure openness and transparency. To quote A Question of Trust: “…trust depends on verification rather than reputation, …”. 
One specific point deserves highlighting: the effectiveness of the 5 year review provided for by the IP Act will depend upon sufficient information about the operation of the Act being available for evaluation.
Hidden legal interpretations
Transparency brings us to the question of hidden legal interpretations. The Act leaves it up to the new oversight body whether or not proactively to seek out and publish material legal interpretations on the basis of which powers are exercised or asserted.
That this can be done is evident from the 2014 Report of Sir Mark Waller, the Intelligence Service Commissioner, in which he discusses whether there is a legal basis for thematic property interference warrants. That, however, is a beacon in the darkness. Several controversial legal interpretations were hidden until the aftermath of Snowden forced them into public light. 
David Anderson QC in his post-Act reflections has highlighted this as a “jury is out” point, emphasising that “the government must publicise (or the new Commission must prise out of it)” its internal interpretations of technical or controversial concepts in the new legislation. In A Question of Trust he had recommended that public authorities should consider how they could better inform Parliament and the public about how they interpret powers.
Realistically we cannot safely rely on government to do it. The Act includes a raft of new secrecy provisions behind which legal interpretations of matters such as who applies end-to-end encryption (the service provider or the user), the meaning of ‘internet communications service’, the dividing line between content and secondary data and other contentious points could remain hidden from public view. It will be interesting to see whether the future Investigatory Powers Commission will make a public commitment to implement the proposal.
Vulnerability to legal challenge
In the result the Act is long on safeguards but short on limits to powers. This structure looks increasingly likely to run into legal problems. 
Take the bulk interception warrant-issuing power. It encompasses a variety of differing techniques. They range from real-time application of 'strong selectors' at the point of interception (akin to multiple simultaneous targeted interception), through to pure ‘target discovery’: pattern analysis and anomaly detection designed to detect suspicious behaviour, perhaps in the future using machine learning and predictive analytics. Between the two ends of the spectrum are seeded analysis techniques, applied to current and historic bulk data, where the starting point for the investigation is an item of information associated with known or suspected wrongdoing.
The Act makes no differentiation between these different techniques. It is framed at an altogether higher level: necessity for general purposes (national security, alone or in conjunction with serious crime or UK economic well-being), proportionality and the like.
Statutory bulk powers could be differentiated and limited. For instance, distinctions could be made between seeded and unseeded data mining. If pattern recognition and anomaly detection are valuable for detecting computerised cyber attacks, legislation could specify their use for that purpose and restrict others. Such limitations could prevent them from being used to attempt to detect and predict suspicious behaviour in the general population, Minority Report-style.
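The seeded/unseeded distinction can be sketched in a few lines of code. This is a toy illustration only: the records, field names and threshold are invented for the example, and real bulk-analysis systems bear no resemblance to it in scale or sophistication.

```python
# Toy illustration (hypothetical data) of the seeded vs unseeded distinction.

records = [
    {"user": "alice", "contacted": "known-bad.example", "daily_msgs": 40},
    {"user": "bob",   "contacted": "shop.example",      "daily_msgs": 35},
    {"user": "carol", "contacted": "news.example",      "daily_msgs": 900},
]

def seeded_match(records, selector):
    """Seeded analysis: start from an item already associated with known or
    suspected wrongdoing (a 'strong selector') and return only matching records."""
    return [r["user"] for r in records if r["contacted"] == selector]

def unseeded_anomalies(records, factor=2.0):
    """Unseeded target discovery: no prior suspicion; flag anyone whose
    behaviour (here, message volume) deviates from the population norm."""
    mean = sum(r["daily_msgs"] for r in records) / len(records)
    return [r["user"] for r in records if r["daily_msgs"] > factor * mean]

print(seeded_match(records, "known-bad.example"))  # ['alice']
print(unseeded_anomalies(records))                 # ['carol']
```

The legal point tracks the code: `seeded_match` begins from a specific item linked to suspected wrongdoing, whereas `unseeded_anomalies` examines everyone's behaviour with no prior suspicion at all, which is why legislation could sensibly treat the two techniques differently.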
The lack of any such differentiation or limitation in relation to specific kinds of bulk technique renders the Act potentially vulnerable to future human rights challenges. Human rights courts are already suggesting that if bulk collection is not inherently repugnant, then at least the powers that enable it must be limited and differentiated.
Thus in Schrems the CJEU (echoing similar comments in Digital Rights Ireland at [57]) said:
“…legislation is not limited to what is strictly necessary where it authorises, on a generalised basis, storage … without any differentiation, limitation or exception being made in the light of the objective pursued.” (emphasis added)
The same principles are elaborated in the CJEU’s recent Watson/Tele2 judgment, criticising mandatory bulk communication data retention:
“It is comprehensive in that it affects all persons using electronic communication services, even though those persons are not, even indirectly, in a situation that is liable to give rise to criminal proceedings. It therefore applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious criminal offences. Further, it does not provide for any exception, and consequently it applies even to persons whose communications are subject, according to rules of national law, to the obligation of professional secrecy ….
[106] Such legislation does not require there to be any relationship between the data which must be retained and a threat to public security. In particular, it is not restricted to retention in relation to (i) data pertaining to a particular time period and/or geographical area and/or a group of persons likely to be involved, in one way or another, in a serious crime, or (ii) persons who could, for other reasons, contribute, through their data being retained, to fighting crime …” (emphasis added)
The CJEU is also due to rule on the proposed agreement between the EU and Canada on the sharing of Passenger Name Records (PNR data). The particular interest of the PNR case is that the techniques intended to be applied to bulk PNR data are similar to the kind of generalised target discovery techniques that could be applied to bulk data obtained under the IP Act powers. As described by Advocate General Mengozzi in his Opinion of 8 September 2016, this involves cross-checking PNR data with scenarios or profile types of persons at risk:
“… the actual interest of PNR schemes … is specifically to guarantee the bulk transfer of data that will allow the competent authorities to identify, with the assistance of automated processing and scenario tools or predetermined assessment criteria, individuals not known to the law enforcement services who may nonetheless present an ‘interest’ or a risk to public security and who are therefore liable to be subjected subsequently to more thorough individual checks.”
AG Mengozzi recommends that the Agreement must (among other things):
- set out clear and precise categories of data to be collected (and exclude sensitive data)
- include an exhaustive list of the offences that would entitle the authorities to process PNR data
- in order to minimise ‘false positives’ generated by automated processing, contain principles and explicit rules:
  • concerning scenarios, predetermined assessment criteria and databases with which PNR would be compared, which must
  • to a large extent make it possible to arrive at results targeting individuals who might be under a reasonable suspicion of participating in terrorism or serious transnational crime, and which must
  • not be based on an individual’s racial or ethnic origin, his political opinions, his religion or philosophical beliefs, his membership of a trade union, his health or his sexual orientation.
As bulk powers come under greater scrutiny it seems likely that questions of limitation and differentiation of powers will come more strongly to the fore. The IP Act’s philosophy of broad powers counterbalanced with safeguards and soft limits may have produced legislation too generalised in scope and reach to pass muster.

Success in getting broad generally framed powers onto the statute book, though it may please the government in the short term, may be storing up future problems in the courts. One wonders whether, in a few years’ time, the government will come to regret not having fashioned a more specifically limited and differentiated set of powers.

[Amended 31 December 2016 to make clear that not all of RIPA is replaced.]

Saturday, 10 December 2016

Investigatory Powers Act 2016 Christmas Quiz

[Updated 1 January 2017 with answers below]

Now that the Investigatory Powers Bill has received Royal Assent, here is a Christmas quiz on the IPAct and its history. 

For some questions the answer is precise, others may be less so. For some the correct answer may be “we don’t know”.

Answers in the New Year.

Q.1 How many new powers does the IPAct introduce: (a) None (b) One (c) Six (d) More than six?

Q.2 Which secret (until revealed in 2015) internal government interpretation of RIPA was described in the House of Commons as ‘a very unorthodox statutory construction’?

Q.3 The subject line of an email is part of its content for interception purposes. True or false?

Q.4 In the IPAct, how is the ban on revealing the contents or existence of a technical capability notice enforced?

Q.5 Under the IPAct, a service provider who wished to challenge a data retention notice in court could not do so because that would break the ban on revealing the notice’s existence or contents. True or false?

Q.6 Under the IPAct a university could be made to install interception capabilities on its internal network. True or false?

Q.7. How is ‘internet communications service’ defined in the IPAct?

Q.8. Who was Amy?

Q.9. Under the IPAct, information intercepted in bulk in order to obtain overseas-related communications needs a specific warrant to be accessed for domestic reasons. True or false?

Q.10. Under the IPAct a university could be made to generate and retain site-level web browsing histories of its academic staff and students. True or false?

Q.11. Can more or fewer bodies access communications data under the IPAct than under RIPA?

Q.12. In 2015 how many people were wrongly accused, arrested or subjected to search warrants as a result of communications data acquisition errors?

Q.13. How much time elapsed between the Home Secretary telling Parliament that the IPBill would not include powers to force UK companies to capture and retain third party internet traffic and this being written into the Bill?

Q.14. In the IPAct, what is the significance of inferred meaning?

Q.15. KARMA POLICE was (and may or may not still be) a GCHQ database of web browsing records revealed by the Edward Snowden documents. According to those documents how much data did it contain, representing what period of time?

Q.16. Which agency has used bulk data to analyse patterns of behaviour from which potential hostile actors could be identified?

Q.17. How many times does ‘proportionate’ appear in the text of the IPAct?

Q.18. For how long before the government publicly acknowledged its use was Section 94 Telecommunications Act 1984 utilised to collect bulk communications data from public electronic communications network providers?

Q.19. Was the government’s use of Section 94 for collecting bulk communications data legal or illegal?

Q.20. How frequently has Section 94 been used for collecting bulk communications data?


Q.1 How many new powers does the IPAct introduce: (a) None (b) One (c) Six (d) More than six?
According to the government, the Act introduces one new power: retention of internet connection records. Of the four possibilities, (b) One is the only answer that cannot be correct.

Internet connection records are a type of communications data. Powers to mandate retention of some kinds of communications data have existed since 2009. On one view, therefore, ICR retention is not a new power but an extension of an existing power. On that basis the correct answer is (a) None.

If, however, a new power includes the extension of an existing power, then several other extensions should equally be brought into account: retention of non-ICR communications data extended to datatypes beyond current powers; retention extended to include generating and obtaining data for retention; extension of most powers to include private telecommunications operators; power for agencies to extract some kinds of content from communications and treat it as metadata; and extension of the power to issue technical capability notices from interception to most other substantive powers. With ICRs that makes a total of (c) Six. You could argue that a more granular breakdown of these extensions yields a total of (d) More than six.

The total is also (d) More than six if we include powers previously exercised on the basis of opaque statutory provisions, such as S.94 of the Telecommunications Act 1984, that gave no indication they might be exercised in this kind of way.

Q.2 Which secret (until revealed in 2015) internal government interpretation of RIPA was described in the House of Commons as ‘a very unorthodox statutory construction’?
The interpretation of "person" so as to enable targeted interception warrants to be issued in respect of groups of persons (so-called thematic warrants) instead of named individuals or specific premises.

The remark was made by Joanna Cherry QC MP in Commons Committee on 12 April 2016:
“The current Home Secretary has apparently derived the authority to do so from a broad definition given to the word “person” that is found elsewhere in RIPA, despite the unequivocal reference to “one person” in section 8(1) of RIPA. I suggest that what has gone on in the past is a very unorthodox statutory construction.”
The existence of thematic warrants and the statutory basis asserted for them were revealed by the Intelligence and Security Committee in its report of March 2015:
“The term ‘thematic warrant’ is not one defined in statute. However, the Home Secretary clarified that Section 81(1) of RIPA defines a person as “any organisation or any association or combination of persons”, thereby providing a statutory basis for thematic warrants.”

Q.3 The subject line of an email is part of its content for interception purposes. True or false?
True, under RIPA. Under the IP Act it is more complicated.

The subject line would normally fall within the IP Act’s new definition of ‘content’ (S.261(6)) as an “element of the communication… which reveals anything of what might reasonably be considered to be the meaning (if any) of the communication…”.

However for interception the Act allows so-called ‘secondary data’ to be extracted from the content of a communication and treated as communications data instead of content. Secondary data could include, for instance, the date and time of a meeting set out in the subject line of an e-mail. The Act includes similar provisions for equipment interference.

Q.4 In the IPAct, how is the ban on revealing the contents or existence of a technical capability notice enforced?
A trick question, this one. Most of the IP Act’s secrecy provisions are accompanied by an enforcement mechanism: a criminal offence or injunction. Curiously, however, no enforcement mechanism is prescribed for the S.255(8) prohibition in respect of technical capability notices.

Q.5 Under the IPAct, a service provider who wished to challenge a data retention notice in court could not do so because that would break the ban on revealing the notice’s existence or contents. True or false?
The IPAct does not provide any secrecy exception for this situation. However, the non-disclosure duty is enforceable by the Secretary of State applying to court for an injunction. It is unlikely (to say the least) that a court would allow an injunction application to be used to prevent access to the courts or to frustrate the court’s own proceedings.

The same point arose in Commons Committee debate on 3 May 2016 in relation to technical capability notices. That provision (now S.255(8) – see Q.4) is differently worded in that it expressly allows for the Secretary of State to give permission for disclosure. Keir Starmer QC MP sought reassurance that the provision could not be used to prevent access to the court:
“I have no doubt that, if the Secretary of State exercised her power under clause 218(8) to prevent access to the courts, it would run straight into an article 6 access to courts argument that would succeed on judicial review. I had assumed that one could read into the clause by implication that permission would not be refused in a bona fide and proper case where access to court—or the relevant tribunal, which may be a better way of putting it—was an issue. If that were made clear for the record or by some redrafting of the clause, it would help. As I said, I think that, in practice, any court in this jurisdiction would strike down pretty quickly a Secretary of State who sought to prevent access to the court.”
The Solicitor General responded:
“I think that the hon. and learned Gentleman is right about that. On that basis, I will have another look at clause 218(8), to get it absolutely right. I reassure him that it is not the Government’s intention to preclude access to the court.”

Q.6 Under the IPAct a university could be made to install interception capabilities on its internal network. True or false?
True. However, the Act provides a three-layer structure for technical capability notices: the statute, regulations made under the statute, then notices issued by the Secretary of State within the scope of the regulations. The regulations have yet to be published, but could specify a narrower class of service providers to whom technical capability notices may be issued.

Q.7. How is ‘internet communications service’ defined in the IPAct?
It isn’t. The term underpins two of the conditions that determine when a mandatorily retained internet connection record can be accessed. Footnote 46 in the draft Communications Data Code of Practice is the closest we come to an indication of what it is intended to cover. The same omission featured in the predecessor DRIPA regulations.

Q.8. Who was Amy?
Amy was a fictitious “quiet, impressionable 14 year old schoolgirl” who featured in a series of National Crime Agency infographics supporting the case for retaining communications data and internet connection records.

Q.9. Under the IPAct, information intercepted in bulk in order to obtain overseas-related communications needs a specific warrant to be accessed for domestic reasons. True or false?
False. Although a targeted examination warrant is required in order for content to be selected for examination by reference to someone known to be within the British Islands at the time of the selection, that does not apply to non-content ‘secondary data’ (which itself can include some data extracted from content – see Q.3). 

Q.10. Under the IPAct a university could be made to generate and retain site-level web browsing histories of its academic staff and students. True or false?
True. A communications data retention notice can be issued against a public or a private telecommunications operator. A university operating its own network is a telecommunications operator. Communications data can include internet connection records, including site level browsing histories.

The draft Communications Data Code of Practice sets out factors that will be taken into account in deciding which operators in practice will receive notices.

Q.11. Can more or fewer bodies access communications data under the IPAct than under RIPA?
A like-for-like count is not easy, owing to differences in nomenclature and organisation. The overall count appears to be more or less the same.

A cull of authorities entitled to acquire communications data under RIPA was carried out in February 2015, when 13 authorities had their powers removed. One of those removed, the Food Standards Agency, is reinstated under the IP Act together with its Scottish counterpart Food Standards Scotland. The Prudential Regulation Authority will no longer be able to acquire communications data under the IP Act.

A detailed comparison of existing and proposed powers (other than for the police and intelligence services) is contained in the government’s “Operational case for the use of communications data by public authorities” (July 2016).

Q.12. In 2015 how many people were wrongly accused, arrested or subjected to search warrants as a result of communications data acquisition errors?

Q.13. How much time elapsed between the Home Secretary telling Parliament that the IPBill would not include powers to force UK companies to capture and retain third party internet traffic and this being written into the Bill?
11½ months (4 November 2015 to 19 October 2016).

Q.14. In the IPAct, what is the significance of inferred meaning?
The term does not appear in the statute itself. However, the draft Codes of Practice explain that it is an important concept in understanding the distinction between content and communications data.

Q.15. KARMA POLICE was (and may or may not still be) a GCHQ database of web browsing records revealed by the Edward Snowden documents. According to those documents how much data did it contain, representing what period of time?
17.8 billion rows, representing 3 months of data.

Q.16. Which agency has used bulk data to analyse patterns of behaviour from which potential hostile actors could be identified? 
MI6, according to example A11/2 annexed to the Bulk Powers Review.

Q.17. How many times does ‘proportionate’ appear in the text of the IPAct?
62 (compared with 48 in the draft Bill).

Q.18. For how long before the government publicly acknowledged its use was Section 94 Telecommunications Act 1984 utilised to collect bulk communications data from public electronic communications network providers?
About 12 years.

Q.19. Was the government’s use of Section 94 for collecting bulk communications data legal or illegal?
The Investigatory Powers Tribunal held that the use was within the scope of the S.94 power. However, before November 2015 it infringed Article 8 of the European Convention on Human Rights, because it was not foreseeable that S.94 would be used in that way and because, for most of that period, there was no adequate system of oversight.

Q.20. How frequently has Section 94 been used for collecting bulk communications data?
The Interception of Communications Commissioner’s July 2016 Review of Section 94 directions identified 15 extant bulk communications data directions under S.94. All those directions were for traffic data and required “regular feeds”.