Monday 29 May 2017

Squaring the circle of end to end encryption

Eager student: Encryption seems to be back in the news. Why has this come up again?
Scholarly lawyer: It never really went away. Ever since David Cameron sounded off about encryption before meeting Barack Obama in January 2015 it’s been bubbling under.
ES: What did David Cameron say about it?
SL: He said: “In extremis, it has been possible to read someone’s letter, to listen to someone’s call, to listen in on mobile communications, ... The question remains: Are we going to allow a means of communications where it simply is not possible to do that? My answer to that question is: no, we must not.” That sounded very much as if he wanted some kind of encryption ban.
ES: Didn’t Downing Street row back on that?
SL: At the end of June 2015 David Cameron said something very similar in Parliament. Downing Street followed up with: “The prime minister did not suggest encryption should be banned.” They said much the same to the BBC in July 2015.
ES: Now the focus seems to be specifically on end to end encryption.
SL: Yes. Amber Rudd said in March this year that E2E encryption was “completely unacceptable”. Downing Street weighed in again: “What the home secretary said yesterday is: where there are instances where law-enforcement agencies wish to gain access to messages which are important to an investigation, they should be able to do so.”
ES: Which brings us to this weekend?
SL: Yes. Amber Rudd has disclaimed any intention to ban end-to-end encryption completely, but at the same time she appears to want providers of E2E encrypted messaging services to provide a way of getting access.
ES: So where does that leave us?
SL: The government evidently wants to do something with end to end encryption. But exactly what is unclear.
ES: Can we ask them to make it clear?
SL: Many have tried. All have failed. That isn’t really surprising, since the very nature of end to end encryption is that the messaging provider has no way of decrypting the messages.
ES: So if the messaging provider does have a way in, it’s no longer true end to end encryption?
SL: Exactly.
ES: But hasn't end to end encryption been around for years?
SL: In the form of standalone software like PGP, yes. In fact that is what sparked the First Crypto War in the 1990s.
ES: Which ended up with universally available public key encryption?
SL: Exactly. The encryption genie couldn’t be put back in the bottle – you can write a public key encryption algorithm on a T-shirt - and they stopped fighting it.
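[Illustrative aside: the T-shirt remark is barely an exaggeration. A toy version of RSA, the best known public key algorithm, fits in a handful of lines of Python. The numbers below are deliberately tiny and utterly insecure; they are invented here purely to show how small the algorithm itself is, which is why attempts to suppress it failed.

  # Toy RSA with tiny numbers (insecure; only to show how little code a public key algorithm needs)
  p, q = 61, 53                      # two small primes; real keys use primes hundreds of digits long
  n, phi = p * q, (p - 1) * (q - 1)  # modulus and Euler totient
  e = 17                             # public exponent, chosen coprime to phi
  d = pow(e, -1, phi)                # private exponent: modular inverse of e

  message = 42
  ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
  assert pow(ciphertext, d, n) == message  # decrypt with the private key (d, n)
]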
ES: So what has changed now?
SL: Apps and the cloud. Software such as PGP is an add-on, like anti-virus software.  I make the decision to get PGP from somewhere and to use it with my e-mail. It has nothing to do with my e-mail provider.  But now messaging service providers are incorporating E2E encryption as part of their service.
ES: What difference does that make?
SL: Commercially, the provider will be seen as part of the loop and so as a target for regulatory action. Technically, if the communications touch the provider’s servers someone might think that the provider should be able to access them in response to a warrant.
ES: PGP-encrypted e-mails are also stored on the e-mail provider’s servers, but the provider can’t decrypt those.
SL: Certainly. But if the messaging service provider itself provides the ability for me to encrypt my messages as part of its service, then it could be said that it has more involvement. It may store some information on its servers, for instance so that I can set up a connection with an offline user.
ES: If the provider does all that, why can’t it decrypt my messages?
SL: Because I and my counterparty user are generating and applying the encryption keys. With full end to end encryption the service provider never possesses or sees the private key that my app uses to encrypt and decrypt messages.
ES: But that’s the case only for full end to end encryption, right?
SL: Yes, there are other encryption models where the service provider has a key that it could use to decrypt the message.
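[Illustrative aside: the difference between the two models can be sketched in a few lines of Python. This is a minimal sketch using the PyNaCl library, not a description of any particular messaging service; the names and the message are invented for illustration. In the end-to-end model each user’s device generates its own keypair and the provider only ever relays public keys and ciphertext; in the alternative model the provider itself holds a key with which it could decrypt.

  # End-to-end model: key material is generated and held on the users' own devices.
  from nacl.public import PrivateKey, Box     # asymmetric keys held by the users
  from nacl.secret import SecretBox           # symmetric key held by the provider (alternative model)
  import nacl.utils

  alice_sk = PrivateKey.generate()            # Alice's private key never leaves her device
  bob_sk = PrivateKey.generate()              # Bob's private key never leaves his device

  # The provider may relay the users' *public* keys, but never sees a private key.
  ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

  # The provider stores or relays only the ciphertext; holding no private key, it cannot decrypt it.
  assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"meet at noon"

  # Alternative model: the provider generates and retains a key of its own.
  provider_key = nacl.utils.random(SecretBox.KEY_SIZE)
  stored = SecretBox(provider_key).encrypt(b"meet at noon")
  assert SecretBox(provider_key).decrypt(stored) == b"meet at noon"   # provider can decrypt on demand

Which side of that line a given service falls on is what much of the rest of this discussion turns on.]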
ES: If it never sees the key and cannot decrypt your message, isn’t the service provider in the same position with end to end encryption as with original PGP? What can the service provider be made to do if it doesn’t have a key?
SL: Now we need to delve into the UK’s interception legislation. Buckle your seatbelt.
ES: Ready.
SL: As you know the new Investigatory Powers Act 2016, like the existing Regulation of Investigatory Powers Act 2000, includes power to serve an interception warrant on a telecommunications operator.
ES: Would that include a messaging provider?
SL: Yes. It shouldn’t include someone who merely supplies encryption software like PGP, but a messaging service provider would be in the frame to have a warrant served on it.
ES: What can a messaging provider be made to do?
SL: It could be required to assist with the implementation of the warrant. If it does have a key, then it could assist by using its key to decrypt any intercepted messages.
ES: Is that a new requirement under the IPAct?
SL: No, RIPA is the same. And even if the provider handed over only an encrypted message, a separate RIPA power could be deployed to make it use its key to decrypt the message.
ES: And if the telecommunications operator doesn’t have a key? How can it assist with the interception warrant?
SL: All it can do is hand over the encrypted message. Both RIPA and the new IPAct say that the telecommunications operator can be required to do only what is reasonably practicable in response to a warrant. If it has no key it cannot be made to do more.
ES: Is that it?
SL: No, the government has one more card, which might be a trump.  Under both the new IP Act and existing RIPA the Minister can serve a notice (a 'technical capability notice', or TCN) on a telecommunications operator requiring it to install a permanent interception capability. This can include the capability to remove any electronic protection applied ‘by or on behalf’ of the telecommunications operator.
ES: Does ‘electronic protection’ include encryption?
SL: Yes. But pay attention to ‘applied by or on behalf of’. If the encryption is applied by the user, not the telecommunications operator, then a TCN cannot require the telecommunications operator to remove it.
ES: So a lot could turn on whether, in the particular system used by the operator, the encryption is regarded as being applied by or on behalf of the operator?
SL: Yes. If so, then the TCN can require the operator to have the capability to remove it.
ES: But if the operator doesn’t have a key, how can that be reasonably practicable?
SL: For an operator subject to a TCN that is then served with a warrant, reasonable practicability is assessed on the assumption that it has the capability required by the TCN.
ES: So the operator is deemed to be able to do the impossible. How do we square that circle?
SL: A Secretary of State considering whether to issue a TCN has to take into account technical feasibility. Clearly it is not technically feasible for an operator who provides its users with true end-to-end encryption facilities to have a capability to remove the encryption, since it has no decryption key. That might mean that a TCN could not require an operator to do that.
ES: But what if the Secretary of State were to argue that it was technically feasible for the operator to adopt a different encryption model in which it had a key?
SL: Good point.  If that argument held up then the service provider would presumably have to stop offering true end to end encryption facilities in order to comply with a TCN.
ES: Could a TCN be used in that way, to make a telecommunications operator provide a different kind of encryption? Wouldn't that be tantamount to making it provide a different service?
SL: That is one of the great imponderables of this part of the legislation. The Act does not answer it, and the Home Office has never said clearly how it would interpret the power.
ES: How would we know whether the Secretary of State was trying to do this?
SL: That’s difficult, because a telecommunications operator is required to keep a TCN secret. One possibility is that the new Investigatory Powers Commissioner may proactively seek out controversial interpretations of the legislation that have been asserted and make them public.
ES: Is there a precedent for that?
SL: Yes, the Intelligence Services Commissioner Sir Mark Waller in his 2014 Report discussed whether there was a legal basis for thematic property interference warrants. David Anderson QC’s Bulk Powers Review has supported the idea that the Investigatory Powers Commissioner should do this.
ES: So what happens next?
SL: Draft TCN regulations have recently been consulted on and presumably will be laid before Parliament at some point after the election.  If those are approved, then the ground will have been prepared to approve and serve new TCNs once the IPAct comes into force, which will most likely be later this year.
ES: Thank you.

Sunday 21 May 2017

Time to speak up for Article 15

Article 15 of the ECommerce Directive lays down the basic principle that EU Member States cannot impose a general obligation on internet intermediaries to monitor what people say online. We in the UK may have to start worrying for Article 15. It could easily be overlooked, or even deliberately left behind, when we start the process of converting EU law into domestic UK law in preparation for leaving the EU.

Article 15 is a strong candidate for the most significant piece of internet law in the UK and continental Europe. It is the stent that keeps the arteries of the internet open. It prevents the state from turning internet gateways into checkpoints at which the flow of information could be filtered, controlled and blocked.

The principle embodied in Article 15 is currently under pressure: from policymakers within and outside Brussels, from antagonistic business sectors, from the security establishment and potentially from all manner of speech prohibitionists. The common theme is that online intermediaries – ISPs, telecommunications operators, social media platforms - are gatekeepers who can and should be pressed into active service of the protagonists’ various causes.

Article 15 stands in the way of the blunt instrument of compulsory general monitoring and filtering. It does so not for the benefit of commercial platforms and ISPs, but to fulfil the policy aim of protecting the free flow of information and ultimately the freedom of speech of internet users.

Freedom of expression is not just any old policy aim, but a universal value at the heart of human rights – whether we look at Article 19 of the Universal Declaration of Human Rights, Article 10 of the European Convention, Article 11 of the EU Charter, the US First Amendment or the unwritten British historical attachment to freedom of the press. It is particularly precious because, for better or worse, speech reflects our very selves. “Give me the liberty to know, to utter, and to argue freely according to conscience, above all liberties.” (John Milton)

Conversely, freedom of expression has always been threatened by governments whose first instinct is to control. That is one reason why, perhaps more so than for any other human right, defenders of free speech find themselves taking principled stands on the most unattractive ground. “Because if you don't stand up for the stuff you don't like, when they come for the stuff you do like, you've already lost.” (Neil Gaiman)

The peculiar vice of compelled general monitoring, however, is that we never get to that point. If the filtered and blocked speech doesn’t see the light of day it never gets to be debated, prosecuted, tested, criticised or defended. To some, that may be a virtue not a vice.

Where freedom of speech is concerned, if principle is allowed to take second place to the exigencies of the moment we find ourselves not so much on a slippery slope as in a headlong rush down the Cresta Run. So it is with Article 15. The queue of noble causes on whose behalf we are urged to compel gateways to be gatekeepers - countering copyright infringement, trolling, hate speech, terrorism, pornography, fake news and the rest - stretches round the block.

We defend the right to bad speech for the sake of the good. We understand the impossibility of drawing a bright line between bad and good speech. We regulate bad speech only at the peril of the good. The peril is greater when the regulatory implement of choice is a tool as blunt as general monitoring.

Article 15 lays down a principle that applies across the board, from copyright to terrorism. EU Member States must not impose on internet intermediaries (conduits, hosts and network caches) a general obligation to monitor or actively to seek facts or circumstances indicating illegal activity. Intermediaries cannot be made to snuffle around their systems looking for unlawful activities. Article 15 goes hand in hand with the Directive’s liability shields under which conduits, hosts and network caches have various degrees of protection from criminal and civil liability for the activities of their users.

It is only too easy for policymakers to point the finger at intermediaries and demand that they do more to control the unpleasant and sometimes illegal things that people do on their systems. Policymakers see intermediaries as points of least cost enforcement: it is more efficient to enforce at a chokepoint than to chase tens of thousands of individual wrongdoers. The theory is made explicit in Recital (59) of the EU Copyright in the Information Society Directive:

“In the digital environment, in particular, the services of intermediaries may increasingly be used by third parties for infringing activities. In many cases such intermediaries are best placed to bring such infringing activities to an end.”
Mr Justice Arnold in Cartier explained the policy that underlies Recital (59):
“As can be seen from recital (59) to the Information Society Directive, the economic logic of granting injunctions against intermediaries such as ISPs is that they are the "lowest cost avoiders" of infringement. That is to say, it is economically more efficient to require intermediaries to take action to prevent infringement occurring via their services than it is to require rightholders to take action directly against infringers. Whether that is correct as a matter of economics is not for me to judge. Nor is it for me to judge whether it is good policy in other ways. That judgement has already been made by the legislators …”
At the same time, Article 15 of the ECommerce Directive constrains the breadth of injunctions that courts can grant against intermediaries under the Copyright and Enforcement Directives. The effect of Article 15 can be seen in the ECJ decisions of SABAM v Scarlet and SABAM v Netlog prohibiting content filtering injunctions, and in Arnold J’s Cartier judgment itself:
“If ISPs could be required to block websites without having actual knowledge of infringing activity, that would be tantamount to a general obligation to monitor.”
But if intermediaries are best placed to stop infringement, why should Article 15 constrain what can be imposed on them? Why shouldn’t the intermediaries be required to monitor?

The only sense in which intermediaries could be seen as best placed is that, since users’ communications flow through their systems, they have the potential to be technical chokepoints. In every other respect intermediaries are poorly placed to make decisions on legality of content and thus on what to block.

Intermediary enforcement risks exaggerating the ease with which unlawful behaviour can be identified, often assuming that illegal content is identifiable simply by looking at it. In relatively few categories is illegality manifest. Legality is predominantly a matter of factual investigation and judgement. That is why it is preferable to have independent courts ruling on matters of illegality rather than compelling private platforms to attempt it and have them overblock out of fear of liability or sanctions.

A too narrowly focused cost analysis tends to underplay or even ignore the negative externalities and unintended consequences of compelling gateways to act as gatekeepers. It excludes the broader implications of reinforcing chokepoints: the creation of a climate in which playing gatekeeper on behalf of the state and its proxies becomes the norm. In a broader context the least cost enforcer may turn out to be the highest cost enforcer.

Notice-based intermediary liability systems result in material being removed before a court determines whether it is unlawful. That already carries a risk of overcautious blocking or removal. Compelled proactive monitoring and filtering, since it blocks information about which no complaint has been made, moves the scale of risk to another level. It is akin to prior restraint on a grand scale, conducted not by courts after hearing evidence but by private entities made to act as investigator, prosecutor, judge, jury and executioner.

Our aversion to prior restraint also reflects the fact that the public are sometimes well served by the airing of something that at first blush might appear to be against the strict letter of the law. Speech may be rendered lawful by a public interest defence, or by fundamental freedom of speech considerations. Or a court might decide that, even though the speech is unlawful, the appropriate remedy is damages rather than removal. Legality of speech, even in areas such as copyright, can be a heavily nuanced matter. Proactive general monitoring obligations allow for no such subtlety.

Some may argue that in modern times the quid pro quo for living with freedom of speech has been that speech is generally mediated through professional, responsible editors. And that we need to put that genie back in the bottle by converting online intermediaries into editors and publishers, responsible for what other people say on their platforms.

Never mind whether that could be achieved: the argument misunderstands the nature of freedom of expression. The great advance of the internet has been to bring about something akin to the golden age of pamphleteering, freeing mass individual speech from the grip of the mass media. District Judge Dalzell was right when, in ACLU v Reno, he said:

“As the most participatory form of mass speech yet developed, the internet deserves the highest protection from governmental intrusion.”
The US Supreme Court in the same case said:
“Through the use of chat rooms, any person with a phone line can become a town crier with a voice that resonates farther than it could from any soapbox. Through the use of Web pages, mail exploders, and newsgroups, the same individual can become a pamphleteer.”
Those quotations were from 1996 and 1997. They are, if anything, more relevant now. Individual, unmediated speech deserves more, not less, protection than the traditional press.

It may be discomfiting that the kind of vitriol that used to be confined to Speaker's Corner can now reach an audience of millions. But freedom of individual speech was never something only to be tolerated as a tourist curiosity, or indulged as long as it was hidden away in a pub saloon bar. Nor, as we know from the ECtHR decision in Handyside, is freedom of expression confined to that which would not offend in a genteel drawing room.

Article 19 of the 1948 Universal Declaration of Human Rights is not predicated on the assumption of mediated speech. It articulates an individual, personal right that transcends place, time and medium and could have been written with the internet in mind:

“Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”
Article 15 stands squarely in the path of compelling mediated speech through the instrument of general monitoring. So why might it be vulnerable to being overlooked in the Brexit process?

Caveat: what follows is based on the existing Conservative government’s plans for a Great Repeal Bill and is subject to the outcome of the General Election.

Article 15 and Brexit

[Note: this section is now overtaken by the European Union (Withdrawal) Act 2018. Article 15 is probably saved by virtue of Section 4 of the Act.] 


Article 15 operates at several levels. If a Member State were to legislate in breach of Article 15, a national court would be obliged to disapply the legislation. So it acts as a powerful constraint on Member States at the policy and legislative level. As we have seen it also constrains Member States’ courts. They cannot issue an order that would impose a general monitoring obligation. They cannot interpret a domestic statute or develop a common law obligation in a way that is contrary to Article 15. Regulatory and enforcement bodies are similarly constrained.

The Brexit starting point is that the incumbent government has committed to continuing existing EU law through the Great Repeal Bill. Theresa May says in the introduction to the Great Repeal Bill White Paper:

“Our decision to convert the ‘acquis’ – the body of European legislation – into UK law at the moment we repeal the European Communities Act is an essential part of this plan.

This approach will provide maximum certainty as we leave the EU. The same rules and laws will apply on the day after exit as on the day before. It will then be for democratically elected representatives in the UK to decide on any changes to that law, after full scrutiny and proper debate.

… Th[e Great Repeal] Bill will, wherever practical and appropriate, convert EU law into UK law from the day we leave so that we can make the right decisions in the national interest at a time that we choose.”
On that basis Article 15 ought to be continued post-Brexit. However there is a technical problem. Although it is in a Directive, and so was required to be implemented in UK law, the text of Article 15 appears nowhere in UK domestic legislation. Depending on how the proposed Great Repeal Bill is drafted, Article 15 may have to be specifically written in to UK legislation in order to continue post-Brexit.

The White Paper recognises the need to write EU Regulations into domestic law, but appears to assume that since a Directive will already have been implemented in UK domestic law it just needs to be preserved post-Brexit:

“• the Bill will convert directly-applicable EU law (EU regulations) into UK law;

• it will preserve all the laws we have made in the UK to implement our EU obligations”

Article 15 could run the risk of falling between the cracks.

In any event the desirability of continuing Article 15 may not be universally accepted. UK Music, in its ‘Music 2017 Manifesto’, has noted the opportunity that Brexit presents to ‘place responsibility on internet service providers and require them to have a duty of care for copyright protected music’. If that implies proactive monitoring it would put Article 15 in question. Where one industry leads, others may follow. A government interested for its own purposes in turning the screw on intermediaries might not welcome the impediment of Article 15. It might be tempted to invoke the ‘wherever practical and appropriate’ White Paper qualification on continuation of existing EU law.

“Freedom of expression is not self-perpetuating, but rather has to be maintained through the constant vigilance of those who care about it.” So said Index on Censorship in 1972. The run-up to Brexit may be a time for especial vigilance.


[Amended 7 April 2019 to include reference to the European Union (Withdrawal) Act 2018.] 





Monday 8 May 2017

Back doors, black boxes and #IPAct technical capability regulations

The Home Office has launched an under-the-radar consultation on a critical step in the implementation of the Investigatory Powers Act (IPAct): the regulations on technical capability notices. The Open Rights Group has recently revealed details of the proposed regulations.

Under the IPAct a technical capability notice can be issued to a telecommunications operator by the Secretary of State, with the approval of a Judicial Commissioner. A notice would require the operator to install specified technical facilities. The objective is to ensure that if the operator subsequently receives, say, an interception warrant it has the technical ability to comply with it. A technical capability notice does not itself require an operator to conduct an interception. It prepares the ground in advance by ensuring the operator has equipment in place.

The proposed regulations will spell out what kind of facilities a technical capability notice can require a telecommunications operator to install. For example, the consultation touches on one of the many controversial topics in the IPAct: the possible use of technical capability notices in effect to prevent telecommunications operators from providing users with end to end encryption facilities.

Telecommunications operators are widely defined in the IPAct to include not only telcos, ISPs and the like but also web e-mail, social media platforms, cloud hosts and over the top communications providers.

Technical capability notices already exist, but in a much more limited form, under the Regulation of Investigatory Powers Act 2000 (RIPA). S.12 of RIPA enacted a three-layer scheme similar to that under the new IPAct:

  • first the statute, laying out in broad terms the Home Office’s powers to require an operator to install an interception capability;
  • second, regulations made under the Act. These put more flesh on the obligations and potentially narrow the categories of provider who could be made subject to a notice;
  • third, technical capability notices themselves, issued by the Secretary of State to individual service providers (but not necessarily to all of those within scope of the Act or the regulations).
These pave the way for actual interception warrants, requiring operators to carry out particular interceptions.

The main change with the IPAct is that technical capability notices are no longer limited to interception. They apply to three of the powers under the Act: interception (targeted, thematic and bulk), communications data acquisition (ordinary and bulk) and equipment interference (targeted, thematic and bulk).

Another high level change is that the IPAct allows technical capability notices to be given to private as well as to public telecommunications providers. The draft regulations reflect this expansion.

Also, unlike under RIPA, IPAct technical capability notices have to be approved by a Judicial Commissioner.

The proposed IPAct regulations are in many respects similar to the existing 2002 regulations made under RIPA. However there are some significant differences.

Communications data acquisition capability not subject to 10,000 person threshold

The existing RIPA interception capability regulations set a 10,000 person threshold below which an interception capability cannot be required. (It has never been very clear whether this referred to customers or end-users.) The proposed new regulations repeat this threshold for interception and equipment interference, albeit removing the existing limitation that the 10,000 persons be within the UK.

For communications data acquisition, however, the new draft IPAct regulations set no minimum threshold. Combine this with the IPAct’s enlarged scope, covering private and public telecommunications operators, and we have the startling prospect that any kind of organisation, business (other than excluded financial services businesses), institution, university, school, hospital, library, political party and so on could potentially be required to install a communications data acquisition capability. In theory this could even apply to private households, although it is difficult to imagine this ever being thought appropriate.

Communications data acquisition ‘black box’

The communications data acquisition aspects of the draft regulations differ from interception and equipment interference in another significant respect. The existing RIPA interception regulations are framed as obligations on operators to provide the capability themselves. The same is true of the new IPAct interception and equipment interference obligations. This approach allows operators to design or procure their own interception equipment, so long as it complies with the technical capability notice. 

The new IPAct communications data requirements, however, include a paragraph under which a technical capability notice could require a provider to install a government-provided ‘black box’:

“10. To install and maintain any apparatus provided to the operator by or on behalf of the Secretary of State for the purpose of enabling the operator to obtain or disclose communications data, including by providing and maintaining any apparatus, systems or other facilities or services necessary to install and maintain any apparatus so provided.”
This paragraph, unheralded during the Bill’s passage through Parliament, applies to both ordinary and bulk communications data acquisition capabilities. It is a substantial departure in kind from previous RIPA obligations.

New services

Unsurprisingly, since this was heavily trailed during the passage of the Bill, all three sets of provisions allow the imposition of obligations to notify the Home Office in advance of new and changed services. A technical capability notice would also be able to require the operator to “consider” the obligations and requirements imposed by any technical capability notice when designing or developing new telecommunications services or telecommunications systems.

The 2002 regulations contained no obligations of this kind.

End to end encryption

The most controversial aspect of technical capability notices throughout the passage of the Bill was whether the obligation to remove encryption could be used to prevent use of end to end encryption. On this topic the IP Act and the draft regulations in fact mirror quite closely an obligation that was always in the existing 2002 RIPA regulations:

“10. To ensure that the person on whose application the interception warrant was issued is able to remove any electronic protection applied by the service provider to the intercepted communication and the related communications data.”
The proposed IP Act regulations say (for interception):
“8. To provide and maintain the capability to disclose, where practicable, the content of communications or secondary data in an intelligible form and to remove electronic protection applied by or on behalf of the telecommunications operator to the communications or data, or to permit the person to whom the warrant is addressed to remove such electronic protection.”
However, while standalone end to end encryption software existed in 2002, it would not have been touched by the 2002 regulations, since the encryption was not applied by a communications service provider. Only comparatively recently have communications service providers offered their customers the ability to use end to end encryption, where the service provider does not have and never has had an encryption key.

This development has given rise to questions about whether a technical capability notice under the IP Act could be used to require a telecommunications operator to have a means of decrypting messages, effectively preventing it from providing end to end encryption facilities to its customers.

In Parliament the issue surfaced repeatedly during the passage of the Bill, culminating in a House of Lords debate on 19 October 2016 in which Home Office Minister Earl Howe was subjected to tenacious questioning from Lord Harris of Haringey.

The question of whether technical capability notices could be used in this way has never been satisfactorily resolved. The Home Office has repeatedly (and correctly) emphasised that the obligation can only apply to encryption ‘applied by or on behalf of’ the service provider. But it has never clarified when encryption would be regarded as applied by the provider and when by the user. Perhaps the closest it came was in the House of Lords debate when Earl Howe said:

“Any decision will have regard to the particular circumstances of the case, recognising that there are many different models of encryption, including many different models of end-to-end encryption, and that what is reasonably practicable for one telecommunications operator may not be for another.”
In that passage and elsewhere the Home Office has stressed that a service provider cannot be made to do anything that is not ‘reasonably practicable’. Thus Earl Howe, again in the House of Lords debate, said:
“… the company on whom the warrant is served will not be required to take any steps, such as the removal of encryption, if they are not reasonably practicable steps for that company to take. So a technical capability notice could not, in itself, authorise an interference with privacy. It would simply require a capability to be maintained that would allow a telecommunications operator to give effect to a warrant quickly and securely including, where applicable, the ability to remove encryption.”
He added:
“These safeguards ensure that an obligation to remove encryption under Clause 229 of the Bill will be subject to very strict controls and may be imposed only where it is necessary and proportionate, technically feasible and reasonably practicable for the relevant operator to comply.”
Later on he said:
“The Bill ensures that the Secretary of State must specifically consider the cost and technical feasibility of complying with an obligation to remove encryption as well as whether it is reasonably practicable.”
However it is important not to conflate the technical capability notice and a subsequent warrant. The raison d’etre of a technical capability notice is to achieve a situation in which it is practicable for a service provider to assist with a warrant (see IPAct s. 253(4)).  The obligations in the draft regulations are those that the Secretary of State considers reasonable to impose for that purpose.  When issuing a technical capability notice the Secretary of State has to consider, among other things, technical feasibility and cost.

The Act does provide that a warrant cannot require a service provider to do something that is not reasonably practicable. But a warrant is not a technical capability notice. Crucially, the Act lays down that where a technical capability notice is in place, reasonable practicability of assisting with a warrant is to be judged on the assumption that the technical capability notice has been complied with.

Thus for ordinary (non-bulk) interception S. 43(4) and (6) provide:

“(4) The relevant operator is not required to take any steps which it is not reasonably practicable for the relevant operator to take.” 
“(6) Where obligations have been imposed on a relevant operator (“P”) under section 253 (technical capability notices), for the purposes of subsection (4) the steps which it is reasonably practicable for P to take include every step which it would have been reasonably practicable for P to take if P had complied with all of those obligations.” 
For a technical capability notice the central concept is technical feasibility.

Clearly it is not technically feasible for an operator who provides its users with true end-to-end encryption facilities to remove the encryption, since it has no decryption key.

But what if the Home Office were to argue that it was technically feasible for the operator to adopt a different encryption model under which it had a key? If that argument held up then the service provider would (subject to the ‘applied by or on behalf of’ point) have to stop offering true end to end encryption facilities in order to comply with a notice. If it did not cease, then if it received a warrant it would be of no avail to say that it was not reasonably practicable to remove the encryption, since the Act would deem it to have complied with the technical capability notice.
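What such an alternative model might look like is easiest to see in outline. The sketch below (Python, using the PyNaCl library) shows one hypothetical ‘provider retains a key’ design in which each per-message key is additionally wrapped to a long-term key held by the operator, a form of key escrow. It is purely illustrative and invented for this post: nothing in the Act or the draft regulations prescribes any particular design, and delivery of the message key to the intended recipient is omitted for brevity.

  # Hypothetical escrow design: each per-message key is wrapped to a provider-held keypair.
  from nacl.public import PrivateKey, SealedBox
  from nacl.secret import SecretBox
  import nacl.utils

  provider_sk = PrivateKey.generate()        # long-term escrow keypair retained by the operator

  def send(plaintext: bytes):
      msg_key = nacl.utils.random(SecretBox.KEY_SIZE)            # fresh symmetric key per message
      ciphertext = SecretBox(msg_key).encrypt(plaintext)          # what the recipient's app would decrypt
      escrow = SealedBox(provider_sk.public_key).encrypt(msg_key) # copy of the key only the operator can open
      return ciphertext, escrow                                   # (delivery of msg_key to the recipient omitted)

  # In response to a warrant the operator could recover the message key and decrypt:
  ciphertext, escrow = send(b"meet at noon")
  recovered_key = SealedBox(provider_sk).decrypt(escrow)
  assert SecretBox(recovered_key).decrypt(ciphertext) == b"meet at noon"

A service built along these lines is no longer true end to end encryption, which is precisely why the question of whether a technical capability notice could compel such a change matters.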

Whether a technical capability notice could be used to require a provider to change the nature of a service that it was offering in this way is one of the great imponderables of this part of the legislation. The draft regulations shed no more light on the matter.

This is an area in which the interpretation that the Home Office places on the Act and the final regulations could be critical. The new oversight body could have an important role in proactively seeking out such interpretations and bringing them to public notice.

Equipment interference

A major change compared with the 2002 regulations is the extension of technical capability notices beyond the existing area of interception. The proposed regulations cover, as well as communications data acquisition already discussed, equipment interference aimed at obtaining communications, equipment data and other information. This is no surprise, since that is one of the changes introduced by the IPAct itself.

Nevertheless the idea that a telecommunications operator can be compelled to put in place technical facilities specifically to enable authorities to hack any equipment under a warrant remains surprising. This equipment interference obligation, perhaps more so than removal of encryption, deserves the epithet ‘back door’.

Notably, given the security concerns that would no doubt accompany the provision of a hacking gateway for the authorities, the draft regulations provide that an equipment interference capability notice (as with interception and communications data acquisition) can include a requirement to comply with security standards specified in the notice and with any guidance issued by the Secretary of State.

Under S.2(2)(c) of the IPAct the Secretary of State has a duty to have regard to the public interest in the integrity and security of telecommunication systems.

Consultation process

Under S.253(6) of the IPAct the Home Secretary must consult on the draft regulations. She is required to consult the Technical Advisory Board set up under the Act, operators 'appearing to the Secretary of State to be likely to be subject to any obligations specified in the regulations' and their representatives, and persons with relevant statutory functions (an example would presumably be the new Investigatory Powers Commissioner).

Notably absent from the must-consult list are the general public (who most of all stand to be affected by the Act) or any organisations representing the public in areas such as privacy and civil liberties. However, now that the proposed regulations have reached a wider audience than the must-consult list, more broadly based comment can be expected.

One point of interest is how far the Home Office’s statutory ‘must-consult’ obligation reaches. This is especially pertinent when, as already highlighted, the part of the draft regulations that deals with acquisition of communications data does not contain a 10,000 person minimum threshold.

So unlike for equipment interference and interception, which do specify a minimum 10,000 person limit, the communications data acquisition capability provisions (including the ability to require installation of a government-supplied 'black box') can be applied however few customers or users an operator may have. Moreover the obligations are not restricted to public operators. Private telecommunications operators can be included. As we have seen, thanks to the Act's wide definition of telecommunications operator that could cover many kinds of organisations.

This may create a conundrum. If it does not appear to the Home Secretary that private or small operators are likely to be subject to any obligations specified in the regulations, then she does not have to consult them or their representatives. But in that event, what would be the purpose of extending the scope of the regulations, specifically for communications data acquisition, to include all operators, large or small, private or public, apparently including organisations outside the traditional telco and ISP sectors? That could affect the scope of the consultation that the Secretary of State is obliged to undertake.