Monday, 30 April 2018

The Electronic Commerce Directive – a phantom demon?

Right now the ECommerce Directive – or at any rate the parts that shield hosting intermediaries from liability for users’ content – is under siege. The guns are blazing from all directions: the Prime Minister’s speech in Davos, Culture Secretary Matt Hancock’s speech at the Oxford Media Convention on 12 March 2018 and the European Commission’s Recommendation on Tackling Illegal Content Online all take aim at the shield, or at its linked bar on imposing general monitoring obligations on conduits, caches and hosts. The proposed EU Copyright Directive is attacking from the flanks.

The ECommerce Directive is, of course, part of EU law. As such the UK could, depending on what form Brexit takes, diverge from it post-Brexit. The UK government has identified the Directive as a possible divergence area and Matt Hancock's Department for Digital, Culture, Media and Sport (DCMS) is looking at hosting liability.
The status quo

Against this background it is worth looking behind the polarised rhetoric that characterises this topic and, before we decide whether to take a wrecking ball to the Directive's liability provisions, take a moment to understand how they work.  As so often with internet law, the devil revealed by the detail is a somewhat different beast from that portrayed in the sermons.
We can already sense something of that disparity. In her Davos speech Theresa May said:
“As governments, it is also right that we look at the legal liability that social media companies have for the content shared on their sites. The status quo is increasingly unsustainable as it becomes clear these platforms are no longer just passive hosts.”
If this was intended to question existing platform liability protections, it was a curious remark. Following the CJEU decisions in LVMH v Google France and L’Oreal v eBay, if a hosting platform treats user content non-neutrally it will not have liability protection for that content. By non-neutrally the CJEU means that the operator "plays an active role of such a kind as to give it knowledge of, or control over, those data".

So the status quo is that if a platform does not act neutrally as a passive host it is potentially exposed to legal liability.
By questioning the status quo did the Prime Minister mean to advocate greater protection for platforms that act non-neutrally than currently exists? In the febrile atmosphere that currently surrounds social media platforms that seems unlikely, but it could be the literal reading of her remarks. If not, is it possible that the government is taking aim at a phantom?
Matt Hancock's speech on 12 March added some detail:

"We are looking at the legal liability that social media companies have for the content shared on their sites. Because it’s a fact on the web that online platforms are no longer just passive hosts.
But this is not simply about applying publisher or broadcaster standards of liability to online platforms.
There are those who argue that every word on every platform should be the full legal responsibility of the platform. But then how could anyone ever let me post anything, even though I’m an extremely responsible adult?
This is new ground and we are exploring a range of ideas… including where we can tighten current rules to tackle illegal content online… and where platforms should still qualify for ‘host’ category protections."
It is debatable whether this is really new ground when these issues have been explored since the advent of bulletin boards and then the internet. Nevertheless there can be no doubt that the rise of social media platforms has sparked off a new round of debate.
  
Sectors, platforms and activities

The activities of platforms are often approached as if they constitute a homogeneous whole: the platform overall is either a passive host or it is not. Baroness Kidron, opening the House of Lords social media debate on 11 January 2018, went further, drawing an industry sector contrast between media companies and tech businesses:
“Amazon has set up a movie studio. Facebook has earmarked $1 billion to commission original content this year. YouTube has fully equipped studios in eight countries."
She went on:  

"The Twitter Moments strand exists to “organize and present compelling content”. Apple reviews every app submitted to its store, “based on a set of technical, content, and design criteria”. By any other frame of reference, this commissioning, editing and curating is for broadcasting or publishing.”
However the ECommerce Directive does not operate at a business sector level, nor at the level of a platform treated as a whole. It operates at the level of specific activities and items of content. If an online host starts to produce its own content like a media company, then it will not have the protection of the Directive for that activity. Nor will it have protection for user content that it selects and promotes so as to have control over it.  Conversely if a media or creative company starts to host user-generated content and treats it neutrally, it will have hosting protection for that activity.  

In this way the Directive adapts to changes in behaviour and operates across business models. It is technology-neutral and business sector-agnostic. A creative company that develops an online game or virtual world will have hosting protection for what users communicate to each other in-world and for what they make using the tools provided to them.
The line that the Directive draws is not between media and tech businesses, nor between simple and complex platforms, but at the fine-grained level of individual items of content. The question is always whether the host has intervened at the level of a particular item of content to the extent that, in the words of one academic[1], it might be understood to be its own. If it does that, then the platform will not have hosting protection for that item of content. It will still have protection for other items of user-generated content in relation to which it has remained neutral.  The scheme of the Directive is illustrated in this flowchart.


The analysis can be illustrated by an app such as one that an MP might provide for the use of constituents. Videos made by the MP would be his or her own content, not protected by the hosting provisions. If the app allows constituents to post comments to a forum, those would attract hosting protection. If the MP selected and promoted a comment as Constituent Comment of the Day, he or she would have intervened sufficiently to lose hosting protection for that comment.
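Purely as an illustration of the per-item analysis described above, the scheme can be sketched as a small decision function. The attribute names here are invented for this sketch and are not terms drawn from the Directive itself; real cases turn on a far more nuanced assessment than two boolean flags:

```python
from dataclasses import dataclass

@dataclass
class Item:
    # Hypothetical attributes for illustration only, not legal categories.
    created_by_operator: bool     # the operator's own content (e.g. the MP's videos)
    operator_has_control: bool    # selected/promoted so as to give knowledge or control

def hosting_protection_available(item: Item) -> bool:
    """Sketch of the per-item scheme: protection is assessed item by item,
    never for the platform as a whole."""
    if item.created_by_operator:
        return False  # own content: not hosting at all
    if item.operator_has_control:
        return False  # non-neutral treatment (cf. L'Oreal v eBay, para 116)
    return True       # neutral hosting: the shield is available

# The MP-app example from the text:
videos = Item(created_by_operator=True, operator_has_control=True)
comment = Item(created_by_operator=False, operator_has_control=False)
promoted = Item(created_by_operator=False, operator_has_control=True)

assert not hosting_protection_available(videos)    # MP's own videos
assert hosting_protection_available(comment)       # ordinary forum comment
assert not hosting_protection_available(promoted)  # Constituent Comment of the Day
```

The point the sketch makes is structural: the same operator gets different answers for different items of content, which is exactly why sector-level or platform-level characterisations miss the mark.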

This activity-based drawing of the line is not an accident. It was the declared intention of the promoters of the Directive. The European Commission said in its Proposal for the Directive back in 1998:
"The distinction as regards liability is not based on different categories of operators but on the specific types of activities undertaken by operators. The fact that a provider qualifies for an exemption from liability as regards a particular act does not provide him with an exemption for all his other activities." 
Courts in Ireland (Mulvaney v Betfair), the UK (Kaschke v Gray, England and Wales Cricket Board v Tixdaq) and France (TF1 v Dailymotion) have reached similar conclusions (albeit in Tixdaq only a provisional conclusion).  Most authoritatively, the CJEU in L'Oreal v eBay states that a host that has acted non-neutrally in relation to certain data cannot rely on the hosting protection in the case of those data (judgment, para [116] - and see flowchart above).

The report of the Committee on Standards in Public Life on "Intimidation in Public Life" also discussed hosting liability.  It said:
“Parliament should reconsider the balance of liability for social media content. This does not mean that the social media companies should be considered fully to be the publishers of the content on their sites. Nor should they be merely platforms, as social media companies use algorithms that analyse and select content on a number of unknown and commercially confidential factors.”
Analysing and selecting user content so as to give the operator control over the selected content would exclude that content from hosting protection under the ECommerce Directive. The Committee's suggestion that such activities should have a degree of protection short of full primary publisher liability would seem to involve increasing, not decreasing, existing liability protection. That is the opposite of what, earlier in the Report, the Committee seemed to envisage would be required: “The government should seek to legislate to shift the balance of liability for illegal content to the social media companies away from them being passive ‘platforms’ for illegal content.”

Simple and complex platforms
The question of whether a hosting platform has behaved non-neutrally in relation to any particular content is also unrelated to the simplicity or complexity of the platform. The Directive has been applied to vanilla web hosting and structured, indexed platforms alike.  That is consistent with the contextual background to the Directive, which included court decisions on bulletin boards (in some ways the forerunners of today’s social media sites) and the Swedish Bulletin Boards Act 1998.

The fact that the ECD encompasses simple and complex platforms alike leads to a final point: the perhaps underappreciated variety of activities that benefit from hosting protection.  They include, as we have seen, online games and virtual worlds. They would include collaborative software development environments such as GitHub. Cloud-based word-processing applications, website discussion forums and any kind of app with a user-generated content element would all be within scope. By focusing on activities defined in a technology-neutral way the Directive has transcended and adapted to many different evolving industries and kinds of business.
The voluntary sector

Nor should we forget the voluntary world. Community discussion forums are (subject to one possible reservation) protected by the hosting shield.  The reservation is that the ECD covers services of a kind ‘normally provided for remuneration’. The reason for this is that the ECD was an EU internal market Directive, based on the Services title of the TFEU. As such it had to be restricted to services with an economic element. 
In line with EU law on the topic the courts have interpreted this requirement generously. Nevertheless there remains a nagging doubt about the applicability of the protection to purely voluntary activities.  The government could do worse than consider removing the "normally provided for remuneration" requirement so that the Mumsnets, the sports fan forums, the community forums of every kind can clearly be brought within the hosting protection.

[Amended 28 July 2018 with comment added after Matt Hancock quotation and addition of hosting liability flowchart.]




[1]               C. Angelopoulos, 'On Online Platforms and the Commission’s New Proposal for a Directive on Copyright in the Digital Single Market' (January 2017).

Friday, 27 April 2018

The IPAct data retention regime lives on (but will have to change before long)


The High Court gave judgment this morning on Liberty’s challenge to the mandatory communications data retention provisions of the Investigatory Powers Act (IPAct). 

The big questions in the Liberty case were:
  • What does the government have to do to make the IPAct comply with EU law following the Tele2/Watson decision of the CJEU?
  • Has the government done enough in its proposed amendments to the IPAct, designed to address two admitted grounds of non-compliance with EU law?
  • When does it have to make changes?


In brief, the court has made a finding of non-compliance with EU law limited to the two grounds admitted by the government.  The court declared that Part 4 of the Investigatory Powers Act 2016 is incompatible with fundamental rights in EU law in that in the area of criminal justice:
(1) access to retained data is not limited to the purpose of combating “serious crime”; and
(2) access to retained data is not subject to prior review by a court or an independent administrative body.

As to timing to make changes, Liberty argued for no later than 31 July 2018 and the government for no earlier than 1 April 2019. The court decided that 1 November 2018 would be a reasonable time in which to amend the legal framework (albeit with a suggestion that practical implementation might take longer). In the meantime the existing IPAct data retention regime remains in effect, although lacking the two limitations and safeguards that have led to the admitted non-compliance with EU law.

The court observed, having noted that the question of appropriate remedy took the court into ‘deep constitutional waters’:
“… we are not prepared to contemplate the grant of any remedy which would have the effect, whether expressly or implicitly, of causing chaos and which would damage the public interest.
Nor do we consider that any coercive remedy is either necessary or appropriate. This is particularly so in a delicate constitutional context, where what is under challenge is primary legislation and where the Government proposes to introduce amending legislation which, although it will be in the form of secondary legislation rather than primary, will be placed before Parliament for the affirmative resolution procedure to be adopted.
On the other hand it would not be just or appropriate for the Court simply to give the Executive a carte blanche to take as long as it likes in order to secure compliance with EU law. The continuing incompatibility with EU law is something which needs to be remedied within a reasonable time. As long ago as July 2017 the Defendants conceded that the existing Act is incompatible with EU law in two respects.”

Turning to the main remaining grounds relied upon by Liberty:

1. Perhaps of greatest significance, the court rejected Liberty’s argument that the question of whether the legislation fell foul of the Tele2/Watson prohibition on general and indiscriminate retention of communications data should be referred to the CJEU. It noted a number of differences from the Swedish legislation considered in Tele2/Watson and concluded:

“In the light of this analysis of the structure and content of Part 4 of the 2016 Act, we do not think it could possibly be said that the legislation requires, or even permits, a general and indiscriminate retention of communications data. The legislation requires a range of factors to be taken into account and imposes controls to ensure that a decision to serve a retention notice satisfies (inter alia) the tests of necessity in relation to one of the statutory purposes, proportionality and public law principles.” The court declined to refer the point to the CJEU.

2. The question of whether national security is within the scope of the CJEU Watson decision was stayed pending the CJEU’s decision on the reference from the Investigatory Powers Tribunal in the Privacy International case. The court declined to make a reference to the CJEU in these proceedings.

3. Liberty argued that a ‘seriousness’ threshold should apply to all other objectives permitted under Article 15(1) of the EU ePrivacy Directive, not just to crime. The court held that other than for criminal offences the fact that national legislation does not impose a “seriousness” threshold on a permissible objective for requiring the retention of data (or access thereto) does not render that legislation incompatible with EU law and that necessity and proportionality were adequate safeguards. It declined to refer the point to the CJEU.

4. A highly technical point about whether the CJEU Watson decision applied to ‘entity data’ as defined in the IPAct, or only to ‘events data’, was resolved in favour of the government.

5. Liberty argued that retention purposes concerned with protecting public health, tax matters, and regulation of financial services/markets and financial stability should be declared incompatible. The court declined to grant a remedy since the government intends to remove those purposes anyway.

6. As to whether mandatorily retained data has to be held within the EU, the court stayed that part of the claim pending the CJEU’s decision in the IPT reference in the Privacy International case.

7. The part of the claim regarding notification of those whose data has been accessed was also stayed pending the CJEU’s decision in the IPT reference in the Privacy International case.

By way of background to the decision, the IPAct was the government’s replacement for DRIPA, the legislation that notoriously was rushed through Parliament in 4 days in July 2014 following the CJEU’s nullification of the EU Data Retention Directive in Digital Rights Ireland.

DRIPA expired on 31 December 2016. But even as the replacement IPAct provisions were being brought into force it was obvious that they would have to be amended to comply with EU law, following the CJEU decision in Tele2/Watson issued on 21 December 2016.

A year then passed before the government published a consultation on proposals to amend the IPAct, admitting that the IPAct was non-compliant with EU law on the two grounds of lack of limitation to serious crime and lack of independent prior review of access requests. 

That consultation closed on 18 January 2018. Today’s judgment noted the government’s confirmation that legislation is due to be considered by Parliament before the summer recess in July 2018.

In the consultation the government set out various proposals designed to comply with Tele2/Watson:

-         A new body (the Office of Communications Data Authorisations) would be set up to give prior independent approval of communications data requests. These have been running at over 500,000 a year.

-         Crime-related purposes for retaining or acquiring events data would be restricted to serious crime, albeit broadly defined.

-         Removal of retention and acquisition powers for public health, tax collection and regulation of financial markets or financial stability.

The government's proposals were underpinned by some key interpretations of Tele2/Watson. The government contended in the consultation that:

-         Tele2/Watson does not apply to national security, so that requests by MI5, MI6 and GCHQ would still be authorised internally. That remains an outstanding issue pending the Privacy International reference to the CJEU from the IPT.

-         The current notice-based data retention regime is not 'general and indiscriminate'. It considered that Tele2/Watson's requirement for objective targeted retention criteria could be met by requiring the Secretary of State to consider, when giving a retention notice to a telecommunications operator, factors such as whether restriction by geography or by excluding a group of customers are appropriate.  Today’s Liberty decision has found in the government’s favour on that point. Exclusion of national security apart, this is probably the most fundamental point of disagreement between the government and its critics.

-         Tele2/Watson applies to traffic data but not subscriber data (events data but not entity data, in the language of the Act). Today’s decision upholds the government’s position on that.

-         Tele2/Watson does not preclude access by the authorities to mandatorily retained data for some non-crime related purposes (such as public safety or preventing death, injury, or damage to someone's mental health). That was not an issue in today’s judgment.

As to notification, the government considered that the existing possibilities under the Act are sufficient. It also considered that Tele2/Watson did not intend to preclude transfers of mandatorily retained data outside the EU where an adequate level of protection exists. These remain outstanding issues pending the Privacy International reference to the CJEU from the IPT.


Sunday, 1 April 2018

It’s no laughing matter - the case for regulating humour


The fallout from the Count Dankula ‘Nazi pug’ video prosecution shows no sign of abating.  While many have condemned the conviction as an assault on freedom of speech, others are saying that the law does not go far enough.  They argue that the criminal law only catches these incidents after the event when the harm has already been done. How can we prevent the harm being done in the first place?

“It is like pollution”, said one commentator. “We apply the precautionary principle to environmental harm, and we should do the same to prevent the toxic effects of tasteless, offensive and unfunny jokes on the internet. Freedom of speech is paramount, but we must not let that get in the way of doing what is right for society.”

The internet has only exacerbated the problem, say government sources. “So-called jokes going viral on social media are a scourge of society. Social media platforms have the resources to weed this out. They must do more, but so must society. Of course we have no quarrel with occasional levity, but serious humour such as satire is too dangerous to be left to the unregulated private sector. We would like to see this addressed by a self-regulatory code of conduct, but we are ready to step in with legislation if necessary.”

One professional comedian said: “This reaches a crisis point on 1 April each year, when tens of thousands of self-styled humourists try their hand at a bit of amateur prankstering. Who do they think they are fooling? An unthinking quip can have devastating consequences for the poor, the vulnerable, and for society at large. This is no joke. Controversial humour should be in the hands of properly qualified and trained responsible professionals.”

An academic added: “Humour is a public good. You only have to look at the standard of jokes on the internet to realise that the market is, predictably, failing to supply quality humour. We are in a race to the bottom. Since humour can also have significant negative externalities, the case for regulation is overwhelming.”

So there appears to be a growing consensus. Will we see a professional corps of licensed comedians?  Will amateur jokers find themselves in jail? Has this blogger succeeded only in proving that parody should be left to those who know what they are doing? Only time will tell.