In a few months’ time, three years will have passed since the French Constitutional Council struck down the core provisions of the Loi Avia – France’s equivalent of the German NetzDG law – for incompatibility with fundamental rights. Although the controversy over the Loi Avia has passed into internet history, the Constitutional Council’s decision provides some instructive comparisons when we examine the UK’s Online Safety Bill.
As the Bill awaits
its House of Lords Committee debates, this is an opportune moment to cast our
minds back to the Loi Avia decision and see what lessons it may hold. Caution
is necessary in extrapolating from judgments on fundamental rights, since they are highly fact-specific; and when they do lay down principles they tend to leave cavernous room for future interpretation. Nevertheless,
the Loi Avia decision makes
uncomfortable reading for some core aspects of the Online Safety Bill.
Background
The key features of the Loi Avia were:
- For illegal CSEA and terrorism content, one-hour removal of content notified to an in-scope publisher or host by the administrative authority, on pain of one year’s imprisonment and a 250,000 euro fine.
The Constitutional Council’s objection was founded on the determination of illegality being at the sole discretion of the administrative authority. This provision has no direct parallel in the Online Safety Bill. However, similar considerations could come into play should an Ofcom Code of Practice recommend giving state agencies some kind of trusted flagger status.
- For content contravening specified hate-related, genocide-related, sexual harassment and child pornography laws, 24-hour removal of manifestly illegal content following notification by any person to an in-scope platform operator, under penalty of a fine of 250,000 euros.
The Online Safety Bill analogue is a reactive ‘swift take down’ duty on becoming aware of in-scope illegal content. Unlike the Loi Avia, the Bill also imposes proactive prevention duties.
The Online Safety
Bill imposes duties for both illegal content and legal content harmful to
children. Since the Loi Avia concerned only illegal content, the Constitutional
Council did not have to consider obligations relating to ‘legal but harmful’
content of any kind, whether for adults or children.
Lesson 1: The rule of law comes first
The tests that the
Constitutional Council applied to the Loi Avia – legality, necessity and
proportionality – are components of the European Convention on Human Rights,
with which the Online Safety Bill must comply.
Along the obstacle course of human rights compatibility, the first hurdle is legality: known in the ECHR as the “prescribed by law” test. In short, a law must have the quality of law to qualify as law. If the law does not enable someone to foresee with reasonable certainty whether their proposed conduct is liable to be affected as a consequence of the law, it falls at that first hurdle. If legislation will result in arbitrary or capricious decisions – for example through vagueness or the grant of excessive discretion – it lacks the essential quality of law.
The problem with vagueness
was spelt out by the House of Lords in R v Rimmington, citing the US
case of Grayned:
"Vagueness
offends several important values … A vague law impermissibly delegates basic
policy matters to policemen, judges and juries for resolution on an ad hoc and
subjective basis, with the attendant dangers of arbitrary and discriminatory
application."
Whilst most often
applied to criminal liability, the legality objection has also been described
as a constitutional principle that underpins the rule of law generally. Lord
Diplock referred to it in a 1975 civil case (Black-Clawson):
"The
acceptance of the rule of law as a constitutional principle requires that a
citizen, before committing himself to any course of action, should be able to
know in advance what are the legal consequences that will flow from it."
The French Constitutional
Council held that the Loi Avia failed the legality test in one respect. The Loi
provided that the intentional element of the offence of failure to remove
content notified by any person could arise from absence of a “proportionate and
necessary examination of the notified content”. The Constitutional Council
found that if this was intended to provide a defence for platform operators, it
was not drafted in terms that allowed its scope to be determined. In other
words, a defence (if that is what it was) of having carried out a proportionate
and necessary examination was too vague to pass the legality test.
The Online Safety Bill differs from the Loi Avia. It does not impose criminal liability on a platform for failure to take down a particular item of user content. Enforcement by the appointed regulator, Ofcom, is aimed at systematic failures to fulfil duties rather than at individual content decisions. Nevertheless, the Bill is liberally sprinkled with references to proportionality – similar language to that which the French Constitutional Council held was too vague. It typically couches platform and search engine duties as an obligation to use proportionate systems and processes designed to achieve a stipulated result.
It is open to question
whether compliance with the legality principle can be achieved simply by
inserting ‘proportionate’ into a broadly stated legal duty, instead of grasping
the nettle of articulating a more concrete obligation that would enable the
proportionality of the interference with fundamental rights to be assessed by a
court.
The government’s
ECHR Memorandum seeks to head off any objection along these lines by stressing
the higher degree of certainty that it expects would be achieved when Ofcom’s
Codes of Practice have been laid before Parliament and come into effect. Even if that does the trick, it is another matter whether
it is desirable to grant that amount of discretion over individual speech to a
regulator such as Ofcom.
For the Online
Safety Bill the main relevance of the legality hurdle is to the freedom of
expression rights of individual users. Can a user foresee with reasonable certainty whether their proposed communication is liable to be affected as a
result of a platform or search engine seeking to fulfil a safety duty imposed
by the legislation? The Bill requires those online intermediaries to play detective,
judge and bailiff. Interpolation of an online intermediary into the process of
adjudging and sanctioning user content is capable of introducing arbitrariness that is not present when the same offence is prosecuted through the courts, with their
attendant due process protections.
In the case of the
Online Safety Bill, arbitrariness is a real prospect. That is largely because of
the kinds of offences on which platforms and search engines are required to
adjudicate, the limited information available to them, and the standard to which they have to be satisfied that the user content is illegal.
Lesson 2: Beyond ‘manifestly illegal’
An intriguing feature of the Constitutional Council decision is that although the Loi Avia prescribed, on the face of it, a high threshold for removal of illegal content – manifest illegality – that was not enough to save the legislation from unconstitutionality. ‘Manifestly illegal’ is a more stringent test than the ‘reasonable grounds to infer’ threshold prescribed by the Online Safety Bill.
The Loi Avia required removal of manifestly illegal user content within 24 hours of receiving, from anyone, a notification giving the notifier’s identity and the location of the content, and specifying the legal grounds on which the content was said to be manifestly illegal.
The Constitutional
Council observed that the legislation required the operator to examine all
content reported to it, however numerous the reports, so as not to risk being
penalised. Moreover, once reported the platform had to consider not only the
specific grounds on which the content was reported, but all offences within the
scope of the legislation – even though some might present legal technicalities
or call for an assessment of context. These issues were especially significant
in the light of the 24-hour removal deadline and the criminal penalty for each
failure to withdraw.
In the Constitutional
Council’s view the consequence of these provisions, taking into account also the
absence of any clearly specified defence to liability, was that operators could
only be encouraged to withdraw content reported to them, whether or not it was
manifestly illegal. That was not necessary, appropriate or proportionate and so
was unconstitutional.
The Online Safety
Bill does not prescribe specific time limits, but requires swift removal of
user content upon the platform becoming aware of in-scope illegality. As with
the Loi Avia, that applies to all in-scope offences.
The touchstone for
assessment of illegality under the Bill is reasonable grounds to infer
illegality, on the basis of all information reasonably available to the
platform. Unless that threshold is surmounted, the platform does not have to remove the content. If it is surmounted, it must do so swiftly.
At least in the case
of automated proactive monitoring and filtering, the available information will
be minimal – the users’ posts themselves and whatever the system knows about the relevant users. As a consequence, the
decisions required to be made for many kinds of offence – especially those
dependent on context – will inevitably be arbitrary. Moreover, a platform has to
ignore the possibility of a defence unless it has something from which it can
infer on reasonable grounds that a defence may succeed.
Whilst the Online
Safety Bill lacks the Loi Avia’s chilling sword of Damocles of short prescriptive
deadlines and automatic criminal liability for failure to remove, the reason why
those factors (among others) were legally significant was their effect on the freedom
of expression of users: the likely over-removal of lawful user content. The Online
Safety Bill’s lower threshold for adjudging illegality, combined with the
requirement to make those judgments in a relative information vacuum – often at scale and speed – does more than just encourage takedown of legal user content:
it requires it.
Lesson 3: The lens of prior restraint
The briefly glimpsed
elephant in the room of the Loi Avia decision is prior restraint. The Constitutional
Council alluded to it when it remarked that the removal obligations were not subject to the prior intervention of a judge, nor to any other condition.
Legislation
requiring a platform summarily to adjudge the legality of individual items of
user content at speed and at scale bears the hallmarks of prior restraint:
removal prior to full adjudication on the merits after argument and evidence.
Prior restraint is
not impermissible. It does require the most stringent scrutiny and
circumscription, in which the risk of removal of legal content will loom large.
The ECtHR in Yildirim considered an interim court order blocking Google
Sites. It characterised that as a prior
restraint, and observed: “the dangers inherent in prior restraints are such
that they call for the most careful scrutiny on the part of the Court”.
The ECtHR in Animal Defenders v UK distinguished a prior restraint imposed on an individual
act of expression from general measures: in that case a ban on broadcasting
political advertising.
If an individual item is ultimately removed pursuant to a general measure, that does not prevent the action being characterised as a prior restraint. If it did, the doctrine could not
be applied to courts issuing interim injunctions. The fact that the Online
Safety Bill does not penalise a platform for getting an individual decision
wrong does not disguise the fact that the required task is to make judgments
about individual items of user content constituting individual acts of
expression.
The appropriateness of
categorising at least proactive detection and filtering obligations as a form
of prior restraint is reinforced by the CJEU decision in Poland v The European Parliament and Council, which applied Yildirim to those
kinds of provisions in the context of copyright.
Lesson 4: Context, context, context
The Constitutional Council
pointed out the need to assess context for some offences. That is all the more
significant for the Online Safety Bill, for several reasons.
First, unlike the Loi Avia, the Online Safety Bill imposes proactive, not just reactive, duties.
That will multiply the volume of user content to be assessed, in many cases
requiring the deployment of automated content monitoring. Such systems, by
their very nature, can be aware only of content flowing through the system and
not of any external context.
Second, the Bill
requires illegality assessments to be made ignoring external contextual
information unless it is reasonably available to the platform.
Third, defences such
as reasonableness will often be inherently contextual. The Bill, however,
enables the intermediary to take account of the possibility of a defence only
if it has information on the basis of which it can infer that a defence may
successfully be relied upon.
Lesson 5: Proactive duties
The Loi Avia
decision was about reactive duties based on notification. Proactive illegality duties
present inherently greater human rights challenges. A less prescriptive, less
draconian reactive regime, combined with a ‘manifest illegality’ standard and
greater due process safeguards, might possibly have survived. But if the
starting point is aversion to a regime that encourages takedown of legal
user content, it is difficult to see how a regime that carries a certainty of
over-takedown, as do the Online Safety Bill’s proactive illegality duties, could
pass muster.
What is to be done?
Raising the Online Safety Bill’s standard of assessment from reasonable grounds to infer to manifest illegality would improve its prospects of human rights compliance. But that still leaves the problem of the assessment having to be made in ignorance of external context; and the problem of the possibility of a defence being discounted unless it is apparent from the information flowing through the system. Those more intractable issues call into question the kinds of offences that platforms and search engines could be called upon to adjudge.