The abiding impression left by the government’s Initial Response
to the Online Harms White Paper Consultation is that it is half-finished. Non-conclusions
and conclusions sit side by side. Significant (or are they significant?) textual
departures from the White Paper are left unexplained. Variations on the
same text turn up in different parts of the document. Policy decisions pre-leaked
to the press (notably fines on directors) have been deferred for further
consideration. All in all, the response has a 'Greener' feel than the already
'Greenish' White Paper.
The government goes a long way towards admitting this:
“This document forms an iterative part of the policy development process. We are committed to taking a deliberative and open approach to ensure that we get the detail of this complex and novel policy right. While it does not provide a detailed update on all policy proposals, it does give an indication of our direction of travel in a number of key areas raised as overarching concern across some responses.
In particular, … written responses and our engagement highlighted questions over a number of areas, including freedom of expression and the businesses in scope of the duty of care. Having carefully considered the information gained during this process, we have made a number of developments to our policies. …[W]e will continue to engage with users, industry and civil society as we continue to refine our policies ahead of publication of the full policy response.”
A full response is expected in the Spring.
Fundamentally, however, the government has stuck to its guns: what is required in order to control what we say on the internet is broadcast-style regulation by regulator. (It is minded to appoint Ofcom to that role.)
It even took a slightly pained pop at those individuals who had the temerity to respond to the consultation by taking issue with the whole enterprise, rather than obediently answering each point of detail in turn.
"A notable number of
individual respondents to the written consultation disagreed with the overall
proposals set out in the White Paper. Those respondents often seemed not to
engage with the substance of questions on the specific proposals, but instead
reiterated a general disagreement with the overall approach.”
Rather than suffer the ignominy of creating a section of
the response entitled 'Fundamental Objections', the government contented itself
with labelling the objectors' responses as confused:
“This was most notable in those
questions on regulatory advice, proportionality, the identity and funding of
the regulator, innovation and safety by design, which seemed to attract a
relatively large amount of confusion in responses. For these respondents, it
was therefore difficult to delineate between an objection to the overall regime
and an objection to the specific proposal within the question.”
On the other hand the government has, at least on the face
of it, made one significant policy change (or ‘development’, as it would have
it) by stating that, for lawful but potentially harmful content, intermediaries in
scope will have only to state clearly and accessibly what is allowed on their
platforms, then consistently and transparently enforce that policy.
Whether that version of the duty of care will necessarily turn out to be all that it seems is explored below. But in any event it applies
only to adult users. That seems to presuppose a system of age verification, if
the intermediary is to avoid the more onerous duties that would apply to lawful
content that could be seen by children. How that might operate, and how it
would mesh with the Information Commissioner’s Age Appropriate Design Code, is
anyone’s guess at present.
In the light of this distinction it would be
important that the line between unlawful and lawful content be
well understood, and that platforms draft their content policies in ways that
distinguish clearly between permitted and prohibited material. Those are
both significant challenges.
As to the first, assessing legality is rarely a question
of inspecting an item of content alone without an understanding of the factual context.
A court assesses evidence according to a standard of proof: balance of probabilities
for civil liability, beyond reasonable doubt for criminal. When it has
established the facts it has to make a decision on lawfulness.
Would the same process apply to the duty of care? Or would
the mere potential for illegality trigger the ‘unlawfulness’ duty of care, with
its accompanying obligation to remove user content? Over two years after the Internet Safety Green
Paper, and the best part of a year after the White Paper, the consultation
response contains no indication that the government recognises the existence of
this issue, let alone has started to grapple with it.
As for clarity of content policies, the task of framing speech laws
that pass the rule of law test of reasonable clarity and precision is no mean
one. A vague rule leads inevitably to arbitrary enforcement. No amount of
transparency and appeal procedures can remove the arbitrariness.
Preliminaries aside, what is the current state of the
government’s thinking? For a first take on the response, see here. This post takes
a microscope to some of the main topics, comparing the White Paper text with
that in the Response.
Scope of the duty of care
Scope has two aspects: (1) What kinds of activity or
organisation are in scope? (2) What kinds of harm are in scope? Both have an inclusive
and an exclusive element: (a) What falls within the general definition? (b) Is
anything that would otherwise fall within that definition specifically
excluded?
Activities/organisation – general definition
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| 4 | “companies that provide services or tools that allow, enable or facilitate users to share or discover user-generated content, or interact with each other online.” (Summary) | 7 | Our Response, Businesses in scope: “companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example though comments, forums or video sharing” |
| 4.1 | “companies that allow users to share or discover user-generated content, or interact with each other online.” | 9 | Regulatory framework: “companies that provide services which facilitate the sharing of user generated content or user interactions, for example though comments, forums or video sharing” |
| 4.2 | Main types of relevant service: hosting, sharing and discovery of user-generated content (e.g. a post on a public forum or the sharing of a video); facilitation of public and private online interaction between service users (e.g. instant messaging or comments on posts). | | |
| 4.5 | Those providing ancillary services such as caching. | | See Activities/organisation – exclusions, below. |
| 4.3 | “social media companies, public discussion forums, retailers that allow users to review products online, along with non-profit organisations, file sharing sites and cloud hosting providers.” | 8 | Our Response, Businesses in scope: “To be in scope, a business would have to operate its own website with the functionality to enable sharing of user-generated content, or user interactions.” “Just because a business has a social media page that does not bring it within scope of regulation.” “It would be the social media platform hosting the content that is in scope, not the business using its services to advertise or promote their company.” |
| | | 10 | Regulatory framework: “To be in scope, a business’s own website would need to provide functionalities that enable sharing of user generated content or user interactions.” |
| | Messaging services and search engines (Executive Summary) | | |
| | | 7 | Our Response, Businesses in scope: “To ensure clarity, guidance will be provided by the regulator to help businesses understand whether or not the services they provide or functionality contained on their website would fall into the scope of the regulation.” |
| | | 9 | Regulatory framework: “To ensure clarity, guidance would be provided to help businesses understand whether or not the services they provide would fall into the scope of the regulation.” |
What has changed?
- The White Paper’s reference to ‘discover’ is omitted. The Response makes no reference to search engines. The inference is that search engines are now out of scope.
- ‘or tools’ is also omitted. The Response is silent as to the significance of this, although some narrowing of scope must be intended. This might exclude some kinds of ancillary service that were originally intended to be included. That impression is reinforced by the specific exclusion of ‘virtual infrastructure’ providers (see next section).
- ‘allow, enable’ is also omitted. Since what is left is ‘facilitate’, it is difficult to see that this results in any narrowing of scope.
- ‘interact with each other online’ is replaced with ‘user interactions’. The Response is silent as to the significance of this. Perhaps the intention is to make clear that online interactions that lead to or are combined with real world meetings or other interactions are in scope.
What has not changed?
- Application to comments, forums and content sharing of all kinds, when provided by companies. Retailers with review sections would still be in scope. Although the Response specifically mentions websites, the scope does not appear to be limited to web operations. As before it could include apps, online games and other online services.
- Private messaging services. These remain potentially in scope. The Response describes the submissions received on the questions of what should be regarded as private communications, and which private channels or forums should be in scope. It provides no policy conclusion. Overall, respondents to the written consultation opposed the inclusion in scope of private communication services.
- The press. Comments sections on newspaper websites remain caught by the general definition. Also, newspapers are themselves users of social media. They have Facebook pages and Twitter accounts, with links to their own websites. As such, their own content is liable to be affected by a social media platform taking action to suppress user content in performance of its duty of care. As to whether the Response provides a specific exclusion for the press or journalism, see next section.
- Whilst a business with a social media page would not itself come under the duty of care, the content on its social media page could be affected by actions taken by the platform under the platform’s own duty of care.

Activities/organisation – exclusions
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| | | | Ministerial Foreword: “Business to business services, which provide virtual infrastructure to businesses for storing and sharing content, will not have requirements placed on them.” |
| | | 9 | Executive Summary: “Business-to-business services have very limited opportunities to prevent harm occurring to individuals and as such will be out of scope of regulation.” |
| | | 11 | Our Response, Businesses in scope: “It is clear that business-to-business services have very limited opportunities to prevent harm occurring to individuals and as such remain out of scope of the Duty of Care.” |
| | Post-White Paper. Letter from Secretary of State to Society of Editors: “… as I made clear at the White Paper launch and in the House of Commons, where these services are already well regulated, as IPSO and IMPRESS do regarding their members' moderated comment sections, we will not duplicate those efforts.” | | |
What has changed?
- The new business-to-business exclusion is significant, especially since the Ministerial Foreword to the Response stresses that this includes providers of virtual infrastructure to businesses. Clarification may be needed as to whether this would exclude provision of such services to organisations such as charities, not-for-profit organisations, political parties and so on.
What has not changed?
- Press and journalism remain in scope. Although the Response addresses freedom of expression (see below), it makes no proposal for carving out the press or journalism from the scope of the duty of care. Press and journalistic material on platforms (such as Facebook pages and Twitter feeds) will be within the scope of the platforms’ duty of care.

Harm – general definition
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| 7 | Executive Summary: “content or activity that harms individual users, particularly children, or threatens our way of life in the UK, either by undermining national security, or by undermining our shared rights, responsibilities and opportunities to foster integration.” | 2 | The harms in scope: “online content or activity that harms individual users, particularly children, or threatens our way of life in the UK, either by undermining national security, or by reducing trust and undermining our shared rights, responsibilities and opportunities to foster integration.” |
| 2.2 | “This list is, by design, neither exhaustive nor fixed. A static list could prevent swift regulatory action to address new forms of online harm, new technologies, content and new online activities.” | 15 | Our Response – Activities and organisations in scope: “While the White Paper was clear that the list of harms provided was not intended to be exhaustive or definitive, a number of organisations suggested specific harms…” |
What has changed?
- Nothing.
What has not changed?
- The Response contains no comment on the description of harm in the White Paper, other than confirming that the list of harms set out in the White Paper was not intended to be exhaustive or definitive. There is no indication in the Response of an intent to adopt a more limited, or indeed any, general definition of harm in the legislation.
- The Response records the concerns of consultees about the scope of harm and the inclusion of legal ‘harms’, but does not come to a conclusion:

“17. Many respondents expressed concerns around the potential for the scope of the regulator to be too broad or for it to have an adverse impact on freedom of expression. Many of these respondents, therefore, called for further clarification of services and harms in scope.”

“20. At the same time, almost all industry respondents asked for greater clarity about definitions of harms, and highlighted the subjectivity inherent in identifying many of the harms, especially those which are legal. The majority of respondents objected to the latter being in scope.”

Harm – exclusions
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| 7 | Executive Summary: | | |
| 2.4 | Excluded from scope: “All harms to organisations, such as companies, as opposed to harms suffered by individuals. This excludes harms relating to most aspects of competition law, most cases of intellectual property violation, and the organisational response to many cases of fraudulent activity.” “All harms suffered by individuals that result directly from a breach of the data protection legislation, including distress arising from intrusion, harm from unfair processing, and any financial losses.” “All harms suffered by individuals resulting directly from a breach of cyber security or hacking.” “all harms suffered by individuals on the dark web rather than the open internet” | | |
What has changed?
- Nothing.
What has not changed?
- The Response contains no comment on the exclusions in the White Paper.
- The Response records the suggestion of some consultees that the kinds of harm in scope should be broadened, but does not come to a conclusion:

“15. A number of organisations suggested that economic harms (for instance, fraud) should be in scope.”

As to economic harms, nothing in the White Paper’s general description of harms or in its list of specific exclusions appears to exclude economic harms as such. It excludes harms suffered by organisations, but that does not appear to exclude harms suffered by individuals as the result of fraud. Similarly, infringement of intellectual property rights owned by individuals appears to be in scope.

Freedom of expression and journalism
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| | Ministerial foreword: “The UK is committed to a free, open and secure internet, and will continue to protect freedom of expression online.” | | Ministerial foreword: “…freedom of expression, and the role of a free press, is vital to a healthy democracy. We will ensure that there are safeguards in the legislation, so companies and the new regulator have a clear responsibility to protect users’ rights online, including freedom of expression and the need to maintain a vibrant and diverse public square.” |
| 36 | Executive Summary: “The regulator will have a legal duty to pay due regard to innovation, and to protect users’ rights online, taking particular care not to infringe privacy or freedom of expression. We are clear that the regulator will not be responsible for policing truth and accuracy online.” | 1 | Our Response – freedom of expression: “Safeguards for freedom of expression have been built in throughout the framework. Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms, while maintaining a proportionate and risk-based approach.” |
| 5 | “The regulator will have a legal duty to pay due regard to innovation, and to protect users’ rights online, being particularly mindful to not infringe privacy and freedom of expression.” | | |
| 5.12 | “The regulator will also have an obligation to protect users’ rights online, particularly rights to privacy and freedom of expression. It will ensure that the new regulatory requirements do not lead to a disproportionately risk averse response from companies that unduly limits freedom of expression, including by limiting participation in public debate. Its regulatory action will be required to be fair, reasonable and transparent.” | | |
| | | 2 | Our Response – freedom of expression: “To ensure protections for freedom of expression, regulation will establish differentiated expectations on companies for illegal content and activity, versus conduct that is not illegal but has the potential to cause harm. Regulation will therefore not force companies to remove specific pieces of legal content. The new regulatory framework will instead require companies, where relevant, to explicitly state what content and behaviour they deem to be acceptable on their sites and enforce this consistently and transparently. All companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content.” |
| | | 4 | Our Response – freedom of expression: “Recognising concerns about freedom of expression, the regulator will not investigate or adjudicate on individual complaints. Companies will be able to decide what type of legal content or behaviour is acceptable on their services, but must take reasonable steps to protect children from harm. They will need to set this out in clear and accessible terms and conditions and enforce these effectively, consistently and transparently. The proposed approach will improve transparency for users about which content is and is not acceptable on different platforms, and will enhance users’ ability to challenge removal of content where this occurs.” |
| | | 16 | Our Response – Ensuring that the regulator acts proportionately: “The consideration of freedom of expression is at the heart of our policy development, and we will ensure that appropriate safeguards are included throughout the legislation. By taking action to address harmful online behaviours, we are confident that our approach will support more people to enjoy their right to freedom of expression and participate in online discussions.” |
| | Post-White Paper. Letter from Secretary of State to Society of Editors: “Journalistic or editorial content will not be affected by the regulatory framework.” | | |
What has changed?
- As regards scope, nothing. Journalistic and editorial content remain in scope. The Response records the views of some consultees:

“14. Press freedom organisations and media actors also expressed the view that journalistic content should not be in scope, to protect freedom of expression and in accordance with established conventions of press regulation.”
- As regards freedom of expression, the Response records the views of some consultees on scope:

“15. Many civil society organisations also raised concerns about the inclusion of harms which are harder to identify, such as disinformation, citing concerns of the impact this could have on freedom of expression.”

The government’s Response on freedom of expression contains perhaps the most significant policy development compared with the White Paper. Although a close examination of the content of each of the Codes of Practice proposed in the White Paper might have suggested a leaning towards restricting removal and filtering to illegal, as opposed to legal but harmful, content, there was no clear commitment to a different approach for the two categories. That has now changed, with the proposal for a differentiated duty of care.

This change, with its emphasis on the freedom of intermediaries to decide what lawful content to permit adults to see, resembles some aspects of the submissions made by the LSE’s Damian Tambini, “Reducing Online Harms through a Differentiated Duty of Care: A Response to the Online Harms White Paper” (June 2019) and, perhaps to a lesser extent, by Mark Bunting of Communications Chambers (July 2019).

However, is this policy development all that it appears to be? Two points spring to mind.

First, the intermediaries’ freedom is restricted to content seen by adults. But at the same time “All companies in scope will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content.” If there is any possibility that the audience includes children (which, under UK law, includes anyone under the age of 18), does that trigger stricter duty of care requirements? If so, does that render illusory the apparent freedom to decide on content? The Response says:
“Under our proposals we expect companies to use a proportionate range of tools including age assurance, and age verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms.”
Would non-compliance with age verification standards laid down by the regulator automatically disapply the intermediary’s freedom to determine what is permissible on its platform?

Second, what would be the regulator’s supervisory remit? The two versions of the text in the Response refer to the intermediary having to enforce its terms and conditions “consistently and transparently” and “effectively, consistently and transparently”. Would (as at any rate the first version appears to suggest) the regulator’s remit be strictly limited to assessment by reference to the content standards set in the intermediary’s own terms and conditions? Or could the regulator go beyond that and assess effectiveness in reducing harm?

If the latter, then the regulator could enter the realm of (for instance) requiring algorithms to be designed so as to make particular kinds of lawful content less readily accessible to users, recommendation algorithms to be tweaked, dissemination tools to be modified and so on – all while still adhering to the letter of the government’s commitment that the platform has freedom to decide what material is permitted on its platform. This would take the role of the regulator more into the territory envisaged by the Carnegie UK Trust’s proposals.

This question is especially relevant when we consider that the Response signals a shift away from a series of Codes of Practice grouped by reference to different kinds of harm – “We do not expect there to be a code of practice for each category of harmful content.” Instead, “Rather than requiring the removal of specific pieces of legal content, regulation will focus on the wider systems and processes that platforms have in place to deal with online harms.” Is that restricted to matters such as user redress and reporting mechanisms, discussed in the Response, or might it go further into algorithms and user tools?

This is against the background that nothing in the government’s Response is set in stone. It is indicative of a direction of travel, and forms part of a process of iterative policy development. There is no guarantee that the direction of travel will not change, or go into reverse, as the process continues.

Illegal content
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| 32 | Executive Summary: “Every company within scope will need to fulfil their duty of care, particularly to counter illegal content and activity” | | |
| 41 | Executive Summary: The new regulatory framework will increase the responsibility of online services in a way that is compatible with the EU’s e-Commerce Directive, which limits their liability for illegal content until they have knowledge of its existence, and have failed to remove it from their services in good time. | | |
| | | 3 | Our Response – freedom of expression: “Services in scope of the regulation will need to ensure that illegal content is removed expeditiously and that the risk of it appearing is minimised by effective systems. Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.” |
| | | 5 | Regulatory Framework: “In-scope services will need to ensure that illegal content is removed expeditiously and that the risk of it appearing is minimised by effective systems. Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.” |
What has changed (or not)?
- Proactive prevention was mentioned in a number of the indicative topic lists for content-specific Codes of Practice in the White Paper.
- The Response does not mention the e-Commerce Directive.

User redress mechanisms
| ¶ | White Paper | ¶ | Response |
| --- | --- | --- | --- |
| 3.26 | “To fulfil the new duty of care, we will expect companies, where appropriate, to have an effective and easy-to-access complaints function, allowing users to raise either concerns about specific pieces of harmful content or activity, or wider concerns that the company has breached its duty of care. Users should receive timely, clear and transparent responses to their complaints, and there must be an internal appeals function.” | | |
| | | 5 | Our Response – Freedom of expression: “Companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. This will give users clearer, more effective and more accessible avenues to question content takedown, which is an important safeguard for the right to freedom of expression. These processes will need to be transparent, in line with terms and conditions, and consistently applied.” |
| | | 26 | Regulatory framework – User redress: “Recognising concerns about freedom of expression, while the regulator will not investigate or adjudicate on individual complaints, companies will be required to have effective and proportionate user redress mechanisms which will enable users to report harmful content and to challenge content takedown where necessary. This will give users clearer, more effective and more accessible avenues to question content takedown, which is an important safeguard for the right to freedom of expression. These processes will need to be transparent, in line with terms and conditions, and consistently applied.” |
- No substantive change