Part 3 of a short series of reflections on Ofcom’s Illegal Harms consultation under the Online Safety Act 2023 (OSA).
This post analyses the illegality duties created by the OSA. That is not as simple as one might hope. The intrepid reader has to hack their way through a thicket of separately defined, subtly differing, duties. Meanwhile they must try to cope with a proliferation of competing public narratives about what the Act is - or ought to be - doing, such as the suggestion that the regime is really about systems and processes, not content.
The specific safety duties imposed by the OSA are the logical starting point for an analysis of what the Act requires of service providers. I have categorised the duties in a way which, it is hoped, will illuminate how Ofcom has approached the consultation.
At the highest level the Act’s illegality duties fall into two groups: substantive duties and risk assessment duties. The substantive duties may require the service provider to take measures affecting the design or operation of the service. The outcomes of the obligatory risk assessment feed into some of the substantive duties.
In my proposed categorisation the substantive service provider duties fall into three categories: content-based, non-content-based and harm-based. A content-based duty is framed solely by reference to illegal content, with no mention of harm. A non-content-based duty makes no mention of either content or harm, but may refer to illegal activity. A harm-based duty is framed by reference to harm arising from or caused by illegal content or activity.
The tables below divide the various substantive duties into these categories.
The risk assessment duties are a more extensive and complex mixture of duties framed by reference to illegal content, illegal activities and harm.
The differences in framing are significant, particularly for the substantive duties, since they dictate the kinds of measures that could satisfy the different kinds of duty. For instance:
An operational substantive content-based duty is focused on identifying and addressing individual items of illegal content, for example by reactively taking them down. Recommended measures have to reflect that. (But see the discussion below of the difference between a design duty and an operational duty in the context of a proactive, preventive duty.)
A substantive duty to mitigate risk of harm is less focused. A recommended measure would not necessarily involve evaluating the illegality of content. It could consist of a measure that either does not affect content at all, or does so content-agnostically. A reporting button would be an example of such a measure, as would an across-the-board cap on the permissible number of reposts.
But a recommended measure within this category might yet have content-related aspects. It could be suggested, for instance, that in order to mitigate the risk of harm arising from a particular kind of illegal content the provider should identify content of that kind and take recommended steps in relation to it.
Similarly a risk assessment duty could involve the service provider in evaluating individual items of content, even if the duty ranges more widely:
“Your risk assessment should not be limited to an assessment of individual pieces of content, but rather consider how your service is used overall. In particular, the requirement to assess the risk of the service being used to commit or facilitate priority offences may mean considering a range of content and behaviour that may not amount to illegal content by itself.” [Service Risk Assessment Guidance, Boxout preceding para A5.42.]
It can be seen in the second table below that the harm-based risk assessment duties are further divided into content-related and non-content-related. The former all make some reference to illegal content in the context of a harm-based duty.
Within the framework of the Act, not all illegality is necessarily harmful (in the Act's defined sense) and not all kinds of harm are in scope. There is a conceptual separation between illegality per se, illegality that causes (or risks causing) harm as defined, and harm that may arise otherwise than from illegality. Those category distinctions have to be borne in mind when analysing the illegality duties imposed by the Act. It bears repeating that when the Act refers to harm in connection with service provider duties it has a specific, limited meaning: physical or psychological harm.
So equipped, we can categorise and tabulate the illegality duties under the Act. For simplicity the tables show only the duties applicable to all U2U service providers, starting with the substantive duties:
| Category | Description | OSA section reference |
| --- | --- | --- |
| Substantive duties | | |
| Content-based | Proportionate measures relating to design or operation of service to prevent individuals encountering priority illegal content by means of the service. | 10(2)(a) |
| | Operate service using proportionate systems and processes designed to minimise the time for which priority illegal content is present. | 10(3)(a) |
| | Operate service using proportionate systems and processes designed to swiftly take down illegal content where the provider is alerted to, or otherwise becomes aware of, its presence. | 10(3)(b) |
| Non-content-based | Proportionate measures relating to design or operation of service to effectively mitigate and manage the risk of the service being used for the commission or facilitation of a priority offence identified in the most recent illegal content risk assessment. | 10(2)(b) |
| Harm-based | Proportionate measures relating to design or operation of service to effectively mitigate and manage the risks of harm to individuals identified in the most recent illegal content risk assessment (see 9(5)(g)). | 10(2)(c) |
These are the illegality risk assessment duties:
| Category | Description | OSA section reference |
| --- | --- | --- |
| Risk assessment duties | | |
| Content-based | Assess level of risk of users encountering illegal content of each kind | 9(5)(b) |
| | Assess level of risk of functionalities facilitating presence or dissemination of illegal content | 9(5)(e) |
| Non-content-based | Assess level of risk of service being used for commission or facilitation of a priority offence | 9(5)(c) |
| | Assess level of risk of functionalities facilitating use of service for commission or facilitation of a priority offence | 9(5)(e) |
| | Assess how the design and operation of the service may reduce or increase the risks identified | 9(5)(h) |
| Harm-based (content-related) | Assess level of risk of harm to individuals presented by illegal content of different kinds | 9(5)(d) |
| | Assess nature and severity of harm that might be suffered by individuals from identified risk of harm to individuals presented by illegal content of different kinds | 9(5)(d) and (g) |
| | Assess nature and severity of harm that might be suffered by individuals from identified risk of individual users encountering illegal content | 9(5)(b) and (g) |
| | Assess nature and severity of harm that might be suffered by individuals from identified risk of functionalities facilitating presence or dissemination of illegal content | 9(5)(e) and (g) |
| Harm-based (non-content-related) | Assess level of risk of harm to individuals presented by use of service for commission or facilitation of a priority offence | 9(5)(d) |
| | Assess nature and severity of harm that might be suffered by individuals from identified level of risk of harm to individuals presented by use of service for commission or facilitation of a priority offence | 9(5)(d) and (g) |
| | Assess nature and severity of harm that might be suffered by individuals from identified risk of service being used for commission or facilitation of a priority offence | 9(5)(c) and (g) |
| | Assess different ways in which service is used and impact of such use on level of risk of harm that might be suffered by individuals | 9(5)(f) |
| | Assess nature and severity of harm that might be suffered by individuals from identified risks arising from different ways in which service is used and impact of such use on level of risk of such harm | 9(5)(f) and (g) |
The various harm-based illegality risk assessment duties feed into the substantive harm mitigation and management duty of S.10(2)(c). That duty stands independently of the three content-based duties and the one non-content-based illegality duty, as can be seen from the tables and is illustrated in this visualisation.
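For readers who find it easier to see the structure as data, the following is a minimal, purely illustrative sketch in Python restating the categorisation in the tables above and the feed-in relationship just described. Every name in it (Category, Duty, DUTIES and so on) is my own label for the purposes of the illustration, not terminology used in the Act or by Ofcom, and the risk assessment entries shown are only a selection.

```python
# Purely illustrative sketch of the categorisation in the tables above.
# All names below are my own labels, not terms used in the Act or by Ofcom.
from dataclasses import dataclass
from enum import Enum, auto


class Category(Enum):
    CONTENT_BASED = auto()
    NON_CONTENT_BASED = auto()
    HARM_BASED = auto()       # covers both content-related and non-content-related variants


@dataclass(frozen=True)
class Duty:
    section: str              # OSA section reference, as in the tables above
    category: Category
    substantive: bool         # True = substantive duty; False = risk assessment duty


DUTIES = [
    # Substantive duties (first table)
    Duty("10(2)(a)", Category.CONTENT_BASED, True),       # prevent encountering priority illegal content
    Duty("10(3)(a)", Category.CONTENT_BASED, True),       # minimise time priority illegal content is present
    Duty("10(3)(b)", Category.CONTENT_BASED, True),       # swiftly take down illegal content on becoming aware
    Duty("10(2)(b)", Category.NON_CONTENT_BASED, True),   # mitigate risk of use for priority offences
    Duty("10(2)(c)", Category.HARM_BASED, True),          # mitigate risks of harm identified in risk assessment
    # A selection of the risk assessment duties (second table)
    Duty("9(5)(b)", Category.CONTENT_BASED, False),       # risk of users encountering illegal content
    Duty("9(5)(c)", Category.NON_CONTENT_BASED, False),   # risk of use for commission/facilitation of priority offence
    Duty("9(5)(d) and (g)", Category.HARM_BASED, False),  # nature and severity of harm presented by illegal content
    Duty("9(5)(f) and (g)", Category.HARM_BASED, False),  # harm linked to the different ways the service is used
]

# The harm-based risk assessment outputs are the ones that feed the
# substantive S.10(2)(c) mitigation duty.
feeds_10_2_c = [d.section for d in DUTIES
                if not d.substantive and d.category is Category.HARM_BASED]

if __name__ == "__main__":
    print("Harm-based risk assessment duties feeding S.10(2)(c):", feeds_10_2_c)
```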
Content or systems and processes?
In March last year the Ofcom CEO drew a contrast between a systems and processes regime and a content regime, suggesting that the OSA is about the former and not really the latter (see Introduction).
The idea that the Act is about systems and processes, not content, prompts a close look at the differences in framing of the substantive illegality duties. As can be seen from the first table above, the proactive prevention duty in S.10(2)(a) requires the service provider:
“to take or use proportionate measures relating to the design or operation of the service to… prevent individuals from encountering priority illegal content by means of the service” (emphasis added)
Thus a measure recommended by Ofcom to comply with this duty could be limited to the design of the service, reflecting the ‘safe by design’ ethos mentioned in Section 1(3) of the Act. As such, the S.10(2)(a) duty, although content-based, has no necessary linkage to assessing the illegality of individual items of user content posted to the service. That possibility, however, is not excluded. A recommended proactive measure could involve proactively detecting and blocking individual items, as contemplated by the section’s reference to operational measures.
In contrast, the content removal duty in S.10(3)(b) is specifically framed in terms of use of operational systems and processes:
“(3) A duty to operate a service using proportionate systems and processes designed to … (b) where the provider is alerted by a person to the presence of any illegal content, or becomes aware of it in any other way, swiftly take down such content” (emphasis added)
In view of this drafting it is not surprising that, notwithstanding the reference to proportionate systems and processes, Ofcom in several places characterises the Act as imposing a simple duty to remove illegal content:
"When services make an illegal content judgement in relation to particular content and have reasonable grounds to infer that the content is illegal, the content must however be taken down." [para 26.14, Illegal Content Judgements Guidance discussion]
“Within the illegal content duties there are a number of specific duties. As part of the illegal content safety duty at section 10(3)(b) of the Act, there is a duty for a user-to-user service to “swiftly take down” any illegal content when it is alerted to the presence of it (the ‘takedown duty’).” [para A1.14, draft Illegal Content Judgements Guidance]
“As in the case of priority illegal content, services are required to take down relevant nonpriority illegal content swiftly when they are made aware of it.” [para 2.41, Volume 1]
“Where the service is alerted by a person to the presence of any illegal content, or becomes aware of it in any other way, services must swiftly take down such content (section 10(3)(b)).” [para 12.6, Volume 4]
The framing of the S.10(3)(b) duty is tightly focused on the removal actions to be taken by a service provider in relation to individual items of content when it becomes aware of them. The service provider must have proportionate systems and processes designed to achieve the result (take down), and must use them in the operation of the service.
This duty inescapably requires individual judgements to be made. It does not, however, from a supervision and enforcement point of view, require a service provider to get every individual judgement right.
From the perspective of the service provider the focus of Ofcom’s supervision and enforcement will indeed be on its systems and processes:
“It is important to make clear that, as the regulator, Ofcom will not take a view on individual pieces of online content. Rather, our regulatory approach is to ensure that services have the systems and processes in place to meet their duties.” [Overview, p.17; Volume 4, p.19]
“In line with the wider approach to delivering online safety, supervision will focus on the effectiveness of services’ systems and processes in protecting their users, not on individual pieces of content.” [para 30.5, Volume 6]
That does not mean, however, that (as Ofcom appears to suggest in the following extract) the regulatory regime is not about regulating individual user content:
“It is important to note that the Online Safety regime is about service providers’ safety systems and processes, not about regulating individual content found on such services. The presence of illegal content or content that is potentially harmful to children does not necessarily mean that a service provider is failing to fulfil its duties in the Act. We would not therefore be likely to take action solely based on a piece of harmful content appearing on a regulated service.” [Enforcement Guidance, para A3.6]
The purpose of at least the operational content-based aspects of the OSA regime is to harness the control that intermediary service providers can exercise over their users in order (indirectly) to regulate individual items of user content. That a service provider may not be penalised for an individual misjudgement does not alter the position: the provider still has to have a system in place for making judgements, and the user in question stands to be affected when their individual post is removed as a result of a misjudgement. The operational content-based duties exist and are about regulating individual content.
Parenthetically, it is also perhaps doubtful whether Ofcom can, in reality, perform its role entirely in the abstract and avoid making its own judgements on individual items of content. For instance, if prevalence of illegal material on a service is relevant to Ofcom’s enforcement priority assessment, how could that be determined without evaluating whether individual items of user content on the service are illegal?
Some may believe that Ofcom's approach is too heavily weighted towards individual content judgements by service providers. However, it is difficult to see how such content-based measures can be avoided when content-based duties (removal and takedown of individual items of illegal content in the operation of the service) are hardwired into the Act, requiring the exercise of judgement to determine whether the content in question is or is not illegal.
That said, in the area of the S.10(2) proactive preventive measures Ofcom has, as already alluded to, much broader scope to assess proportionality and decide what kinds of measures to recommend (or not) and what safeguards should accompany them. That will be the subject of Part 4 of this series.