Online Safety Bill (Seventeenth sitting)

Kirsty Blackman (Aberdeen North) (SNP)

The new clause is drafted in that way because I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I take to mean that he is not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggested changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, they will be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way as they apply to any assessment relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

John Nicolson (Ochil and South Perthshire) (SNP)

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss the following:

New clause 39—Safety duties protecting adults—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“adults’ risk assessment” has the meaning given by section 12;

“non-designated content that is harmful to adults” means content that is harmful to adults other than priority content that is harmful to adults.”

This new clause applies safety duties protecting adults to regulated providers of pornographic content.

New clause 40—Duties to prevent users from encountering illegal content—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to operate an internet service using proportionate systems and processes designed to—

(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;

(b) minimise the length of time for which the priority illegal content referred to in paragraph (a) is present;

(c) where the provider is alerted by a person to the presence of the illegal content referred to in paragraph (a), or becomes aware of it in any other way, swiftly take down such content.

(3) A duty to operate systems and processes that—

(a) verify the identity and age of all persons depicted in the content;

(b) obtain and keep on record written consent from all persons depicted in the content;

(c) only permit content uploads from verified content providers, with a robust process for verifying the age and identity of the content provider;

(d) ensure that all uploaded content is reviewed before publication to confirm that the content is not illegal and does not otherwise violate the terms of service;

(e) ensure that uploaded content is not marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;

(f) offer the ability for any person depicted in the content to appeal for the removal of the content in question.”

This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.

John Nicolson

Big porn, or the global online pornography industry, is a proven driver of big harms. It causes the spread of image-based sexual abuse and child sexual abuse material. It normalises sexual violence and harmful sexual attitudes and behaviours, and it offers children easy access to violent, sexist and racist sexual content, which is proven to cause them a whole range of harms. In part, the Government recognised how harmful pornography can be to children by building one small aspect of pornography regulation into the Bill.

The Bill is our best chance to regulate the online pornography industry, which it currently does not mention. Over two decades, the porn industry has shown that it cannot be trusted to regulate itself. Vanessa Morse, the head of the Centre to End All Sexual Exploitation, said:

“If we fail to see the porn industry as it really is, efforts to regulate will flounder.”

If the Minister has not yet read CEASE’s “Expose Big Porn” report, I recommend that he does so. The report details some of the harrowing harms that are proliferated by porn companies. Importantly, these harms are being done with almost zero scrutiny. We all know who the head of Meta or the chief executive officer of Google is, but can the Minister tell me who is in charge of MindGeek? This company dominates the market, yet it is almost completely anonymous—or at least the high heid yins of the company are.

New clause 38 seeks to identify pornography websites as providers of category 1 services, introduce a relevant code of practice and designate a specific regulator, in order to ensure compliance. Big porn must be made to stop hosting illegal extreme porn and the legal but harmful content prohibited by its own terms of service. Social media platforms may seem indifferent to harm taking place on their sites, but they pale in comparison with porn sites, which will do the absolute minimum that they can. To show the extent of the horrible searches allowed, one video found by CEASE was titled “Oriental slave girl tortured”. I will not read out some of the other titles in the report, but there are search terms that promote non-consensual activity, violence, incest and racial slurs. For example, “Ebony slave girl” is a permitted term. This is just one of many examples of damaging content on porn sites, which are perpetuating horrific sexual practices that, sadly, are too often viewed by children.

Over 80% of the UK public would support strict new porn laws; there is a real appetite among the public for such laws. The UK Government must not pass up this opportunity to regulate big porn. Such regulation is long overdue.