The Committee consisted of the following Members:
Chairs: Sir Roger Gale, † Christina Rees
† Ansell, Caroline (Eastbourne) (Con)
† Bailey, Shaun (West Bromwich West) (Con)
† Blackman, Kirsty (Aberdeen North) (SNP)
† Carden, Dan (Liverpool, Walton) (Lab)
† Davies-Jones, Alex (Pontypridd) (Lab)
† Double, Steve (St Austell and Newquay) (Con)
† Fletcher, Nick (Don Valley) (Con)
† Holden, Mr Richard (North West Durham) (Con)
† Keeley, Barbara (Worsley and Eccles South) (Lab)
† Leadbeater, Kim (Batley and Spen) (Lab)
† Miller, Dame Maria (Basingstoke) (Con)
† Mishra, Navendu (Stockport) (Lab)
Moore, Damien (Southport) (Con)
† Nicolson, John (Ochil and South Perthshire) (SNP)
† Philp, Chris (Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)
† Russell, Dean (Watford) (Con)
† Stevenson, Jane (Wolverhampton North East) (Con)
Katya Cassidy, Kevin Maddison, Seb Newman, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 28 June 2022
(Afternoon)
[Christina Rees in the Chair]
Online Safety Bill
New Clause 26
Report on synthetic media content harms
“(1) The Secretary of State must publish and lay before Parliament a report on the harms caused to users by synthetic media content appearing on regulated services.
(2) The report must contain analysis of the harms caused specifically to individuals working in the entertainment industry, including, but not limited to, infringements of their intellectual property rights.
(3) The report must be published within six months of this Act being passed.
(4) In this section, “synthetic media content” means any content that has been produced or modified by automated means.”—(Alex Davies-Jones.)
This new clause would require the Secretary of State to publish and lay before Parliament a report on the harms caused to users by synthetic media content (aka “deepfakes”). The report must contain particular reference to the harms caused to those working in the entertainment industry.
Brought up, read the First time, and motion made (this day), That the clause be read a Second time.
14:00
Alex Davies-Jones (Pontypridd) (Lab)

Before we adjourned, I was discussing the Government’s national artificial intelligence strategy and the two separate consultations launched by the Government to look at the intellectual property system in relation to AI. In those consultations, the Intellectual Property Office recognised that AI

“is playing an increasing role in...artistic creativity.”

However, specific questions about reviewing or enhancing performers’ rights were notably absent from both Government consultations. If the UK Government really want to make Britain a global AI and creative superpower, strengthening the rights of performers and other creatives must be at the heart of the national AI strategy.

Another key challenge is that our intellectual property framework is desperately out of date. Currently, performers have two sets of rights under the Copyright, Designs and Patents Act 1988: the right to consent to the making of a recording of a performance; and the right to control the subsequent use of such recordings, such as the right to make copies. However, as highlighted by Dr Mathilde Pavis, senior lecturer in law at the University of Exeter, AI-made performance synthetisation challenges our intellectual property framework because it reproduces performances without generating a recording or a copy, and therefore falls outside the scope of the Act. An unintended consequence is that people are left vulnerable to abuse and exploitation. Without effective checks and balances put in place by the Government, that will continue. That is why 93% of Equity members responding to a recent survey stated that the Government should introduce a new legal protection for performers, so that a performance cannot be reproduced by AI technology without the performer’s consent.

Advances in AI, including deepfake technology, have reinforced the urgent need to introduce image rights—also known as personality rights or publicity rights. That refers to

“the expression of a personality in the public domain”,

such as an individual’s name, likeness or other personal indicators. Provision of image rights in law enables performers to safeguard meaningful income streams, and to defend their artistic integrity, career choices, brand and reputation. More broadly, for society, it is an important tool for protecting privacy and allowing an individual to object to the use of their image without consent.

In the UK, there is no codified law of image rights or privacy. Instead, we have a patchwork of statutory and common-law causes of action, which an individual can use to protect various aspects of their image and personality. However, none of that is fit for purpose. Legal provision for image rights can be found around the world, so the Government here can and should do more. For example, some American states recognise the right through statute, and others through common law. California has both statutory and common-law strains of authority, which protect slightly different forms of the right.

The Celebrities Rights Act of 1985 was passed in California and extended the personality rights for a celebrity to 70 years after their death. In 2020, New York State passed a Bill that recognised rights of publicity for “deceased performers” and “deceased personalities”. Guernsey has created a statutory regime under which image rights can be registered. The legislation centres on the legal concept of a “personnage”—the person or character behind a personality that is registered. The image right becomes a property right capable of protection under the legislation through registration, which enables the image right to be protected, licensed and assigned.

The Minister will know that Equity is doing incredible work to highlight the genuine impact that this type of technology is having on our creative industry and our performers. He must therefore see the sense in our new clause, which would require the Government at least to consider the matter of synthetic media content, which thus far they have utterly failed to do.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship again, Ms Rees. I thank the shadow Minister, the hon. Member for Pontypridd, for raising the issues that she has done about synthetic and digitally manipulated content, which we are very conscious of. We are conscious of the risk of harm to those who work in the entertainment industry and of course, in particular, to victims of deepfake pornography.

We take intellectual property infringement extremely seriously. The Government have recently published a counter-infringement strategy, setting out a range of steps that we intend to take to strengthen the whole system approach to tackling infringement of intellectual property rights. It is widely acknowledged that the United Kingdom has an intellectual property framework that is genuinely world leading and considered among the best in the world. That includes strong protections for performers’ rights. We intend that to continue. However, we are not complacent and the law is kept under review, not least via the counter-infringement strategy I mentioned a moment ago.

Harmful synthetic media content, including the deepfakes that the hon. Member for Pontypridd mentioned, is robustly addressed by the safety duties set out in the Bill in relation to illegal content—much deepfake content, if it involves creating an image of someone, would be illegal—as well as content that could be harmful to children and content that will be on the “legal but harmful” adult list. Those duties will tackle the most serious and illegal forms of deepfake and will rightly cover certain threats that undermine our democracy. For example, a manipulated media image that contained incitement to violence, such as a deepfake of a politician telling people to attack poll workers because they are rigging an election, would obviously already fall foul of the Bill under the illegal duties.

In terms of reporting and codes of practice, the Bill already requires Ofcom to produce codes of practice setting out the ways in which providers can take steps to reduce the harm arising from illegal and harmful content, which could include synthetic media content such as deepfakes where those contain illegal content.

Alex Davies-Jones

The Minister uses the example of a deepfake of a politician inciting people to attack poll workers during an election. Given that some of the technology is so advanced that it is really difficult to spot when deepfakes actually occur, could it be argued that Ofcom as regulator, or even the platforms themselves, would be averse to removing or reporting the content, as it could fall foul of the democratic content exemption in the Bill?

Chris Philp

The democratic content protection that the shadow Minister refers to, in clause 15, is not an exemption; it is a duty to take into account content of democratic importance. That is on line 34 of page 14. When making a decision, it has to be taken into account—it is not determinative; it is not as if a politician or somebody involved in an election gets a free pass to say whatever they like, even if it is illegal, and escapes the provisions of the Bill entirely. The platform simply has to take it into account. If it was a deepfake image that was saying such a thing, the balancing consideration in clause 15 would not even apply, because the protection applies to content of democratic importance, not to content being produced by a fake image of a politician.

Alex Davies-Jones

It is important that we get this right. One of our concerns on clause 15, which we have previously discussed, relates to this discussion of deepfakes, particularly of politicians, and timeframes. I understand the Minister’s point on illegal content. If there is a deepfake of a politician—on the eve of poll, for example—widely spreading disinformation or misinformation on a platform, how can the Minister confidently say that that would be taken seriously, in a timely manner? That could have direct implications for a poll or an election. Would the social media companies have the confidence to take that content down, given clause 15?

Chris Philp

The protections in clause 15—they are not exemptions—would only apply to content that is of bona fide, genuine democratic importance. Obviously, a deepfake of a politician would not count as genuine democratic content, because it is fake. If it was a real politician, such as the hon. Lady, it would benefit from that consideration. If it was a fake, it would not, because it would not be genuine content of democratic importance.

It is also worth saying that if—well, I hope when—our work with the Law Commission to review the criminal law related to the non-consensual taking and sharing of intimate images is taken forward, that will then flow into the duties in the Bill. Deepfakes of intimate images are rightly a concern of many people. That work would fall into the ambit of the Bill, either via clause 52, which points to illegal acts where there is an individual victim, or via schedule 7, if a new intimate image abuse offence were added to it as a priority offence. There are a number of ways in which deepfakes could fall into the ambit of the Bill, including if they relate to extreme pornography.

The new clause would require the production of a report, not a change to the substantive duties in the Bill. It is worth saying that the Bill already provides Ofcom with powers to produce and publish reports regarding online safety matters. Those powers are set out in clause 137. The Bill will ensure that Ofcom has access to the information required to prepare those reports, including information from providers about the harm caused by deepfakes and how companies tackle the issue. We debated that extensively this morning when we talked about the strong powers that already exist under clause 85.

The hon. Lady has raised important points about intellectual property, and I have pointed to our counter-infringement strategy. She raised important points about deepfakes both in a political context and, especially, in the context of intimate images being generated by AI. I hope I have set out how the Bill addresses concerns in those areas. The Bill as drafted addresses those important issues in a way that is certainly adequate.

Alex Davies-Jones

I welcome the Minister’s comments and I am grateful for his reassurance on some of the concerns that were raised. At this stage we will not press the matter to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 27

OFCOM: power to impose duties on regulated services

“OFCOM: power to impose duties on regulated services

(1) OFCOM may carry out an assessment of the risk of harm posed by any regulated service.

(2) Where OFCOM assess a service to pose a very high risk of harm, OFCOM may, notwithstanding the categorisation of the service or the number or profile of its users, impose upon the service duties equivalent to—

(a) the children’s risk assessment duties set out in sections 10 and 25 of this Act; and

(b) the safety duties protecting children set out in sections 11 and 26 of this Act.”—(Kirsty Blackman.)

This new clause enables Ofcom to impose on any regulated service duties equivalent to the children’s risk assessment duties and the safety duties protecting children.

Brought up, and read the First time.

Kirsty Blackman (Aberdeen North) (SNP)

I beg to move, That the clause be read a Second time.

This is another attempt to place a higher bar and more requirements on regulated services that are likely to cause the most serious risks of harm. The Minister has consistently said that he is keen to consider regulating the companies and platforms that have the highest potential risk of harm more strictly than the normal regime would allow. Some of the platforms would not be category 1 on the basis that they have a small number of members, but the potential for harm—radicalisation, extremism, severe damage to people or extreme pornography—is very high.

I am not yet happy that the Minister has provided an adequate answer to the question about the regulation of the highest-risk platforms that do not meet the category 1 thresholds. If he is unwilling to accept this amendment or any of the other amendments tabled by the Opposition on this specific issue, I hope that he will give consideration to a Government amendment on Report or when the Bill goes through the House of Lords in order that this loose end can be tied up.

As I have said before—I do not want to go too much over comments that I have made previously—it is reasonable for us to have a higher bar and a stricter regulation regime for specific platforms that Ofcom will easily be able to identify and that create the highest harm. Again, as I have said, this is another way of going about it. The new clause suggests that if Ofcom assesses that a service poses a very high risk of harm, it might, notwithstanding the categorisation of that service, require it to perform the children’s risk assessment duties and the safety duties protecting children. This is specifically about the children’s risk assessment.

I have previously raised concerns about not being able to accurately assess the number of child users that a service has. I am still not entirely comfortable that platforms will be able to accurately assess the number of child users they have, and therefore they might not be subject to the child user requirements, because they have underplayed or understated the number of children using their service, or because there are only a few hundred children using the service, which is surely massively concerning for the wellbeing of those few hundred children.

I hope the Minister can give us some comfort that he is not just considering what action to take, but that he will take some sort of action on Report or when the Bill proceeds through the House of Lords.

Barbara Keeley (Worsley and Eccles South) (Lab)

It is a pleasure to serve with you in the Chair again, Ms Rees. I rise to speak in support of new clause 27.

We have argued that the Government’s approach to categorising services fails to take account of the harms that could result from smaller services. I understand that a risk-based approach rather than a size-based approach is being considered, and that is welcome. The new clause would go some way to improving the categorisation of services as it stands. It is critical that there are ways for Ofcom to assess companies’ risk of harm to users and to place additional duties on them even when they lie outside the category to which they were initially assigned. Ofcom should be able to consult any organisation that it sees fit to consult, including user advocacy groups and civil society, in assessing whether a service poses

“a very high risk of harm”.

Following that, Ofcom should have powers to deliver the strictest duties on companies that expose adults to the most dangerous harms. That should always be proportionate to the risk of harm.

Labour supports the new clause and the arguments made by the hon. Member for Aberdeen North.

14:15
Chris Philp

I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.

It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out that one of two tests must be met for the child user condition to be met. The condition is met if

“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”

When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.

New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.

However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.

Kirsty Blackman

I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.

I wish to press the new clause to a Division.

Question put, That the clause be read a Second time.

Division 68

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 28
Empowerment features for child users
“(1) This section applies where a Part 3 service has empowerment features for adults of a type described in section 14(2).
(2) OFCOM may require a service to provide equivalent features designed specifically for child users.
(3) Where OFCOM places a requirement on a service under subsection (2) it must provide guidance to the service on how to ensure the features are easily accessible and understandable for children.”—(Kirsty Blackman.)
This new clause enables Ofcom to require services to provide empowerment features for child users.
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

The new clause attempts to address an asymmetry in the Bill in relation to the lack of user empowerment features for child users. As far as I am aware, there is no requirement for user empowerment functions for child users in the Bill. The new clause would require that if a service has to have user empowerment features in place for adults, then

“OFCOM may require a service to provide equivalent features designed specifically for child users.”

Ofcom would be able then to provide guidance on how those user empowerment features for child users would work.

This provision is especially important for the fairly small number of platforms and providers that are very much aimed at children, and where the vast majority of users are children. We are not talking about Facebook, for example, although if Facebook did have child user empowerment, it would be a good thing. I am thinking about organisations and games such as Roblox, which is about 70% children; Fortnite, although it has quite a lot of adult users too; and Minecraft, which has significant numbers of child users. On those platforms that are aimed at children, not having a child-centred, child-focused user empowerment requirement is an oversight. It is missing from the Bill.

It is important that adults have the ability to make privacy choices about how they use sites and to make choices about some of the content that they can see on a site by navigating the user empowerment functions that exist. But it is also important for children to have that choice. I do not see why adults should be afforded that level of choice and flexibility over the way that they use platforms and the providers that they engage with, but children should not. We are not just talking here about kids who are eight: we are talking about children far older, and for whom adult-centred, adult-written user empowerment functions may not be the best option or as easy to access as ones that are specifically focused on and designed for children.

I have had a discussion with the National Society for the Prevention of Cruelty to Children about the user empowerment functions for child users. We have previously discussed the fact that complaints features have to be understandable by the users of services, so if the Minister is unwilling to accept the new clause, will he give some consideration to what happens when the provider of the platform is marketing that platform to children?

The Roblox website is entirely marketed as a platform for children. It is focused in that way, so will the Minister consider whether Ofcom should be able to require differential user empowerment functions, particularly in cases where the overwhelming majority of users are children? Also, it would not be beyond the wit of man for platforms such as Facebook to have two differential user empowerment functions based on whether somebody is under the age of 18—whether they are a child or an adult—because users tell Facebook their date of birth when signing up. We have talked a lot about age verification and the ways in which that could work.

I would appreciate it if the Minister would consider this important matter. It is something that is lacking at the moment, and we are doing our children a disservice by not providing them with the same functionality that we are providing, or requiring, for adult users.

Barbara Keeley

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children’s risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected a lot more than adults through the child risk assessment duties and child safety duties. Therefore, they do not need the user empowerment provisions, because all of them, regardless of whether they choose to be verified or not, are already being protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

14:29
The hon. Lady referred to submissions made by the NSPCC. If they have an argument that advances a different line of reasoning or suggests that what I have just said is in some way flawed, I would be very happy to look at that. She has my email address, and she is very welcome to send that through.
However, on my reading of the Bill as it stands, because of the existing strong protections for children, they do not need to also benefit from the user empowerment duties as set out. Of course, there are also some questions around data protection and safeguarding if children end up self-identifying on a public basis. That is why they are omitted. I hope that makes sense, but I would be happy to read any further submission if she has one.
Kirsty Blackman

It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.

If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.

I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.

Chris Philp

Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create

“functionalities allowing for control over content that is encountered, especially by children”.

Therefore, there is a facility to require the platforms to create the kind of functionalities that relate, as that subsection is drafted, not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady’s concern?

Kirsty Blackman

That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.

I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister’s comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 69

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 29
Accessibility to adult users with learning disabilities
“(1) This section applies to the following functions—
(a) any user empowerment features provided under section 14;
(b) any content reporting systems or processes under section 17 or section 27;
(c) any complaints procedure under section 18 or section 28.
(2) The service must, as part of its compliance with any duties under the sections listed in subsection (1), ensure that the functions are accessible and understandable to adult users with learning disabilities.”—(Kirsty Blackman.)
This new clause requires complaints, user empowerment and user reporting functions to be accessible and understandable to adult users with learning disabilities.
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I mentioned this in earlier consideration. The issue was raised with me by Mencap, specifically in relation to the people it represents who have learning disabilities and who have a right to access the internet just as we all do. They should be empowered to use the internet with a level of safety and be able to access complaints, to make content reports and to use the user empowerment functions. Everybody who is likely to use the platforms should be able to access and understand those functions.

Will the Minister make it clear that he expects Ofcom, when drafting guidance about the user empowerment functions and their accessibility, the content reporting and the complaints procedures, to consult people about how those things work? Will he make it clear that he hopes Ofcom will take into account the level of accessibility? This is not just about writing things in plain English—or whatever that campaign is called, about writing things in a way that people can understand—it is about actually speaking to groups that represent people with learning disabilities to ensure that content reporting, the empowerment functions and the complaints procedures are accessible, easy to find and easy to understand, so that people can make the complaints that they need to make and can access the internet on an equal and equitable basis.

Barbara Keeley

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in “a service in compliance” with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So much of the services that people use do not take account of people’s learning disabilities. I have done a huge amount of work to try to support people with learning disabilities over the years. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

Kirsty Blackman

I thank the Minister, particularly for providing the clarification that I asked for about who is likely to be consulted or taken into account when Ofcom is writing the codes of practice. Notwithstanding that, and particularly given the rather excellent speech from the shadow Minister, the hon. Member for Worsley and Eccles South, I am keen to press the new clause to a vote.

Question put, That the clause be read a Second time.

Division 70

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 36
Communication offence for encouraging or assisting self-harm
“(1) In the Suicide Act 1961, after section 3 insert—
“3A Communication offence for encouraging or assisting self-harm
(1) A person (“A”) commits an offence if—
(a) A sends a message,
(b) the message encourages or could be used to assist another person (“B”) to inflict serious physical harm upon themselves, and
(c) A’s act was intended to encourage or assist the infliction of serious physical harm.
(2) The person referred to in subsection (1)(b) need not be a specific person (or class of persons) known to, or identified by, A.
(3) A may commit an offence under this section whether or not any person causes serious physical harm to themselves, or attempts to do so.
(4) A person guilty of an offence under this section is liable—
(a) on summary conviction, to imprisonment for a term not exceeding 12 months, or a fine, or both;
(b) on indictment, to imprisonment for a term not exceeding 5 years, or a fine, or both.
(5) “Serious physical harm” means serious injury amounting to grievous bodily harm within the meaning of the Offences Against the Person Act 1861.
(6) No proceedings shall be instituted for an offence under this section except by or with the consent of the Director of Public Prosecutions.
(7) If A arranges for a person (“A2”) to do an Act and A2 does that Act, A is also to be treated as having done that Act for the purposes of subsection (1).
(8) In proceedings for an offence to which this section applies, it shall be a defence for A to prove that—
(a) B had expressed intention to inflict serious physical harm upon themselves prior to them receiving the message from A;
(b) B’s intention to inflict serious physical harm upon themselves was not initiated by A; and
(c) the message was wholly motivated by compassion towards B or to promote the interests of B’s health or wellbeing.””—(Kirsty Blackman.)
14:45
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 71

Ayes: 7


Labour: 5
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 37
The Digital Regulation Committee
“(1) There shall be a Committee, to be known as the Digital Regulation Committee and in this section referred to as “the Committee”, to undertake the following functions in connection with the provisions of this Act—
(a) to review all codes of practice and any other relevant publication produced by OFCOM; and
(b) to monitor and report on any other matter relevant to the functioning of this Act.
(2) The Committee may publish reports in connection with its activities under subsection (1).
(3) The Secretary of State must—
(a) respond to the recommendations contained in any report by the Committee within three months; and
(b) publish and lay copies of their response in both Houses of Parliament.
(4) The Committee shall consist of twelve members—
(a) who shall be drawn from both the House of Commons and from members of the House of Lords; and
(b) none of whom shall be a Minister of the Crown.
(5) The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State.
(6) Details of the tenure of office of members of, the procedure of and other matters relating to, the Committee shall be set out in regulations made by the Secretary of State.
(7) A statutory instrument containing regulations under this section may not be made unless a draft of the instrument has been laid before and approved by resolution of each House of Parliament.”—(Kirsty Blackman.)
Brought up, and read the First time.
Kirsty Blackman

I beg to move, That the clause be read a Second time.

I drafted this new clause following a number of conversations and debates that we had in Committee about how the Act will be scrutinised. How will we see whether the Act is properly achieving what it is supposed to achieve? We know that there is currently a requirement in the Bill for a review to take place but, as has been mentioned already, that is a one-off thing; it is not a rolling update on the efficacy of the Act and whether it is achieving the duties that it is supposed to achieve.

This is particularly important because the Bill gives the Secretary of State powers to make changes to parts of the Act. Presumably the Government would not have put that in if they did not think there was a possibility or a likelihood that changes would have to be made to the Act at some future point. The Bill is certainly not perfect, but even from the Government’s point of view it is not perfect for all time. There is a requirement for the Act to be updated; it will have to change. New priority harms may have to be added. New details about different illegal acts may have to be added to the duties. That flexibility is given, and the Secretary of State has that flexibility in a number of cases.

If the Act were just going to be a standing thing, if it were not going to be updated, it would never be future-proof; it would never work in the changing world that we have. We know that this legislation has taken a very long time to get here. We have been sadly lacking in significant regulation in the online world for more than 20 years, certainly. For a very long time we have not had this. Now that the Act is here—or it will be once the Bill passes through both Houses of Parliament—we want it to work.

That is the point of every amendment we have tabled: we are trying to make the Bill better so that it works and can keep people as safe as possible. At the moment, we do not know how safe the internet will be as a result of the Bill. Even once it begins to be implemented, we will not have enough information on the improvements it has created to be able to say, “Actually, this was a world-leading piece of legislation.”

It may be that the digital regulation committee that I am suggesting in this new clause has a look regularly at the implementation of the Bill going forward and says, “Yep, that’s brilliant.” The committee might look at the implementation and the increasing time we spend online, with all the harms that can come with that, and says, “Actually, you need to tweak that a bit” or, “That is not quite fulfilling what it was intended to.” The committee might also say, “This brand new technology has come in and it is not entirely covered by the Act as it is being implemented.” A digital regulation committee was proposed by the Joint Committee, I think, to scrutinise implementation of the legislation.

The Government will say that they will review—they always do. I have been in so many Delegated Legislation Committees that involve the Treasury and the Government saying, “Yes, we keep everything under review—we always review everything.” That line is used in so many of these Committees, but it is just not true. In January I asked the Department for Digital, Culture, Media and Sport

“how many and what proportion of (a) primary and (b) secondary legislation sponsored by (i) their Department…has undergone a post legislative review”.

It was a written question I put to a number of Departments including DCMS. The reply I got from the Minister here was:

“The number of post legislative reviews the Department has undertaken on primary and secondary legislation in each of the last five years is not held within the Department.”

The Government do not even know how many pieces of primary or secondary legislation they have reviewed. They cannot tell us that all of them have been reviewed. Presumably, if they could tell us that all of them have been reviewed, the answer to my written question would have been, “All of them.” I have a list of the number they sponsored. It was six in 2021, for example. If the Department had reviewed the implementation of all those pieces of legislation, I would expect it to be shouting that from the rooftops in response to a written question. It should be saying, “Yes, we are wonderful. We have reviewed all these and found that most of them are working exactly as we intended them to.”

I do not have faith in the Government or in DCMS—nor pretty much in any Government Department. I do not have faith in their ability or intention to adequately and effectively review the implementation of this legislation, to ensure that the review is done timeously and sent to the Digital, Culture, Media and Sport Committee, or to ensure those proper processes that are supposed to be in place are actually in place and that the Bill is working.

It is unfortunate for the Minister that he sent me that reply earlier in the year, but I only asked the question because I was aware of the significant lack of work the Government are doing on reviewing whether or not legislation has achieved its desired effect, including whether it has cost the amount of money they said it would, whether it has kept the number of people safe that they said it would, and whether it has done what it needs to do.

I have a lack of faith in the Government generally, but specifically on this issue because of the shifting nature of the internet. This is not to take away from the DCMS Committee, but I have sat on a number of Select Committees and know that they are very busy—they have a huge amount of things to scrutinise. This would not stop them scrutinising this Act and taking action to look at whether it is working. It would give an additional line of scrutiny, transparency and defence, in order to ensure that this world-leading legislation is actually world-leading and keeps people safe in the way it is intended to.

Alex Davies-Jones

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for it. My support for this new clause is not meant as any disrespect to that Committee. It is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.

Kirsty Blackman

On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.

Chris Philp

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could either be done in statute, as the new clause seeks, or it could be done by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

Dean Russell (Watford) (Con)

As another member of the Joint Committee, I totally understand the reasoning. I want to put on record my support for setting up a Committee through the approach the Minister mentioned of using Standing Orders. I will not support the new clause, but I strongly support the Joint Committee continuing in some form to enable scrutiny. When we look forward to the metaverse, virtual reality and all the things that are coming, it is important that that scrutiny continues. No offence to Opposition colleagues, but I do not think the new clause is the right way to do that. However, the subject is worth further exploration, and I would be very supportive of that happening.

15:00
Chris Philp

First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.

One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:

“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”

I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.

That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a process separate from the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so there is no enormous time pressure on it.

A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.

Kirsty Blackman

The reason the new clause is drafted in that way is that I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting that he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders—which I understand have to be introduced by the Leader of the House or the Cabinet Office—they will be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(6) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(7) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way as they apply to any such assessment relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

John Nicolson (Ochil and South Perthshire) (SNP)

I beg to move, That the clause be read a Second time.

The Chair

With this it will be convenient to discuss the following:

New clause 39—Safety duties protecting adults—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).

(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.

(4) These are the kinds of treatment of content referred to in subsection (3)—

(a) taking down the content;

(b) restricting users’ access to the content.

(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—

(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and

(b) any other provisions of the terms of service designed to mitigate or manage those risks.

(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—

(a) are clear and accessible, and

(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.

(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—

(a) the kinds of such content identified, and

(b) the incidence of those kinds of content on the service.

(8) In this section—

“adults’ risk assessment” has the meaning given by section 12;

“non-designated content that is harmful to adults” means content that is harmful to adults other than priority content that is harmful to adults.”

This new clause applies safety duties protecting adults to regulated provider pornographic content.

New clause 40—Duties to prevent users from encountering illegal content—

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to operate an internet service using proportionate systems and processes designed to—

(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;

(b) minimise the length of time for which the priority illegal content referred to in paragraph (a) is present;

(c) where the provider is alerted by a person to the presence of the illegal content referred to in paragraph (a), or becomes aware of it in any other way, swiftly take down such content.

(3) A duty to operate systems and processes that—

(a) verify the identity and age of all persons depicted in the content;

(b) obtain and keep on record written consent from all persons depicted in the content;

(c) only permit content uploads from verified content providers and must have a robust process for verifying the age and identity of the content provider;

(d) all uploaded content must be reviewed before publication to ensure that the content is not illegal and does not otherwise violate its terms of service;

(e) uploaded content must not be marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;

(f) the service must offer the ability for any person depicted in the content to appeal to remove the content in question.”

This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.

John Nicolson

Big porn, or the global online pornography industry, is a proven driver of big harms. It causes the spread of image-based sexual abuse and child sexual abuse material. It normalises sexual violence and harmful sexual attitudes and behaviours, and it offers children easy access to violent, sexist and racist sexual content, which is proven to cause them a whole range of harms. In part, the Government recognised how harmful pornography can be to children by building one small aspect of pornography regulation into the Bill.

The Bill is our best chance to regulate the online pornography industry, which it currently does not mention. Over two decades, the porn industry has shown that it cannot be trusted to regulate itself. Vanessa Morse, the head of the Centre to End All Sexual Exploitation, said:

“If we fail to see the porn industry as it really is, efforts to regulate will flounder.”

If the Minister has not yet read CEASE’s “Expose Big Porn” report, I recommend that he does so. The report details some of the harrowing harms that are proliferated by porn companies. Importantly, these harms are being done with almost zero scrutiny. We all know who the head of Meta or the chief executive officer of Google is, but can the Minister tell me who is in charge of MindGeek? This company dominates the market, yet it is almost completely anonymous—or at least the high heid yins of the company are.

New clause 38 seeks to identify pornography websites as providers of category 1 services, introduce a relevant code of practice and designate a specific regulator, in order to ensure compliance. Big porn must be made to stop hosting illegal extreme porn and the legal but harmful content prohibited by its own terms of service. However indifferent social media platforms may be to harms taking place on their sites, they pale in comparison with porn sites, which will do the absolute minimum that they can. To show the extent of the horrible searches allowed, one video found by CEASE was titled “Oriental slave girl tortured”. I will not read out some of the other titles in the report, but there are search terms that promote non-consensual activity, violence, incest and racial slurs. For example, “Ebony slave girl” is a permitted term. This is just one of many examples of the damaging content on porn sites, which perpetuates horrific sexual practices and, sadly, is too often viewed by children.

Over 80% of the UK public would support strict new porn laws; there is a real appetite among the public for them. The UK Government must not pass up this opportunity to regulate big porn, which is long overdue.

Barbara Keeley

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service. That is an important step towards making risk assessments publicly accessible, although Labour’s preference would be for them to be published in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard in previous debates about the awful circumstances of human content moderators, who are put under great pressure in low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can legislate, if it chooses to, to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

Division 72

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

15:15
New Clause 39
Safety duties protecting adults
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to summarise in the terms of service the findings of the most recent adults’ risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to adults).
(3) A duty to include provisions in the terms of service specifying, in relation to each kind of priority content that is harmful to adults that is to be treated in a way described in subsection (4), which of those kinds of treatment is to be applied.
(4) These are the kinds of treatment of content referred to in subsection (3)—
(a) taking down the content;
(b) restricting users’ access to the content.
(5) A duty to explain in the terms of service the provider’s response to the risks relating to priority content that is harmful to adults (as identified in the most recent adults’ risk assessment of the service), by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently in relation to content which the provider reasonably considers is priority content that is harmful to adults or a particular kind of priority content that is harmful to adults.
(7) If the provider of a service becomes aware of any non-designated content that is harmful to adults present on the service, a duty to notify OFCOM of—
(a) the kinds of such content identified, and
(b) the incidence of those kinds of content on the service.
(8) In this section—
‘adults’ risk assessment’ has the meaning given by section 12;
‘non-designated content that is harmful to adults’ means content that is harmful to adults other than priority content that is harmful to adults.”—(John Nicolson.)
This new clause applies safety duties protecting adults to regulated provider pornographic content.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 73

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 40
Duties to prevent users from encountering illegal content
“(1) This section sets out duties which apply in relation to internet services within section 67(2).
(2) A duty to operate an internet service using proportionate systems and processes designed to—
(a) prevent individuals from encountering priority illegal content that amounts to an offence in either Schedule 6 or paragraphs 17 and 18 of Schedule 7 by means of the service;
(b) minimise the length of time for which the priority illegal content referred to in paragraph (a) is present;
(c) where the provider is alerted by a person to the presence of the illegal content referred to in paragraph (a), or becomes aware of it in any other way, swiftly take down such content.
(3) A duty to operate systems and processes that—
(a) verify the identity and age of all persons depicted in the content;
(b) obtain and keep on record written consent from all persons depicted in the content;
(c) only permit content uploads from verified content providers and must have a robust process for verifying the age and identity of the content provider;
(d) all uploaded content must be reviewed before publication to ensure that the content is not illegal and does not otherwise violate its terms of service;
(e) uploaded content must not be marketed by content search terms that give the impression that the content contains child exploitation materials or the depiction of non-consensual activities;
(f) the service must offer the ability for any person depicted in the content to appeal to remove the content in question.”—(John Nicolson.)
This new clause applies duties to prevent users from encountering illegal content to regulated providers of pornographic content.
Brought up, and read the First time.
Question put, That the clause be read a Second time.

Division 74

Ayes: 6


Labour: 4
Scottish National Party: 2

Noes: 9


Conservative: 9

New Clause 41
Co-operation and disclosure of information: UK regulators
“(1) OFCOM may co-operate with a regulator established by statute or a recognised self-regulatory body in the United Kingdom, including by disclosing online safety information to that regulator, for the purposes of—
(a) tackling harm arising from illegal content, primary priority content harmful to children, priority content harmful to children, or priority content that is harmful to adults, or
(b) criminal investigations or proceedings relating to a matter to which the regulator’s functions relate.
(2) Where information is disclosed to a person in reliance on subsection (1), the person may not—
(a) use the information for a purpose other than the purpose for which it was disclosed, or
(b) further disclose the information, except with OFCOM’s consent (which may be general or specific) or in accordance with an order of a court or tribunal.
(3) A disclosure of information under subsection (1) does not breach—
(a) any obligation of confidence owed by the person making the disclosure, or
(b) any other restriction on the disclosure of information.”—(Alex Davies-Jones.)
This new clause would give Ofcom the power to co-operate with other regulators for the purposes of tackling harm from illegal content and criminal investigations and proceedings.
Brought up, and read the First time.
Alex Davies-Jones

I beg to move, That the clause be read a Second time.

The new clause would give Ofcom the power to co-operate with other regulators for the purposes of tackling harm from illegal content, and for criminal investigations and proceedings. The Minister will be aware that the vast range of human and business activity covered online presents a complex map of potential harms. Some harms will fall into or be adjacent to the purview of other regulators with domain-specific expertise. The relationship formalised through the Digital Regulation Cooperation Forum is well known. Indeed, Ofcom already has a working relationship with the Advertising Standards Authority and the Internet Watch Foundation, among others. Within this regulatory web, Ofcom will have the most relevant powers and expertise, so many regulators will look to it for help in tackling online safety issues. The Minister must recognise that public protection will most effectively be achieved through regulatory interlock. To protect people, Ofcom should be empowered to co-operate with others and to share information. The Bill should, therefore, as much as it can, enable Ofcom to work with other regulators and share online safety information with them.

Ofcom should also be able to bring the immense skills of other regulators into its work. The Bill gives Ofcom the general ability to co-operate with overseas regulators, but, with the exception of references to consulting the Information Commissioner’s Office when drawing up codes of practice and various items of guidance, the Bill is largely silent on co-operation with UK regulators.

The Communications Act 2003 limits the UK regulators with which Ofcom can share information—excluding the ICO, for instance—yet the Online Safety Bill takes a permissive approach to overseas regulators. The Bill should extend co-operation and information sharing in respect of online safety to include regulators overseeing the offences in schedule 7, the primary priority and priority harms to children, and the priority harms to adults.

Elsewhere in regulation, the Financial Conduct Authority has a general duty to co-operate. The same should apply here. Increasing safety through co-operation between relevant regulators is most easily achieved through our new clause, which will allow Ofcom to co-operate more widely. That is limited to co-operation in respect of harmful illegal content, harms to children and priority harms to adults. It is implicit that Ofcom will share information only with the regulators responsible for those precise matters. We have spoken frequently about the importance of co-operation, collaboration and consultation. This simple new clause would help to remedy the slight limitations placed on Ofcom in the Bill.

Ms Rees, with your permission, at this point—because this is likely to be my last contribution to the Bill Committee—[Interruption.] For shame. I place on record my sincere thanks to you and Sir Roger for chairing these Committee sittings, as well as all the Hansard staff, the Clerks, the Table Office, our civil servants, the Doorkeepers, the tech staff and broadcasting team who enable our proceedings to be broadcast to the public, and all members of the Committee for allowing great scrutiny of this legislation to take place. I look forward to continuing that scrutiny on Report.

The Chair

Thank you.

Barbara Keeley

I will take this opportunity, as my hon. Friend has done, to add a few words of thanks. She has already thanked all the people in this place whom we should be thanking, including the Clerks, who have done a remarkable job over the course of our deliberations with advice, drafting and support to the Chair. I also thank the stakeholder organisations. This Bill is uniquely one in which the stakeholders—the children’s charities and all those other organisations—have played an incredible part. I know from the meetings that they have already advertised that those organisations will continue to play that part over the coming weeks, up until Report. It has been fantastic.

Finally, I will mention two people who have done a remarkable amount of work: my researcher Iona and my hon. Friend’s researcher Freddie, who have done a huge amount to help us prepare speaking notes. It is a big task, because this is a complex Bill. I add my thanks to you, Ms Rees, for the way you have chaired this Committee. Please thank Sir Roger on our behalf as well.

Kirsty Blackman

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Chris Philp

I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?

Clause 98 amends section 393(2)(a) of the Communications Act 2003, which allows Ofcom to disclose information and co-operate with other regulators. That amendment widens the scope of the provision to include the carrying out of Ofcom’s functions under the Bill.

The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.

I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.

I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. I think it is fair to say that we have given the Bill thorough consideration, and of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.

The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between the use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.

I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.

I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the Bill’s passage. Their engagement has been important and very welcome: it has brought external views into Parliament, which really matters.

I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.

Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.

The Chair

Before I ask Alex Davies-Jones whether she wishes to press the new clause to a vote, I thank you all for the very respectful way in which you have conducted proceedings. It is much appreciated. Let me say on behalf of Sir Roger and myself that it has been an absolute privilege to co-chair this Bill Committee.

Dame Maria Miller (Basingstoke) (Con)

On a point of order, Ms Rees. On behalf of the Back Benchers, I thank you and Sir Roger for your excellent chairpersonships, and the Minister and shadow Ministers for the very courteous way in which proceedings have taken place. It has been a great pleasure to be a member of the Bill Committee.

Alex Davies-Jones

I am content with the Minister’s assurance that the provisions of new clause 41 are covered in the Bill, and therefore do not wish to press it to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Schedule 2

Recovery of OFCOM’s initial costs

Recovery of initial costs

1 (1) This Schedule concerns the recovery by OFCOM of an amount equal to the aggregate of the amounts of WTA receipts which, in accordance with section 401(1) of the Communications Act and OFCOM’s statement under that section, are retained by OFCOM for the purpose of meeting their initial costs.

(2) OFCOM must seek to recover the amount described in sub-paragraph (1) (“the total amount of OFCOM’s initial costs”) by charging providers of regulated services fees under this Schedule (“additional fees”).

(3) In this Schedule—

“initial costs” means the costs incurred by OFCOM before the day on which section 75 comes into force on preparations for the exercise of their online safety functions;

“WTA receipts” means the amounts described in section 401(1)(a) of the Communications Act which are paid to OFCOM (certain receipts under the Wireless Telegraphy Act 2006).

Recovery of initial costs: first phase

2 (1) The first phase of OFCOM’s recovery of their initial costs is to take place over a period of several charging years to be specified in regulations under paragraph 7 (“specified charging years”).

(2) Over that period OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the total amount of OFCOM’s initial costs.

(3) OFCOM may not charge providers additional fees in respect of any charging year which falls before the first specified charging year.

(4) OFCOM may require a provider to pay an additional fee in respect of a charging year only if the provider is required to pay a fee in respect of that year under section 71 (and references in this Schedule to charging providers are to be read accordingly).

(5) The amount of an additional fee payable by a provider is to be calculated in accordance with regulations under paragraph 7.

Further recovery of initial costs

3 (1) The second phase of OFCOM’s recovery of their initial costs begins after the end of the last of the specified charging years.

(2) As soon as reasonably practicable after the end of the last of the specified charging years, OFCOM must publish a statement specifying—

(a) the amount which is at that time the recoverable amount (see paragraph 6), and

(b) the amounts of the variables involved in the calculation of the recoverable amount.

(3) OFCOM’s statement must also specify the amount which is equal to that portion of the recoverable amount which is not likely to be paid or recovered. The amount so specified is referred to in sub-paragraphs (4) and (5) as “the outstanding amount”.

(4) Unless a determination is made as mentioned in sub-paragraph (5), OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the outstanding amount.

(5) The Secretary of State may, as soon as reasonably practicable after the publication of OFCOM’s statement, make a determination specifying an amount by which the outstanding amount is to be reduced, and in that case OFCOM must, in aggregate, charge providers of regulated services additional fees of an amount equal to the difference between the outstanding amount and the amount specified in the determination.

(6) Additional fees mentioned in sub-paragraph (4) or (5) must be charged in respect of the charging year immediately following the last of the specified charging years (“year 1”).

(7) The process set out in sub-paragraphs (2) to (6) is to be repeated in successive charging years, applying those sub-paragraphs as if—

(a) in sub-paragraph (2), the reference to the end of the last of the specified charging years were to the end of year 1 (and so on for successive charging years);

(b) in sub-paragraph (6), the reference to year 1 were to the charging year immediately following year 1 (and so on for successive charging years).

(8) Any determination by the Secretary of State under this paragraph must be published in such manner as the Secretary of State considers appropriate.

(9) Sub-paragraphs (4) and (5) of paragraph 2 apply to the charging of additional fees under this paragraph as they apply to the charging of additional fees under that paragraph.

(10) The process set out in this paragraph comes to an end in accordance with paragraph 4.

End of the recovery process

4 (1) The process set out in paragraph 3 comes to an end if a statement by OFCOM under that paragraph records that—

(a) the recoverable amount is nil, or

(b) all of the recoverable amount is likely to be paid or recovered.

(2) Or the Secretary of State may bring that process to an end by making a determination that OFCOM are not to embark on another round of charging providers of regulated services additional fees.

(3) The earliest time when such a determination may be made is after the publication of OFCOM’s first statement under paragraph 3.

(4) A determination under sub-paragraph (2)—

(a) must be made as soon as reasonably practicable after the publication of a statement by OFCOM under paragraph 3;

(b) must be published in such manner as the Secretary of State considers appropriate.

(5) A determination under sub-paragraph (2) does not affect OFCOM’s power—

(a) to bring proceedings for the recovery of the whole or part of an additional fee for which a provider became liable at any time before the determination was made, or

(b) to act in accordance with the procedure set out in section 120 in relation to such a liability.

Providers for part of a year only

5 (1) For the purposes of this Schedule, the “provider” of a regulated service, in relation to a charging year, includes a person who is the provider of the service for part of the year.

(2) Where a person is the provider of a regulated service for part of a charging year only, OFCOM may refund all or part of an additional fee paid to OFCOM under paragraph 2 or 3 by that provider in respect of that year.

Calculation of the recoverable amount

6 For the purposes of a statement by OFCOM under paragraph 3, the “recoverable amount” is given by the formula—

C − (F − R) − D

where—

C is the total amount of OFCOM’s initial costs,

F is the aggregate amount of the additional fees received by OFCOM at the time of the statement in question,

R is the aggregate amount of the additional fees received by OFCOM that at the time of the statement in question have been, or are due to be, refunded (see paragraph 5(2)), and

D is the amount specified in a determination made by the Secretary of State under paragraph 3 (see paragraph 3(5)) at a time before the statement in question or, where more than one such determination has been made, the sum of the amounts specified in those determinations.

If no such determination has been made before the statement in question, D = 0.

Regulations about recovery of initial costs

7 (1) The Secretary of State must make regulations making such provision as the Secretary of State considers appropriate in connection with the recovery by OFCOM of their initial costs.

(2) The regulations must include provision as set out in sub-paragraphs (3), (4) and (6).

(3) The regulations must specify the total amount of OFCOM’s initial costs.

(4) For the purposes of paragraph 2, the regulations must specify—

(a) the charging years in respect of which additional fees are to be charged, and

(b) the proportion of the total amount of initial costs which OFCOM must seek to recover in each of the specified charging years.

(5) The following rules apply to provision made in accordance with sub-paragraph (4)(a)—

(a) the initial charging year may not be specified;

(b) only consecutive charging years may be specified;

(c) at least three charging years must be specified;

(d) no more than five charging years may be specified.

(6) The regulations must specify the computation model that OFCOM must use to calculate fees payable by individual providers of regulated services under paragraphs 2 and 3 (and that computation model may be different for different charging years).

(7) The regulations may make provision about what OFCOM may or must do if the operation of this Schedule results in them recovering more than the total amount of their initial costs.

(8) The regulations may amend this Schedule or provide for its application with modifications in particular cases.

(9) Before making regulations under this paragraph, the Secretary of State must consult—

(a) OFCOM,

(b) providers of regulated user-to-user services,

(c) providers of regulated search services,

(d) providers of internet services within section 67(2), and

(e) such other persons as the Secretary of State considers appropriate.

Interpretation

8 In this Schedule—

“additional fees” means fees chargeable under this Schedule in respect of the recovery of OFCOM’s initial costs;

“charging year” has the meaning given by section 76;

“initial charging year” has the meaning given by section 76;

“initial costs” has the meaning given by paragraph 1(3), and the “total amount” of initial costs means the amount described in paragraph 1(1);

“recoverable amount” has the meaning given by paragraph 6;

“specified charging year” means a charging year specified in regulations under paragraph 7 for the purposes of paragraph 2.” —(Chris Philp.)

This new Schedule requires Ofcom to seek to recover the costs that they incurred (before clause 75 comes into force) in preparing to take on functions as the regulator of services under the Bill, by charging fees to providers of services.

Brought up, read the First and Second time, and added to the Bill.

The Chair

New schedule 1 was tabled by Carla Lockhart, who is not on the Committee. Does any Member wish to move new schedule 1? No.

We now come to the final Question in the proceedings. The Committee has finished its work.

Bill, as amended, to be reported.

15:32
Committee rose.
Written evidence reported to the House
OSB89 Mental Health Foundation
OSB90 CEASE UK
OSB91 Amazon UK
OSB92 Demos (supplementary submission)
OSB93 Dave ‘Yardfish’
OSB94 Sam Guinness
OSB95 M. Jenny Edwards, Criminologist and international subject matter expert (SME), Chandler Edwards
OSB96 Domestic Abuse Commissioner
OSB97 The football authorities (Kick It Out, The FA, The Premier League, EFL, Women’s Super League, Women’s Championship, National League, Isthmian League, Southern League, Northern Premier League, Professional Footballers Association, League Managers’ Association, Professional Game Match Officials, and Women in Football) (joint submission)
OSB98 Suzy Lamplugh Trust
OSB99 Liberty
OSB100 Ibrahim Chaudry