Online Safety Bill (Seventeenth sitting) Debate

Alex Davies-Jones (Pontypridd) (Lab)

Before we adjourned, I was discussing the Government’s national artificial intelligence strategy and the two separate consultations launched by the Government to look at the intellectual property system in relation to AI. In those consultations, the Intellectual Property Office recognised that AI

“is playing an increasing role in...artistic creativity.”

However, specific questions about reviewing or enhancing performers’ rights were notably absent from both Government consultations. If the UK Government really want to make Britain a global AI and creative superpower, strengthening the rights of performers and other creatives must be at the heart of the national AI strategy.

Another key challenge is that our intellectual property framework is desperately out of date. Currently, performers have two sets of rights under the Copyright, Designs and Patents Act 1988: the right to consent to the making of a recording of a performance; and the right to control the subsequent use of such recordings, such as the right to make copies. However, as highlighted by Dr Mathilde Pavis, senior lecturer in law at the University of Exeter, AI-made performance synthetisation challenges our intellectual property framework because it reproduces performances without generating a recording or a copy, and therefore falls outside the scope of the Act. An unintended consequence is that people are left vulnerable to abuse and exploitation. Without effective checks and balances put in place by the Government, that will continue. That is why 93% of Equity members responding to a recent survey stated that the Government should introduce a new legal protection for performers, so that a performance cannot be reproduced by AI technology without the performer’s consent.

Advances in AI, including deepfake technology, have reinforced the urgent need to introduce image rights—also known as personality rights or publicity rights. That refers to

“the expression of a personality in the public domain”,

such as an individual’s name, likeness or other personal indicators. Provision of image rights in law enables performers to safeguard meaningful income streams, and to defend their artistic integrity, career choices, brand and reputation. More broadly, for society, it is an important tool for protecting privacy and allowing an individual to object to the use of their image without consent.

In the UK, there is no codified law of image rights or privacy. Instead, we have a patchwork of statutory and common-law causes of action, which an individual can use to protect various aspects of their image and personality. However, none of that is fit for purpose. Legal provision for image rights can be found around the world, so the Government here can and should do more. For example, some American states recognise the right through statute, and others through common law. California has both statutory and common-law strains of authority, which protect slightly different forms of the right.

The Celebrities Rights Act of 1985 was passed in California and extended the personality rights for a celebrity to 70 years after their death. In 2020, New York State passed a Bill that recognised rights of publicity for “deceased performers” and “deceased personalities”. Guernsey has created a statutory regime under which image rights can be registered. The legislation centres on the legal concept of a “personnage”— the person or character behind a personality that is registered. The image right becomes a property right capable of protection under the legislation through registration, which enables the image right to be protected, licensed and assigned.

The Minister will know that Equity is doing incredible work to highlight the genuine impact that this type of technology is having on our creative industry and our performers. He must therefore see the sense in our new clause, which would require the Government at least to consider the matter of synthetic media content, which thus far they have utterly failed to do.

The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport (Chris Philp)

It is a pleasure to serve under your chairmanship again, Ms Rees. I thank the shadow Minister, the hon. Member for Pontypridd, for raising the issues that she has about synthetic and digitally manipulated content, of which we are very conscious. We are conscious of the risk of harm to those who work in the entertainment industry and, of course, in particular to victims of deepfake pornography.

We take intellectual property infringement extremely seriously. The Government have recently published a counter-infringement strategy, setting out a range of steps that we intend to take to strengthen the whole-system approach to tackling infringement of intellectual property rights. It is widely acknowledged that the United Kingdom's intellectual property framework is genuinely world leading, and that includes strong protections for performers' rights. We intend that to continue. However, we are not complacent, and the law is kept under review, not least via the counter-infringement strategy I mentioned a moment ago.

Harmful synthetic media content, including the deepfakes that the hon. Member for Pontypridd mentioned, is robustly addressed by the safety duties set out in the Bill in relation to illegal content—much deepfake content, if it involves creating an image of someone, would be illegal—as well as content that could be harmful to children and content that will be on the “legal but harmful” adult list. Those duties will tackle the most serious and illegal forms of deepfake and will rightly cover certain threats that undermine our democracy. For example, a manipulated media image that contained incitement to violence, such as a deepfake of a politician telling people to attack poll workers because they are rigging an election, would obviously already fall foul of the Bill under the illegal duties.

In terms of reporting and codes of practice, the Bill already requires Ofcom to produce codes of practice setting out the ways in which providers can take steps to reduce the harm arising from illegal and harmful content, which could include synthetic media content such as deepfakes where those contain illegal content.

Alex Davies-Jones

The Minister uses the example of a deepfake of a politician inciting people to attack poll workers during an election. Given that some of the technology is so advanced that it is really difficult to spot when deepfakes actually occur, could it be argued that Ofcom as regulator, or even the platforms themselves, would be averse to removing or reporting the content, as it could fall foul of the democratic content exemption in the Bill?

Chris Philp

The democratic content protection that the shadow Minister refers to, in clause 15, is not an exemption; it is a duty to take into account content of democratic importance. That is on line 34 of page 14. When making a decision, it has to be taken into account—it is not determinative; it is not as if a politician or somebody involved in an election gets a free pass to say whatever they like, even if it is illegal, and escapes the provisions of the Bill entirely. The platform simply has to take it into account. If it was a deepfake image that was saying such a thing, the balancing consideration in clause 15 would not even apply, because the protection applies to content of democratic importance, not to content being produced by a fake image of a politician.

Alex Davies-Jones

It is important that we get this right. One of our concerns about clause 15, which we have previously discussed, relates to this discussion of deepfakes, particularly of politicians, and to timeframes. I understand the Minister's point on illegal content. If there is a deepfake of a politician—on the eve of poll, for example—widely spreading disinformation or misinformation on a platform, how can the Minister confidently say that it would be taken seriously, in a timely manner? That could have direct implications for a poll or an election. Would the social media companies have the confidence to take that content down, given clause 15?

--- Later in debate ---
Chris Philp

The protections in clause 15—they are not exemptions—would apply only to content that is of bona fide, genuine democratic importance. Obviously, a deepfake of a politician would not count as genuine democratic content, because it is fake. If it were a real politician, such as the hon. Lady, it would benefit from that consideration. If it were a fake, it would not, because it would not be genuine content of democratic importance.

It is also worth saying that if—well, I hope when—our work with the Law Commission to review the criminal law related to the non-consensual taking and sharing of intimate images is taken forward, that will then flow into the duties in the Bill. Deepfakes of intimate images are rightly a concern of many people. That work would fall into the ambit of the Bill, either via clause 52, which points to illegal acts where there is an individual victim, or via schedule 7, if a new intimate image abuse offence were added to schedule 7 as a priority offence. There are a number of ways in which deepfakes could fall into the ambit of the Bill, including if they relate to extreme pornography.

The new clause would require the production of a report, not a change to the substantive duties in the Bill. It is worth saying that the Bill already provides Ofcom with powers to produce and publish reports regarding online safety matters. Those powers are set out in clause 137. The Bill will ensure that Ofcom has access to the information required to prepare those reports, including information from providers about the harm caused by deepfakes and how companies tackle the issue. We debated that extensively this morning when we talked about the strong powers that already exist under clause 85.

The hon. Lady has raised important points about intellectual property, and I have pointed to our counter-infringement strategy. She raised important points about deepfakes both in a political context and in the context of especially intimate images being generated by AI. I hope I have set out how the Bill addresses concerns in those areas. The Bill as drafted addresses those important issues in a way that is certainly adequate.

Alex Davies-Jones

I welcome the Minister’s comments and I am grateful for his reassurance on some of the concerns that were raised. At this stage we will not press the matter to a vote. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 27

OFCOM: power to impose duties on regulated services

“OFCOM: power to impose duties on regulated services

(1) OFCOM may carry out an assessment of the risk of harm posed by any regulated service.

(2) Where OFCOM assess a service to pose a very high risk of harm, OFCOM may, notwithstanding the categorisation of the service or the number or profile of its users, impose upon the service duties equivalent to—

(a) the children’s risk assessment duties set out in sections 10 and 25 of this Act; and

(b) the safety duties protecting children set out in sections 11 and 26 of this Act.”—(Kirsty Blackman.)

This new clause enables Ofcom to impose on any regulated service duties equivalent to the children’s risk assessment duties and the safety duties protecting children.

Brought up, and read the First time.

--- Later in debate ---
Chris Philp

I thank the hon. Member for Aberdeen North for raising those considerations, because protecting children is clearly one of the most important things that the Bill will do. The first point that it is worth drawing to the Committee’s attention again is the fact that all companies, regardless of the number of child users they may have, including zero child users, have duties to address illegal content where it affects children. That includes child sexual exploitation and abuse content, and illegal suicide content. Those protections for the things that would concern us the most—those illegal things—apply to companies regardless of their size. It is important to keep that in mind as we consider those questions.

It is also worth keeping in mind that we have designed the provisions in clause 31 to be a bit flexible. The child user condition, which is in clause 31(3) on page 31 of the Bill, sets out two tests, one of which must be satisfied for the condition to be met. The condition is met if

“there is a significant number of children who are users of the service…or…the service…is of a kind likely to attract a significant number of users who are children.”

When we debated the issue previously, we clarified that the word “user” did not mean that they had to be a registered user; they could be somebody who just stumbles across it by accident or who goes to it intentionally, but without actually registering. We have built in a certain amount of flexibility through the word “likely”. That helps a little bit. We expect that where a service poses a very high risk of harm to children, it is likely to meet the test, as children could be attracted to it—it might meet the “likely to attract” test.

New clause 27 would introduce the possibility that even when there were no children on the service and no children were ever likely to use it, the duties would be engaged—these duties are obviously in relation to content that is not illegal; the illegal stuff is covered already elsewhere. There is a question about proportionality that we should bear in mind as we think about this. I will be resisting the new clause on that basis.

However, as the hon. Member for Aberdeen North said, I have hinted or more than hinted to the Committee previously that we have heard the point that has been made—it was made in the context of adults, but applies equally to children here—that there is a category of sites that might have small numbers of users but none the less pose a high risk of harm, not harm that is illegal, because the “illegal” provision applies to everybody already, but harm that falls below the threshold of illegality. On that area, we heard hon. Members’ comments on Second Reading. We have heard what members of the Committee have had to say on that topic as well. I hope that if I say that that is something that we are reflecting on very carefully, the hon. Member for Aberdeen North will understand that those comments have been loudly heard by the Government. I hope that I have explained why I do not think new clause 27 quite works, but the point is understood.

Kirsty Blackman

I appreciate the Minister’s comments, but in the drafting of the new clause, we have said that Ofcom “may” impose these duties. I would trust the regulator enough not to impose the child safety duties on a site that literally has no children on it and that children have no ability to access. I would give the regulator greater credit than the Minister did, perhaps accidentally, in his comments. If it were up to Ofcom to make that decision and it had the power to do so where it deemed that appropriate, it would be most appropriate for the regulator to have the duty to make the decision.

I wish to press the new clause to a Division.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley

Labour argued in favour of greater empowerment provisions for children during the debate on new clause 3, which would have brought in a user advocacy body for children. YoungMinds has pointed out that many young people are unaware of the Bill, and there has been little engagement with children regarding its design. I am sure members of the Committee would agree that the complexity of the Bill is evidence enough of that.

New clause 28 would make the online world more accessible for children and increase their control over the content they see. We know that many children use category 1 services, so they should be entitled to the same control over harmful content as adults. As such, Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North for her, as ever, thoughtful comments on the new clause. She has already referred to the user empowerment duties for adults set out in clause 57, and is right to say that those apply only to adults, as is made clear in the very first line of subsection (1) near the bottom of page 52.

As always, the hon. Lady’s analysis of the Bill is correct: the aim of those empowerment duties is to give adults more control over the content they see and the people with whom they interact online. One of the reasons why those empowerment duties have been crafted specifically for adults is that, as we discussed in a freedom of expression context, the Bill does not ultimately censor free speech regarding content that is legal but potentially harmful. Platforms can continue to display that information if their policies allow, so we felt it was right to give adults more choice over whose content they see, given that it could include content that is harmful but falls on the right side of the legal threshold.

As Members would expect, the provisions of the Bill in relation to children are very different from the provisions for adults. There are already specific provisions in the Bill that relate to children, requiring all social media companies whose platforms are likely to be accessed by children—not just the big ones—to undertake comprehensive risk assessments and protect children from any kind of harmful activity. If we refer to the children's risk assessment duties in clause 10, and specifically clause 10(6)(e), we see that those risk assessments include an assessment looking at the content that children will encounter and—critically—who they might encounter online, including adults.

To cut to the chase and explain why user empowerment has been applied to adults but not children, the view was taken that children are already protected a lot more than adults through the child risk assessment duties and child safety duties. Therefore, they do not need the user empowerment provisions, because all of them—regardless of whether they choose to be verified—are already protected from harmful content by the much stronger provisions in the Bill relating to children. That is why it was crafted as it is.

--- Later in debate ---
Kirsty Blackman

It does make sense, and I do understand what the Minister is talking about in relation to clause 10 and the subsections that he mentioned. However, that only sets out what the platforms must take into account in their child risk assessments.

If we are talking about 15-year-olds, they are empowered in their lives to make many decisions on their own behalf, as well as decisions guided by parents or parental decisions taken for them. We are again doing our children a disservice by failing to allow young people the ability to opt out—the ability to choose not to receive certain content. Having a requirement to include whether or not these functionalities exist in a risk assessment is very different from giving children and young people the option to choose, and to decide what they do—and especially do not—want to see on whichever platform they are interacting on.

I have previously mentioned the fact that if a young person is on Roblox, or some of those other platforms, it is difficult for them to interact only with people who are on their friends list. It is difficult for that young person to exclude adult users from contacting them. A lot of young people want to exclude content, comments or voice messages from people they do not know. They want to go on the internet and have fun and enjoy themselves without the risk of being sent an inappropriate message or photo and having to deal with those things. If they could choose those empowerment functions, that just eliminates the risk and they can make that choice.

Chris Philp

Could I develop the point I was making earlier on how the Bill currently protects children? Clause 11, which is on page 10, is on safety duties for children—what the companies have to do to protect children. One thing that they may be required by Ofcom to do, as mentioned in subsection (4)(f), is create

“functionalities allowing for control over content that is encountered, especially by children”.

Therefore, there is a facility to require the platforms to create the kind of functionalities which, as that subsection is drafted, relate not just to identity but to the kind of content being displayed. Does that go some way towards addressing the hon. Lady's concern?

Kirsty Blackman

That is very helpful. I am glad that the Minister is making clear that he thinks that Ofcom will not just be ignoring this issue because the Bill is written to allow user empowerment functions only for adults.

I hope the fact that the Minister kindly raised clause 11(4) will mean that people can see its importance, and that Ofcom will understand that it should give consideration to it, because that list of things could have just been lost in the morass of the many, many lists of things in the Bill. I am hoping that the Minister's comments will go some way on that. Notwithstanding that, I will press the new clause to a vote.

Question put, That the clause be read a Second time.

--- Later in debate ---
Barbara Keeley

I rise to speak in support of the new clause. Too often people with learning disabilities are left out of discussions about provisions relevant to them. People with learning disabilities are disproportionately affected by online harms and can receive awful abuse online.

At the same time, Mencap has argued that social media platforms enable people with learning disabilities to develop positive friendships and relationships. It is therefore even more important that people with learning disabilities do not lose out on the features described in clause 14, which allow them to control the content to which they are exposed. It is welcome that clauses 17, 18, 27 and 28 specify that reporting and complaints procedures must be easy to access and use.

The Bill, however, should go further to ensure that the duties on complaints and reporting explicitly cater to adults with learning disabilities. In the case of clause 14 on user empowerment functions, it must be made much clearer that those functions are easy to access and use. The new clause would be an important step towards ensuring that the Bill benefits everyone who experiences harms online, including people with learning disabilities. Labour supports the new clause.

Chris Philp

I thank the hon. Member for Aberdeen North once again for the thoughtfulness with which she has moved her new clause. To speak first to the existing references to accessibility in the Bill, let me start with user empowerment in clause 14.

Clause 14(4) makes it clear that the features included in a service in compliance with the duty in this clause must be made available to all adult users. I stress “all” because, by definition, that includes people with learning disabilities or others with characteristics that mean they may require assistance. When it comes to content reporting duties, clause 17(2)—line 6 of page 17—states that it has to be easy for any “affected persons” to report the content. They may be people who are disabled or have a learning difficulty or anything else. Clause 17(6)(d) further makes it clear that adults who are “providing assistance” to another adult are able to raise content reporting issues.

There are references in the Bill to being easy to report and to one adult assisting another. Furthermore, clause 18(2)(c), on page 18, states that the complaints system has to be

“easy to use (including by children)”.

It also makes it clear through the definition of “affected person”, which we have spoken about, that an adult assisting another adult is allowed to make a complaint on behalf of the second adult. Those things have been built into the structure of the Bill.

Furthermore, to answer the question from the hon. Member for Aberdeen North, I am happy to put on record that Ofcom, as a public body, is subject to the public sector equality duty, so by law it must take into account the ways in which people with certain characteristics, such as learning disabilities, may be impacted when performing its duties, including writing the codes of practice for user empowerment, redress and complaints duties. I can confirm, as the hon. Member requested, that Ofcom, when drafting its codes of practice, will have to take accessibility into account. It is not just a question of my confirming that to the Committee; it is a statutory duty under the Equality Act 2010 and the public sector equality duty that flows from it.

I hope that the words of the Bill, combined with that statutory public sector equality duty, make it clear that the objectives of new clause 29 are met.

Barbara Keeley

The Minister mentioned learning difficulties. That is not what we are talking about. Learning difficulties are things such as dyslexia and attention deficit hyperactivity disorder. Learning disabilities are lifelong intellectual impairments and very different things—that is what we are talking about.

Chris Philp

I am very happy to accept the shadow Minister’s clarification. The way that clauses 14, 17 and 18 are drafted, and the public sector equality duty, include the groups of people she referred to, but I am happy to acknowledge and accept her clarification.

Barbara Keeley

That is fine, but I have a further point to make. The new clause would be very important to all those people who support people with learning disabilities. So many of the services that people use do not take account of people's learning disabilities. I have done a huge amount of work over the years to try to support people with learning disabilities. This is a very important issue to me.

There are all kinds of good examples, such as easy-read versions of documents, but the Minister said when batting back this important new clause that the expression “all adult users” includes people with learning disabilities. That is not the case. He may not have worked with a lot of people with learning disabilities, but they are excluded from an awful lot. That is why I support making that clear in the Bill.

We on the Opposition Benches say repeatedly that some things are not included by an all-encompassing grouping. That is certainly the case here. Some things need to be said for themselves, such as violence against women and girls. That is why this is an excellent new clause that we support.

--- Later in debate ---
Alex Davies-Jones

It is an honour to support the new clause moved by the hon. Member for Aberdeen North. This was a recommendation from the Joint Committee report, and we believe it is important, given the sheer complexity of the Bill. The Minister will not be alarmed to hear that I am all in favour of increasing the scrutiny and transparency of this legislation.

Having proudly served on the DCMS Committee, I know it does some excellent work on a very broad range of policy areas, as has been highlighted. It is important to acknowledge that there will of course be cross-over, but ultimately we support the new clause. Given my very fond memories of serving on the Select Committee, I want to put on the record my support for that Committee; my support for this new clause is not meant as any disrespect to it. The Committee is genuinely extremely effective in scrutinising the Government and holding them to account, and I know it will continue to do that in relation to both this Bill and other aspects of DCMS. The need for transparency, openness and scrutiny of this Bill is fundamental if it is truly to be world-leading, which is why we support the new clause.

Chris Philp

I am grateful for the opportunity to discuss this issue once again. I want to put on the record my thanks to the Joint Committee, which the hon. Member for Ochil and South Perthshire sat on, for doing such fantastic work in scrutinising the draft legislation. As a result of its work, no fewer than 66 changes were made to the Bill, so it was very effective.

I want to make one or two observations about scrutinising the legislation following the passage of the Bill. First, there is the standard review mechanism in clause 149, on pages 125 and 126, which provides for a statutory review not before two years and not after five years of the Bill receiving Royal Assent.

Kirsty Blackman

On that review function, it would help if the Minister could explain a bit more why it was decided to do that as a one-off, and not on a rolling two-year basis, for example.

Chris Philp

That is a fairly standard clause in legislation. Clearly, for most legislation and most areas of Government activity, the relevant departmental Select Committee would be expected to provide the ongoing scrutiny, so ordinarily the DCMS Committee would do that. I hear the shadow Minister’s comments: she said that this proposal is not designed in any way to impugn or disrespect that Committee, but I listened to the comments of the Chair of that Committee on Second Reading, and I am not sure he entirely shares that view—he expressed himself in quite forthright terms.

On the proposal, we understand that the Joint Committee did valuable work. This is an unusual piece of legislation, in that it is completely groundbreaking. It is unlike any other, so the case for having a particular Committee look at it may have some merit. I am not in a position to give a definitive Government response to that, because the matter is still under consideration, but if we were to establish a special Committee to look at a single piece of legislation, there are two ways to do it. It could be done either in statute, as the new clause seeks, or by Standing Orders.

Generally speaking, it is the practice of the House to establish Committees by Standing Orders of the House rather than by statute. In fact, I think the only current Committee of the House established by statute—Ms Rees, you will correct me if I am wrong, as you are more of an expert on these matters than me—is the Intelligence and Security Committee, which was established by the Intelligence Services Act 1994. That is obviously very unusual, because it has special powers. It looks into material that would ordinarily be classified as secret, and it has access to the intelligence services. It is a rather unusual Committee that has to be granted special powers because it looks into intelligence and security matters. Clearly, those considerations do not apply here. Were a particular Committee to be established, the right way of doing that would not be in statute, as the new clause proposes, but via the Standing Orders of the House, if that is something that Parliament wants to do.

Dean Russell (Watford) (Con)

As another member of the Joint Committee, I totally understand the reasoning. I want to put on record my support for setting up a Committee through the approach the Minister mentioned, via Standing Orders. I will not support the new clause, but I strongly support the Joint Committee continuing in some form to enable scrutiny. When we look forward to the metaverse, virtual reality and all the things that are coming, it is important that that scrutiny continues. No offence to Opposition colleagues, but I do not think the new clause is the right way to do that. However, the subject is worth further exploration, and I would be very supportive of that happening.

Chris Philp

First, let me also put on record my thanks to my hon. Friend for his service on the Joint Committee. He did a fantastic job and, as I said, the Committee’s recommendations have been powerfully heard. I thank him for his acknowledgment that if one were to do this, the right way to do it would be through Standing Orders. I have heard the point he made in support of some sort of ongoing special committee. As I say, the Government have not reached a view on this, but if one were to do that, I agree with my hon. Friend that Standing Orders would be the right mechanism.

One of the reasons for that can be found in the way the new clause has been drafted. Subsections (5) and (6) say:

“The membership and Chair of the Committee shall be appointed by regulations made by the Secretary of State…the tenure of office of members of, the procedure of and other matters…shall be set out in regulations made by the Secretary of State.”

I know those regulations are then subject to approval by a resolution of the House, but given the reservations expressed by Opposition Members about powers for the Secretary of State over the last eight sitting days, it is surprising to see the new clause handing the Secretary of State—in the form of a regulation-making power—the power to form the Committee.

That underlines why doing this through Standing Orders, so that the matter is in the hands of the whole House, is the right way to proceed, if that is something we collectively wish to do. For that reason, we will not support the new clause. Obviously, we will get back to the House in due course once thinking has been done about potential Committees, but that can be done as a separate process to the legislation. In any case, post-legislative scrutiny will not be needed until the regime is up and running, which will be after Royal Assent, so that does not have enormous time pressure on it.

A comment was made about future-proofing the Bill and making sure it stays up to date. There is a lot in that, and we need to make sure we keep up to date with changing technologies, but the Bill is designed to be tech agnostic, so if there is change in technology, that is accommodated by the Bill because the duties are not specific to any given technology. A good example is the metaverse. That was not conceived or invented prior to the Bill being drafted; none the less, it is captured by the Bill. The architecture of the Bill, relying on codes of practice produced by Ofcom, is designed to ensure flexibility so that the codes of practice can be kept up to date. I just wanted to make those two points in passing, as the issue was raised by the hon. Member for Aberdeen North.

Kirsty Blackman

The new clause is drafted in that way because I wanted to recognise the work of the Joint Committee and to take on board its recommendations. If it had been entirely my drafting, the House of Lords would certainly not have been involved, given that I am not the biggest fan of the House of Lords, as its Members are not elected. However, the decision was made to submit the new clause as drafted.

The Minister has said that the Government have not come to a settled view yet, which I am taking as the Minister not saying no. He is not standing up and saying, “No, we will definitely not have a Standing Committee.” I am not suggesting he is saying yes, but given that he is not saying no, I am happy to withdraw the new clause. If the Minister is keen to come forward at a future stage with suggestions for changes to Standing Orders, which I understand have to be introduced by the Leader of the House or the Cabinet Office, then they would be gladly heard on this side of the House. I beg to ask leave to withdraw the motion.

Clause, by leave, withdrawn.

New Clause 38

Adults’ risk assessment duties

“(1) This section sets out duties which apply in relation to internet services within section 67(2).

(2) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM makes any significant change to a risk profile that relates to services of the kind in question.

(3) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.

(4) A duty to make and keep a written record, in an easily understandable form, of every risk assessment under subsections (2) and (3).

(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed).

(6) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—

(a) the user base;

(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of priority content that is harmful to adults (with each kind separately assessed), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;

(c) the level of risk of harm to adults presented by different kinds of priority content that is harmful to adults;

(d) the level of risk of harm to adults presented by priority content that is harmful to adults which particularly affects individuals with a certain characteristic or members of a certain group;

(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults from the matters identified in accordance with paragraphs (b) to (f);

(h) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.

(7) In this section references to risk profiles are to the risk profiles for the time being published under section 83 which relate to the risk of harm to adults presented by priority content that is harmful to adults.

(8) The provisions of Schedule 3 apply to any assessment carried out under this section in the same way as they apply to any assessment relating to a Part 3 service.”—(John Nicolson.)

This new clause applies adults’ risk assessment duties to pornographic sites.

Brought up, and read the First time.

--- Later in debate ---
Barbara Keeley

As we heard from the hon. Member for Ochil and South Perthshire, new clauses 38 to 40 would align the duties on pornographic content so that both user-to-user sites and published pornography sites are subject to robust duties that are relevant to the service. Charities have expressed concerns that many pornography sites might slip through the net because their content does not fall under the definition of “pornographic content” in clause 66. The new clauses aim to address that. They are based on the duties placed on category 1 services, but they recognise the unique harms that can be caused by pornographic content providers, some of which the hon. Member graphically described with the titles that he gave. The new clauses also contain some important new duties that are not currently in the Bill, including the transparency arrangements in new clause 39 and important safeguards in new clause 40.

The Opposition have argued time and again for publishing duties when it comes to risk assessments. New clause 39 would introduce a duty to summarise in the terms of service the findings of the most recent adults' risk assessment of a service. That is an important step towards making risk assessments publicly accessible, although Labour's preference would be for them to be published publicly and in full, as I argued in the debate on new clause 9, which addressed category 1 service risk assessments.

New clause 40 would introduce measures to prevent the upload of illegal content, such as by allowing content uploads only from verified content providers, and by requiring all uploaded content to be reviewed. If the latter duty were accepted, there would need to be proper training and support for any human content moderators. We have heard during previous debates about the awful circumstances of human content moderators. They are put under such pressure for that low-paid work, and we do not want to encourage that.

New clause 40 would also provide protections for those featured in such content, including the need for written consent and identity and age verification. Those are important safeguards that the Labour party supports. I hope the Minister will consider them.

Chris Philp

I thank the hon. Member for Ochil and South Perthshire for raising these issues with the Committee. It is important first to make it clear that websites providing user-to-user services are covered in part 3 of the Bill, under which they are obliged to protect children and prevent illegal content, including some forms of extreme pornography, from circulating. Such websites are also obliged to prevent children from accessing those services. For user-to-user sites, those matters are all comprehensively covered in part 3.

New clauses 38, 39 and 40 seek to widen the scope of part 5 of the Bill, which applies specifically to commercial pornography sites. Those are a different part of the market. Part 5 is designed to close a loophole in the original draft of the Bill that was identified by the Joint Committee, on which the hon. Member for Ochil and South Perthshire and my hon. Friend the Member for Watford served. Protecting children from pornographic content on commercial porn sites had been wrongly omitted from the original draft of the Bill. Part 5 of the Bill as currently drafted is designed to remedy that oversight. That is why the duties in part 5 are narrowly targeted at protecting children in the commercial part of the market.

A much wider range of duties is placed by part 3 on the user-to-user part of the pornography market. The user-to-user services covered by part 3 are likely to include the largest sites with the least control; as the content is user generated, there is no organising mind—whatever gets put up, gets put up. It is worth drawing the distinction between the services covered in part 3 and part 5 of the Bill.

In relation to part 5 services publishing their own material, Parliament can legislate, if it chooses to, to make some of that content illegal, as it has done in some areas—some forms of extreme pornography are illegal. If Parliament thinks that the line is drawn in the wrong place and needs to be moved, it can legislate to move that line as part of the general legislation in this area.

I emphasise most strongly that user-to-user sites, which are probably what the hon. Member for Ochil and South Perthshire was mostly referring to, are comprehensively covered by the duties in part 3. The purpose of part 5, which was a response to the Joint Committee’s report, is simply to stop children viewing such content. That is why the Bill has been constructed as it has.

Question put, That the clause be read a Second time.

--- Later in debate ---
Kirsty Blackman

Seeing as we are not doing spurious points of order, I will also take the opportunity to express our thanks. The first one is to the Chairs: thank you very much, Ms Rees and Sir Roger, for the excellent work you have done in the Chair. This has been a very long Bill, and the fact that you have put up with us for so long has been very much appreciated.

I thank all the MPs on the Committee, particularly the Labour Front-Bench team and those who have been speaking for the Labour party. They have been very passionate and have tabled really helpful amendments—it has been very good to work with the Labour team on the amendments that we have put together, particularly the ones we have managed to agree on, which is the vast majority. We thank Matt Miller, who works for my hon. Friend the Member for Ochil and South Perthshire. He has been absolutely wonderful. He has done an outstanding amount of work on the Bill, and the amazing support that he has given us has been greatly appreciated. I also thank the Public Bill Office, especially for putting up with the many, many amendments we submitted, and for giving us a huge amount of advice on them.

Lastly, I thank the hundreds of organisations that got in touch with us, and the many people who took the time to scrutinise the Bill, raise their concerns, and bring those concerns to us. Of those hundreds of people and organisations, I particularly highlight the work of the National Society for the Prevention of Cruelty to Children. Its staff have been really helpful to work with, and I have very much appreciated their advice and support in drafting our amendments.

Chris Philp

I feel slightly out of place, but I will add some concluding remarks in a moment; I should probably first respond to the substance of the new clause. The power to co-operate with other regulators and share information is, of course, important, but I am pleased to confirm that it is already in the Bill—it is not the first time that I have said that, is it?

Clause 98 amends section 393(2)(a) of the Communications Act 2003. That allows Ofcom to disclose information and co-operate with other regulators. Our amendment will widen the scope of the provision to include carrying out the functions set out in the Bill.

The list of organisations with which Ofcom can share information includes a number of UK regulators—the Competition and Markets Authority, the Information Commissioner, the Financial Conduct Authority and the Payment Systems Regulator—but that list can be amended, via secondary legislation, if it becomes necessary to add further organisations. In the extremely unlikely event that anybody wants to look it up, that power is set out in subsections (3)(i) and (4)(c) of section 393 of the Communications Act 2003. As the power is already created by clause 98, I hope that we will not need to vote on new clause 41.

I echo the comments of the shadow Minister about the Digital Regulation Cooperation Forum. It is a non-statutory body, but it is extremely important that regulators in the digital arena co-operate with one another and co-ordinate their activities. I am sure that we all strongly encourage the relevant regulators to work with the DRCF and to co-operate in this and adjacent fields.

I will bring my remarks to a close with one or two words of thanks. Let me start by thanking Committee members for their patience and dedication over the nine days we have been sitting—50-odd hours in total. I think it is fair to say that we have given the Bill thorough consideration, and of course there is more to come on Report, and that is before we even get to the House of Lords. This is the sixth Bill that I have taken through Committee as Minister, and it is by far the most complicated and comprehensive, running to 194 clauses and 15 schedules, across 213 pages. It has certainly been a labour. Given its complexity, the level of scrutiny it has received has been impressive—sometimes onerous, from my point of view.

The prize for the most perceptive observation during our proceedings definitely goes to the hon. Member for Aberdeen North, who noticed an inconsistency between use of the word “aural” in clause 49 and “oral” in clause 189, about 120 pages later.

I certainly thank our fantastic Chairs, Sir Roger Gale and Ms Rees, who have chaired our proceedings magnificently and kept us in order, and even allowed us to finish a little early, so huge thanks to them. I also thank the Committee Clerks for running everything so smoothly and efficiently, the Hansard reporters for deciphering our sometimes near-indecipherable utterances, and the Officers of the House for keeping our sittings running smoothly and safely.

I also thank all those stakeholders who have offered us their opinions; I suspect that they will continue to do so during the rest of the passage of the Bill. Their engagement has been important and very welcome, and it has really brought external views into Parliament.

I conclude by thanking the people who have been working on the Bill the longest and hardest: the civil servants in the Department for Digital, Culture, Media and Sport. Some members of the team have been working on the Bill in its various forms, including White Papers and so on, for as long as five years. The Bill has had a long gestation. Over the last few months, as we have been updating the Bill, rushing to introduce it, and perhaps even preparing some amendments for Report, they have been working incredibly hard, so I give a huge thanks to Sarah Connolly and the whole team at DCMS for all their incredible work.

Finally, as we look forward to Report, which is coming up shortly, we are listening, and no doubt flexibility will be exhibited in response to some of the points that have been raised. I look forward to working with members of the Committee and Members of the House more widely as we seek to make the Bill as good as it can be. On that note, I will sit down for the last time.

The Chair

Before I ask Alex Davies-Jones whether she wishes to press the new clause to a vote, I thank you all for the very respectful way in which you have conducted proceedings. It is much appreciated. Let me say on behalf of Sir Roger and myself that it has been an absolute privilege to co-chair this Bill Committee.